Science.gov

Sample records for additional systematic uncertainty

  1. On LBNE neutrino flux systematic uncertainties

    SciTech Connect

    Lebrun, Paul L. G.; Hylen, James; Marchionni, Alberto; Fields, Laura; Bashyal, Amit; Park, Seongtae; Watson, Blake

    2015-10-15

    The systematic uncertainties in the neutrino flux of the Long-Baseline Neutrino Experiment, due to alignment uncertainties and tolerances of the neutrino beamline components, are estimated. In particular, residual systematics are evaluated in the determination of the neutrino flux at the far detector, assuming that the experiment will be equipped with a near detector with the same target material as the far detector, thereby canceling most of the uncertainties from hadroproduction and neutrino cross sections. This calculation is based on a detailed Geant4-based model of the neutrino beamline that includes the target, two focusing horns, the decay pipe and ancillary items, such as shielding.

  2. Planck 2015 results. III. LFI systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Basak, S.; Battaglia, P.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Burigana, C.; Butler, R. C.; Calabrese, E.; Catalano, A.; Christensen, P. R.; Colombo, L. P. L.; Cruz, M.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Doré, O.; Ducout, A.; Dupac, X.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Finelli, F.; Frailis, M.; Franceschet, C.; Franceschi, E.; Galeotta, S.; Galli, S.; Ganga, K.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Hansen, F. K.; Harrison, D. L.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Knoche, J.; Krachmalnicoff, N.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maffei, B.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Meinhold, P. R.; Mennella, A.; Migliaccio, M.; Mitra, S.; Montier, L.; Morgante, G.; Mortlock, D.; Munshi, D.; Murphy, J. A.; Nati, F.; Natoli, P.; Noviello, F.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Partridge, B.; Pasian, F.; Pearson, T. J.; Perdereau, O.; Pettorino, V.; Piacentini, F.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Stolyarov, V.; Stompor, R.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Tavagnacco, D.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Umana, G.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vassallo, T.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Watson, R.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zibin, J. P.; Zonca, A.

    2016-09-01

    We present the current accounting of systematic effect uncertainties for the Low Frequency Instrument (LFI) that are relevant to the 2015 release of the Planck cosmological results, showing the robustness and consistency of our data set, especially for polarization analysis. We use two complementary approaches: (i) simulations based on measured data and physical models of the known systematic effects; and (ii) analysis of difference maps containing the same sky signal ("null-maps"). The LFI temperature data are limited by instrumental noise. At large angular scales the systematic effects are below the cosmic microwave background (CMB) temperature power spectrum by several orders of magnitude. In polarization the systematic uncertainties are dominated by calibration uncertainties and compete with the CMB E-modes in the multipole range 10-20. Based on our model of all known systematic effects, we show that these effects introduce a slight bias of around 0.2σ on the reionization optical depth derived from the 70 GHz EE spectrum using the 30 and 353 GHz channels as foreground templates. At 30 GHz the systematic effects are smaller than the Galactic foreground at all scales in temperature and polarization, which allows us to consider this channel as a reliable template of synchrotron emission. We assess the residual uncertainties due to LFI effects on CMB maps and power spectra after component separation and show that these effects are smaller than the CMB amplitude at all scales. We also assess the impact on non-Gaussianity studies and find it to be negligible. Some residuals still appear in null maps from particular sky survey pairs, particularly at 30 GHz, suggesting possible straylight contamination due to an imperfect knowledge of the beam far sidelobes.

  3. Extending BEAMS to incorporate correlated systematic uncertainties

    SciTech Connect

    Knights, Michelle; Bassett, Bruce A.; Varughese, Melvin; Newling, James; Hlozek, Renée; Kunz, Martin; Smith, Mat E-mail: bruce@saao.ac.za E-mail: renee.hlozek@gmail.com E-mail: matsmith2@gmail.com

    2013-01-01

    New supernova surveys such as the Dark Energy Survey, Pan-STARRS and the LSST will produce an unprecedented number of photometric supernova candidates, most with no spectroscopic data. Avoiding biases in cosmological parameters due to the resulting inevitable contamination from non-Ia supernovae can be achieved with the BEAMS formalism, allowing for fully photometric supernova cosmology studies. Here we extend BEAMS to deal with the case in which the supernovae are correlated by systematic uncertainties. The analytical form of the full BEAMS posterior requires evaluating 2^N terms, where N is the number of supernova candidates. This 'exponential catastrophe' is computationally infeasible even for N of order 100. We circumvent the exponential catastrophe by marginalising numerically instead of analytically over the possible supernova types: we augment the cosmological parameters with nuisance parameters describing the covariance matrix and the types of all the supernovae, τ_i, that we include in our MCMC analysis. We show that this method deals well even with large, unknown systematic uncertainties without a major increase in computational time, whereas ignoring the correlations can lead to significant biases and incorrect credible contours. We then compare the numerical marginalisation technique with a perturbative expansion of the posterior based on the insight that future surveys will have exquisite light curves and hence the probability that a given candidate is a Type Ia will be close to unity or zero for most objects. Although this perturbative approach changes computation of the posterior from a 2^N problem into an N^2 or N^3 one, we show that it leads to biases in general through a small number of misclassifications, implying that numerical marginalisation is superior.
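
    The contrast between the analytic sum over types and the numerical alternative can be illustrated with a toy model. The sketch below uses hypothetical population means, widths and a fixed Ia fraction; it is not the BEAMS likelihood (which also carries a covariance matrix and cosmological parameters), only a minimal demonstration that sampling the type assignments reproduces the exact 2^N marginalisation:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: N candidates drawn from either the "Ia" or contaminant population
N, p_ia = 4, 0.8
mu_ia, mu_cc, sigma = 0.0, 3.0, 1.0
true_types = rng.random(N) < p_ia
data = np.where(true_types, mu_ia, mu_cc) + rng.normal(0, sigma, N)

def likelihood_given_types(tau):
    means = np.where(tau, mu_ia, mu_cc)
    return np.exp(-0.5 * np.sum((data - means) ** 2) / sigma**2)

def exact_marginal():
    # Analytic marginalisation: sum over all 2^N type assignments
    total = 0.0
    for tau in itertools.product([True, False], repeat=N):
        tau = np.array(tau)
        prior = p_ia ** tau.sum() * (1 - p_ia) ** (N - tau.sum())
        total += prior * likelihood_given_types(tau)
    return total

def mc_marginal(n_draws=20000):
    # Numerical marginalisation: sample type vectors from their prior instead
    taus = rng.random((n_draws, N)) < p_ia
    return np.mean([likelihood_given_types(t) for t in taus])
```

In the full method the type indicators τ_i are sampled alongside the cosmological and covariance parameters in the MCMC, so the exponential sum is never formed explicitly.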

  4. ON THE ESTIMATION OF SYSTEMATIC UNCERTAINTIES OF STAR FORMATION HISTORIES

    SciTech Connect

    Dolphin, Andrew E.

    2012-05-20

    In most star formation history (SFH) measurements, the reported uncertainties are those due to effects whose sizes can be readily measured: Poisson noise, adopted distance and extinction, and binning choices in the solution itself. However, the largest source of error, systematics in the adopted isochrones, is usually ignored and very rarely explicitly incorporated into the uncertainties. I propose a process by which estimates of the uncertainties due to evolutionary models can be incorporated into the SFH uncertainties. This process relies on application of shifts in temperature and luminosity, the sizes of which must be calibrated for the data being analyzed. While there are inherent limitations, the ability to estimate the effect of systematic errors and include them in the overall uncertainty is significant. The effects of this are most notable in the case of shallow photometry, for which SFH measurements rely on evolved stars.

  5. Systematic Analysis Of Ocean Colour Uncertainties

    NASA Astrophysics Data System (ADS)

    Lavender, Samantha

    2013-12-01

    This paper reviews current research into the estimation of uncertainties as a pixel-based measure to aid non-specialist users of remote sensing products. An example MERIS image, captured on 28 March 2012, was processed with above-water atmospheric correction code. This was initially based on both the Antoine & Morel Standard Atmospheric Correction, with its Bright Pixel correction component, and the Doerffer Neural Network coastal waters approach. Analysis of the atmospheric by-products yielded important information about the separation of the atmospheric and in-water signals, helping to sign-post possible uncertainties in the atmospheric correction results. Further analysis has concentrated on implementing a 'simplistic' atmospheric correction so that the impact of changing the input auxiliary data can be analysed; the influence of changing surface pressure is demonstrated. Future work will focus on automating the analysis, so that the methodology can be implemented within an operational system.

  6. Systematic uncertainties from halo asphericity in dark matter searches

    SciTech Connect

    Bernal, Nicolás; Forero-Romero, Jaime E.; Garani, Raghuveer; Palomares-Ruiz, Sergio E-mail: je.forero@uniandes.edu.co E-mail: sergio.palomares.ruiz@ific.uv.es

    2014-09-01

    Although commonly assumed to be spherical, dark matter halos are predicted to be non-spherical by N-body simulations and their asphericity has a potential impact on the systematic uncertainties in dark matter searches. The evaluation of these uncertainties is the main aim of this work, where we study the impact of aspherical dark matter density distributions in Milky-Way-like halos on direct and indirect searches. Using data from the large N-body cosmological simulation Bolshoi, we perform a statistical analysis and quantify the systematic uncertainties on the determination of local dark matter density and the so-called J factors for dark matter annihilations and decays from the galactic center. We find that, due to our ignorance about the extent of the non-sphericity of the Milky Way dark matter halo, systematic uncertainties can be as large as 35%, within the 95% most probable region, for a spherically averaged value for the local density of 0.3-0.4 GeV/cm^3. Similarly, systematic uncertainties on the J factors evaluated around the galactic center can be as large as 10% and 15%, within the 95% most probable region, for dark matter annihilations and decays, respectively.
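
    The "95% most probable region" quoted above is the kind of interval that falls out of a sample-based analysis. A toy version, with a lognormal scatter invented as a stand-in for the Bolshoi-derived distribution of local densities:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for local-density values across sampled halo shapes/orientations
# (the lognormal scatter is illustrative, not the Bolshoi-derived distribution)
rho_local = rng.lognormal(np.log(0.35), 0.15, 50000)   # GeV/cm^3

# 95% interval and its half-width relative to the mean, i.e. the kind of
# percentage systematic uncertainty quoted in the abstract
lo, hi = np.percentile(rho_local, [2.5, 97.5])
rel_unc = (hi - lo) / (2 * rho_local.mean())
```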

  7. The effect of uncertainty and systematic errors in hydrological modelling

    NASA Astrophysics Data System (ADS)

    Steinsland, I.; Engeland, K.; Johansen, S. S.; Øverleir-Petersen, A.; Kolberg, S. A.

    2014-12-01

    The aims of hydrological model identification and calibration are to find the best possible set of process parametrization and parameter values that transform inputs (e.g. precipitation and temperature) to outputs (e.g. streamflow). These models enable us to make predictions of streamflow. Several sources of uncertainties have the potential to hamper the possibility of a robust model calibration and identification. In order to grasp the interaction between model parameters, inputs and streamflow, it is important to account for both systematic and random errors in inputs (e.g. precipitation and temperatures) and streamflows. By random errors we mean errors that are independent from time step to time step, whereas by systematic errors we mean errors that persist for a longer period. Both random and systematic errors are important in the observation and interpolation of precipitation and temperature inputs. Important random errors come from the measurements themselves and from the network of gauges. Important systematic errors originate from the under-catch in precipitation gauges and from unknown spatial trends that are approximated in the interpolation. For streamflow observations, the water level recordings might give random errors whereas the rating curve contributes mainly with a systematic error. In this study we want to answer the question "What is the effect of random and systematic errors in inputs and observed streamflow on estimated model parameters and streamflow predictions?". To answer it, we systematically test the effect of including uncertainties in inputs and streamflow during model calibration and simulation in a distributed HBV model operating on daily time steps for the Osali catchment in Norway. The case study is based on observations whose uncertainties are carefully quantified; increased uncertainties and systematic errors are introduced realistically, for example by removing a precipitation gauge from the network. We find that the systematic errors in

  8. Mapping Soil Transmitted Helminths and Schistosomiasis under Uncertainty: A Systematic Review and Critical Appraisal of Evidence

    PubMed Central

    Hamm, Nicholas A. S.; Soares Magalhães, Ricardo J.; Stein, Alfred

    2016-01-01

    Background Spatial modelling of STH and schistosomiasis epidemiology is now commonplace. Spatial epidemiological studies help inform decisions regarding the number of people at risk as well as the geographic areas that need to be targeted with mass drug administration; however, limited attention has been given to propagated uncertainties, their interpretation, and consequences for the mapped values. Using currently published literature on the spatial epidemiology of helminth infections we identified: (1) the main uncertainty sources, their definition and quantification and (2) how uncertainty is informative for STH programme managers and scientists working in this domain. Methodology/Principal Findings We performed a systematic literature search using the Preferred Reporting Items for Systematic reviews and Meta-Analysis (PRISMA) protocol. We searched Web of Knowledge and PubMed using a combination of uncertainty, geographic and disease terms. A total of 73 papers fulfilled the inclusion criteria for the systematic review. Only 9% of the studies did not address any element of uncertainty, while 91% of studies quantified uncertainty in the predicted morbidity indicators and 23% of studies mapped it. In addition, 57% of the studies quantified uncertainty in the regression coefficients but only 7% incorporated it in the regression response variable (morbidity indicator). Fifty percent of the studies discussed uncertainty in the covariates but did not quantify it. Uncertainty was mostly defined as precision, and quantified using credible intervals by means of Bayesian approaches. Conclusion/Significance None of the studies considered adequately all sources of uncertainties. We highlighted the need for uncertainty in the morbidity indicator and predictor variable to be incorporated into the modelling framework. Study design and spatial support require further attention and uncertainty associated with Earth observation data should be quantified. Finally, more attention

  9. Systematic and random uncertainties of HOAPS-3.2 evaporation

    NASA Astrophysics Data System (ADS)

    Kinzel, Julian; Fennig, Karsten; Schröder, Marc; Andersson, Axel; Bumke, Karl; Dietzsch, Felix

    2015-04-01

    The German Research Foundation (DFG) funds the research programme 'FOR1740 - Atlantic freshwater cycle', which aims at analysing and better understanding the freshwater budget of the Atlantic Ocean and the role of freshwater fluxes (evaporation minus precipitation) in the context of oceanic surface salinity variability. It is well known that these freshwater fluxes play an essential role in the global hydrological cycle and thus act as a key boundary condition for coupled ocean-atmosphere general circulation models. However, it remains unclear how uncertain evaporation (E) and precipitation (P) are. Once quantified, freshwater flux fields and their underlying total uncertainty (systematic plus random) may be assimilated into ocean models to compute ocean transports and run-off estimates, which in turn serve as a stringent test on the quality of the input data. The Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite Data (HOAPS) (Andersson et al. (2010), Fennig et al. (2012)) is an entirely satellite-based climatology, based on microwave radiometers, overcoming the lack of oceanic in-situ records. Its most current version, HOAPS-3.2, comprises 21 years (1987-2008) of pixel-level resolution data of numerous geophysical parameters over the global ice-free oceans. Amongst others, these include wind speed (u), near-surface specific humidity (q), and sea surface temperature (SST). Their uncertainties essentially contribute to the uncertainty in latent heat flux (LHF) and consequently to that of evaporation (E). Here, we will present HOAPS-3.2 pixel-level total uncertainty estimates of evaporation, based on a full error propagation of uncertainties in u, q, and SST. Both systematic and random uncertainty components are derived on the basis of collocated match-ups of satellite pixels, selected buoys, and ship records. The in-situ data are restricted to the period 1995-2008 and are provided by the Seewetteramt Hamburg as well as ICOADS Version 2.5 (Woodruff et al
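
    The error propagation described above can be sketched as a Monte Carlo exercise on a standard bulk formula for latent heat flux. All numerical values below (bulk coefficients, retrieved values, assumed standard uncertainties) are illustrative placeholders, not HOAPS-3.2 figures:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Illustrative bulk parameters: air density, latent heat, exchange coefficient
rho_a, L_v, C_E = 1.2, 2.5e6, 1.3e-3   # kg/m^3, J/kg, dimensionless

def q_sat(sst_c):
    # Crude saturation specific humidity over sea water (kg/kg), Magnus-type
    e_s = 6.112 * np.exp(17.62 * sst_c / (243.12 + sst_c))   # hPa
    return 0.98 * 0.622 * e_s / 1013.0

# Retrieved values and their (assumed) standard uncertainties
u   = rng.normal(7.0, 0.8, n)        # wind speed, m/s
q   = rng.normal(0.012, 0.0008, n)   # near-surface specific humidity, kg/kg
sst = rng.normal(20.0, 0.5, n)       # sea surface temperature, deg C

# Propagate all three uncertainties jointly through the bulk formula
lhf = rho_a * L_v * C_E * u * (q_sat(sst) - q)   # latent heat flux, W/m^2
evap_unc = lhf.std() / lhf.mean()                # relative uncertainty on E
```

With inputs like these the humidity term dominates the spread, which is the kind of budget a full propagation in u, q, and SST produces.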

  10. Efficiently estimating salmon escapement uncertainty using systematically sampled data

    USGS Publications Warehouse

    Reynolds, Joel H.; Woody, Carol Ann; Gove, Nancy E.; Fair, Lowell F.

    2007-01-01

    Fish escapement is generally monitored using nonreplicated systematic sampling designs (e.g., via visual counts from towers or hydroacoustic counts). These sampling designs support a variety of methods for estimating the variance of the total escapement. Unfortunately, all the methods give biased results, with the magnitude of the bias being determined by the underlying process patterns. Fish escapement commonly exhibits positive autocorrelation and nonlinear patterns, such as diurnal and seasonal patterns. For these patterns, poor choice of variance estimator can needlessly increase the uncertainty managers have to deal with in sustaining fish populations. We illustrate the effect of sampling design and variance estimator choice on variance estimates of total escapement for anadromous salmonids from systematic samples of fish passage. Using simulated tower counts of sockeye salmon Oncorhynchus nerka escapement on the Kvichak River, Alaska, five variance estimators for nonreplicated systematic samples were compared to determine the least biased. Using the least biased variance estimator, four confidence interval estimators were compared for expected coverage and mean interval width. Finally, five systematic sampling designs were compared to determine the design giving the smallest average variance estimate for total annual escapement. For nonreplicated systematic samples of fish escapement, all variance estimators were positively biased. Compared to the other estimators, the least biased estimator reduced bias by 12% to 98% on average. All confidence intervals gave effectively identical results. Replicated systematic sampling designs consistently provided the smallest average estimated variance among those compared.
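
    The sensitivity of the variance estimate to estimator choice can be sketched on a simulated diurnal passage series. The series and design below are illustrative, not the Kvichak River data; the successive-difference estimator is one of the standard candidates for nonreplicated systematic samples:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated hourly passage counts with a strong diurnal cycle plus noise
hours = np.arange(24 * 30)                              # 30 days
counts = 500 + 400 * np.sin(2 * np.pi * hours / 24)
counts = np.clip(counts + rng.normal(0, 20, hours.size), 0, None)

k = 2                                                   # count every 2nd hour
sample = counts[::k]
n, N = sample.size, counts.size

total_hat = N / n * sample.sum()                        # expansion estimator

fpc = N**2 * (1 - n / N) / n                            # common scale factor

# (a) Estimator treating the systematic sample as simple random: inflated
# when the process has smooth diurnal/seasonal structure
var_srs = fpc * sample.var(ddof=1)

# (b) Successive-difference estimator, which exploits the sample ordering
var_sd = fpc * np.sum(np.diff(sample) ** 2) / (2 * (n - 1))
```

For a smooth diurnal pattern the two estimators differ by a large factor, which is exactly the estimator-choice effect the abstract describes.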

  11. Systematic Uncertainties in High-Energy Hadronic Interaction Models

    NASA Astrophysics Data System (ADS)

    Zha, M.; Knapp, J.; Ostapchenko, S.

    2003-07-01

    Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross-sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model, when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, on secondary particle production and on the air shower development are discussed.

  12. A systematic uncertainty analysis for liner impedance eduction technology

    NASA Astrophysics Data System (ADS)

    Zhou, Lin; Bodén, Hans

    2015-11-01

    The so-called impedance eduction technology is widely used for obtaining acoustic properties of liners used in aircraft engines. The measurement uncertainties for this technology are still not well understood though it is essential for data quality assessment and model validation. A systematic framework based on multivariate analysis is presented in this paper to provide 95 percent confidence interval uncertainty estimates in the process of impedance eduction. The analysis is made using a single mode straightforward method based on transmission coefficients involving the classic Ingard-Myers boundary condition. The multivariate technique makes it possible to obtain an uncertainty analysis for the possibly correlated real and imaginary parts of the complex quantities. The results show that the errors in impedance results at low frequency mainly depend on the variability of transmission coefficients, while the mean Mach number accuracy is the most important source of error at high frequencies. The effect of Mach numbers used in the wave dispersion equation and in the Ingard-Myers boundary condition has been separated for comparison of the outcome of impedance eduction. A local Mach number based on friction velocity is suggested as a way to reduce the inconsistencies found when estimating impedance using upstream and downstream acoustic excitation.

  13. Systematic Uncertainties in High-Rate Germanium Data

    SciTech Connect

    Gilbert, Andrew J.; Fast, James E.; Fulsom, Bryan G.; Pitts, William K.; VanDevender, Brent A.; Wood, Lynn S.

    2016-10-06

    For many nuclear material safeguards inspections, spectroscopic gamma detectors are required which can achieve high event rates (in excess of 10^6 s^-1) while maintaining very good energy resolution for discrimination of neighboring gamma signatures in complex backgrounds. Such spectra can be useful for non-destructive assay (NDA) of spent nuclear fuel with long cooling times, which contains many potentially useful low-rate gamma lines, e.g., Cs-134, in the presence of a few dominating gamma lines, such as Cs-137. Detectors in use typically sacrifice energy resolution for count rate, e.g., LaBr3, or vice versa, e.g., CdZnTe. In contrast, we anticipate that beginning with a detector with high energy resolution, e.g., high-purity germanium (HPGe), and adapting the data acquisition for high throughput will be able to achieve the goals of the ideal detector. In this work, we present quantification of Cs-134 and Cs-137 activities, useful for fuel burn-up quantification, in fuel that has been cooling for 22.3 years. A segmented, planar HPGe detector is used for this inspection, which has been adapted for a high-rate throughput in excess of 500k counts/s. Using a very-high-statistics spectrum of 2.4×10^11 counts, isotope activities can be determined with very low statistical uncertainty. However, it is determined that systematic uncertainties dominate in such a data set, e.g., the uncertainty in the pulse line shape. This spectrum offers a unique opportunity to quantify this uncertainty and subsequently determine required counting times for given precision on values of interest.

  14. Reducing capture zone uncertainty with a systematic sensitivity analysis.

    PubMed

    Esling, Steven P; Keller, John E; Miller, Kenneth J

    2008-01-01

    The U.S. Environmental Protection Agency has established several methods to delineate wellhead protection areas (WHPAs) around community wells in order to protect them from surface contamination sources. Delineating a WHPA often requires defining the capture zone for a well. Generally, analytical models or arbitrary setback zones have been used to define the capture zone in areas where little is known about the distribution of hydraulic head, hydraulic conductivity, or recharge. Numerical modeling, however, even in areas of sparse data, offers distinct advantages over the more simplified analytical models or arbitrary setback zones. The systematic approach discussed here calibrates a numerical flow model to regional topography and then applies a matrix of plausible recharge to hydraulic conductivity ratios (R/K) to investigate the impact on the size and shape of the capture zone. This approach does not attempt to determine the uncertainty of the model but instead yields several possible capture zones, the composite of which is likely to contain the actual capture zone. A WHPA based on this composite capture zone will protect ground water resources better than one based on any individual capture zone. An application of the method to three communities illustrates development of the R/K matrix and demonstrates that the method is particularly well suited for determining capture zones in alluvial aquifers.

  15. The low-energy structure of the nucleon-nucleon interaction: statistical versus systematic uncertainties

    NASA Astrophysics Data System (ADS)

    Navarro Pérez, R.; Amaro, J. E.; Ruiz Arriola, E.

    2016-11-01

    We analyze the low-energy nucleon-nucleon (NN) interaction by confronting statistical versus systematic uncertainties. This is carried out with the help of model potentials fitted to the Granada-2013 database, where a statistically meaningful partial wave analysis comprising a total of 6713 np and pp published scattering data below 350 MeV from 1950 to 2013 has been made. We extract threshold parameter uncertainties from the coupled-channel effective range expansion up to j ≤ 5. We find that for threshold parameters systematic uncertainties are generally at least an order of magnitude larger than statistical uncertainties. Similar results are found for np phase shifts and amplitude parameters.
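
    The statistical-versus-systematic comparison can be made concrete with the S-wave effective range expansion, k cot δ0(k) = -1/a + r k^2/2: evaluate the same low-energy quantity under several model potentials and take the spread across models as the systematic uncertainty. The (a, r) pairs and the statistical error below are invented for illustration, not Granada-2013 values:

```python
import numpy as np

# Effective range expansion for the S-wave: k*cot(delta0) = -1/a + r*k^2/2
def kcotd(k, a, r):
    return -1.0 / a + 0.5 * r * k**2

# Threshold parameters (a in fm, r in fm) from several "model potentials"
# fitted to the same database; the numbers are illustrative only
models = [(-23.7, 2.70), (-23.9, 2.66), (-23.5, 2.75), (-23.8, 2.68)]

k = 0.1  # fm^-1
values = np.array([kcotd(k, a, r) for a, r in models])

sys_unc = values.std(ddof=1)   # systematic: spread across model potentials
stat_unc = 5e-5                # assumed single-model statistical error (fm^-1)
```

With numbers of this kind the model spread exceeds the single-fit statistical error by an order of magnitude, the pattern the analysis reports for threshold parameters.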

  16. Additional challenges for uncertainty analysis in river engineering

    NASA Astrophysics Data System (ADS)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    The management of rivers for improving safety, shipping and environment requires conscious effort on the part of river managers. River engineers design hydraulic works to tackle various challenges, from increasing flow conveyance to ensuring minimal water depths for environmental flow and inland shipping. Last year saw the completion of such large-scale river engineering in the 'Room for the River' programme for the Dutch Rhine River system, in which several dozen human interventions were built to increase flood safety. Engineering works in rivers are not completed in isolation from society. Rather, their benefits - increased safety, landscaping beauty - and their disadvantages - expropriation, hindrance - directly affect inhabitants. Therefore river managers are required to carefully defend their plans. The effect of engineering works on river dynamics is evaluated using hydraulic river models. Two-dimensional numerical models based on the shallow water equations provide the predictions necessary to make decisions on designs and future plans. However, like all environmental models, these predictions are subject to uncertainty. In recent years progress has been made in the identification of the main sources of uncertainty for hydraulic river models. Two of the most important sources are boundary conditions and hydraulic roughness (Warmink et al. 2013). The result of these sources of uncertainty is that the identification of a single, deterministic prediction model is a non-trivial task. This is a well-understood problem in other fields as well - most notably hydrology - and known as equifinality. However, the particular case of human intervention modelling with hydraulic river models compounds the equifinality problem. The model that provides the reference baseline situation is usually identified through calibration and afterwards modified for the engineering intervention. This results in two distinct models, the evaluation of which yields the effect of

  17. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  18. Quantifying uncertainty of determination by standard additions and serial dilutions methods taking into account standard uncertainties in both axes.

    PubMed

    Hyk, Wojciech; Stojek, Zbigniew

    2013-06-18

    The analytical expressions for the calculation of the standard uncertainty of the predictor variable either extrapolated or interpolated from a calibration line that takes into account uncertainties in both axes have been derived and successfully verified using the Monte Carlo modeling. These expressions are essential additions to the process of the analyte quantification realized with either the method of standard additions (SAM) or the method of serial dilutions (MSD). The latter one has been proposed as an alternative approach to the SAM procedure. In the MSD approach instead of the sequence of standard additions, the sequence of solvent additions to the spiked sample is performed. The comparison of the calculation results based on the expressions derived to their equivalents obtained from the Monte Carlo simulation, applied to real experimental data sets, confirmed that these expressions are valid in real analytical practice. The estimation of the standard uncertainty of the analyte concentration, quantified via either SAM or MSD or simply a calibration curve, is of great importance for the construction of the uncertainty budget of an analytical procedure. The correct estimation of the standard uncertainty of the analyte concentration is a key issue in the quality assurance in the instrumental analysis.
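
    A Monte Carlo check of the kind used to verify the derived expressions can be sketched as follows, perturbing both axes of a hypothetical standard-additions data set and refitting:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical standard-additions data: added concentration vs signal
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])   # added standard, mM
y = np.array([2.1, 4.0, 6.2, 8.1, 9.9])   # instrument response
ux, uy = 0.02, 0.08                       # standard uncertainties, both axes

def sam_estimate(xv, yv):
    slope, intercept = np.polyfit(xv, yv, 1)
    return intercept / slope              # analyte conc. from extrapolation

# Monte Carlo propagation: perturb BOTH axes within their uncertainties
draws = np.array([
    sam_estimate(x + rng.normal(0, ux, x.size), y + rng.normal(0, uy, y.size))
    for _ in range(5000)
])
c0, u_c0 = draws.mean(), draws.std(ddof=1)
```

The spread of the extrapolated concentration over the draws plays the role of the standard uncertainty that the analytical expressions predict in closed form.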

  19. Uncertainty analysis of the use of a retailer fidelity card scheme in the assessment of food additive intake.

    PubMed

    McNamara, C; Mehegan, J; O'Mahony, C; Safford, B; Smith, B; Tennant, D; Buck, N; Ehrlich, V; Sardi, M; Haldemann, Y; Nordmann, H; Jasti, P R

    2011-12-01

    The feasibility of using a retailer fidelity card scheme to estimate food additive intake was investigated in an earlier study. Fidelity card survey information was combined with information provided by the retailer on levels of the food colour Sunset Yellow (E110) in the foods to estimate a daily exposure to the additive in the Swiss population. As with any dietary exposure method, the fidelity card scheme is subject to uncertainties. In this paper, the impact of uncertainties associated with the input variables, including the amounts of food purchased, the levels of E110 in food, the proportion of food purchased at the retailer, the rate of fidelity card usage, the proportion of food consumed outside the home and bodyweights, as well as systematic uncertainties, was assessed using qualitative, deterministic and probabilistic approaches. An analysis of the sensitivity of the results to each of the probabilistic inputs was also undertaken. The analysis identified the key factors responsible for uncertainty within the model and demonstrated how some simple probabilistic approaches can be used to assess uncertainty quantitatively.
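    A minimal probabilistic (Monte Carlo) version of such an exposure model might look like the following. All distributions, parameter values and variable names are invented for illustration; they are not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50000

# Hypothetical distributions for the model inputs (illustrative values only)
purchased_g_day = rng.lognormal(mean=3.0, sigma=0.4, size=n)  # food purchased
e110_mg_per_kg = rng.triangular(5, 12, 20, size=n)            # additive level
share_at_retailer = rng.uniform(0.4, 0.8, size=n)             # fraction of diet captured
bodyweight_kg = rng.normal(70, 12, size=n)

# Daily exposure (mg/kg bw/day): purchases scaled up to the total diet,
# multiplied by the additive level and normalised by bodyweight
exposure = (purchased_g_day / 1000 / share_at_retailer) * e110_mg_per_kg / bodyweight_kg

print(f"median = {np.median(exposure):.4f} mg/kg bw/day")
print(f"p97.5  = {np.percentile(exposure, 97.5):.4f} mg/kg bw/day")
```

Sensitivity analysis then amounts to measuring how much each input distribution contributes to the spread of `exposure`, e.g. via rank correlations between inputs and output.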

  20. Systematic and Statistical Uncertainties in Simulated r-Process Abundances due to Uncertain Nuclear Masses

    NASA Astrophysics Data System (ADS)

    Surman, Rebecca; Mumpower, Matthew; McLaughlin, Gail

    Unknown nuclear masses are a major source of nuclear physics uncertainty for r-process nucleosynthesis calculations. Here we examine the systematic and statistical uncertainties that arise in r-process abundance predictions due to uncertainties in the masses of nuclear species on the neutron-rich side of stability. There is a long history of examining systematic uncertainties by the application of a variety of different mass models to r-process calculations. Here we expand upon such efforts by examining six DFT mass models, where we capture the full impact of each mass model by updating the other nuclear properties — including neutron capture rates, β-decay lifetimes, and β-delayed neutron emission probabilities — that depend on the masses. Unlike systematic effects, statistical uncertainties in the r-process pattern have just begun to be explored. Here we apply a global Monte Carlo approach, starting from the latest FRDM masses and considering random mass variations within the FRDM rms error. We find in each approach that uncertain nuclear masses produce dramatic uncertainties in calculated r-process yields, which can be reduced in upcoming experimental campaigns.

  1. Systematic tests for position-dependent additive shear bias

    NASA Astrophysics Data System (ADS)

    van Uitert, Edo; Schneider, Peter

    2016-11-01

    We present new tests to identify stationary position-dependent additive shear biases in weak gravitational lensing data sets. These tests are important diagnostics for currently ongoing and planned cosmic shear surveys, as such biases induce coherent shear patterns that can mimic and potentially bias the cosmic shear signal. The central idea of these tests is to determine the average ellipticity of all galaxies with shape measurements in a grid in the pixel plane. The distribution of the absolute values of these averaged ellipticities can be compared to randomised catalogues; a difference points to systematics in the data. In addition, we introduce a method to quantify the spatial correlation of the additive bias, which suppresses the contribution from cosmic shear and therefore eases the identification of a position-dependent additive shear bias in the data. We apply these tests to the publicly available shear catalogues from the Canada-France-Hawaii Telescope Lensing Survey (CFHTLenS) and the Kilo Degree Survey (KiDS) and find evidence for a small but non-negligible residual additive bias at small scales. As this residual bias is smaller than the error on the shear correlation signal at those scales, it is highly unlikely that it causes a significant bias in the published cosmic shear results of CFHTLenS. In CFHTLenS, the amplitude of this systematic signal is consistent with zero in fields where the number of stars used to model the point spread function (PSF) is higher than average, suggesting that the position-dependent additive shear bias originates from undersampled PSF variations across the image.
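    The grid test described above can be sketched with a synthetic catalogue: average the ellipticities in cells of a pixel-plane grid, then compare the distribution of |⟨e⟩| against a randomised catalogue in which ellipticities are shuffled among positions. The grid size, bias amplitude and noise levels below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic galaxy catalogue: pixel positions and measured ellipticities
n = 100000
x = rng.uniform(0, 2048, n)
y = rng.uniform(0, 2048, n)
e1 = rng.normal(0, 0.3, n)
e2 = rng.normal(0, 0.3, n)
# inject a small position-dependent additive bias in one corner of the field
mask = (x < 512) & (y < 512)
e1[mask] += 0.05

def grid_mean_abs_e(x, y, e1, e2, cell=256, extent=2048):
    """Average ellipticity per grid cell; return |<e>| for every cell."""
    nb = extent // cell
    ix = (x // cell).astype(int).clip(0, nb - 1)
    iy = (y // cell).astype(int).clip(0, nb - 1)
    idx = ix * nb + iy
    counts = np.bincount(idx, minlength=nb * nb)
    m1 = np.bincount(idx, weights=e1, minlength=nb * nb) / counts
    m2 = np.bincount(idx, weights=e2, minlength=nb * nb) / counts
    return np.hypot(m1, m2)

data = grid_mean_abs_e(x, y, e1, e2)
# Randomised catalogue: shuffling ellipticities among positions erases any
# position-dependent signal while preserving the one-point distribution
perm = rng.permutation(n)
rand = grid_mean_abs_e(x, y, e1[perm], e2[perm])
print(f"max |<e>| data: {data.max():.4f}, randomised: {rand.max():.4f}")
```

A data maximum well above the randomised one points to a position-dependent additive systematic, which is the diagnostic the paper proposes.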

  2. Systematic uncertainties in long-baseline neutrino oscillations for large θ₁₃

    SciTech Connect

    Coloma, Pilar; Huber, Patrick; Kopp, Joachim; Winter, Walter

    2013-02-01

    We study the physics potential of future long-baseline neutrino oscillation experiments at large θ₁₃, focusing especially on systematic uncertainties. We discuss superbeams, beta-beams, and neutrino factories, and for the first time compare these experiments on an equal footing with respect to systematic errors. We explicitly simulate near detectors for all experiments, we use the same implementation of systematic uncertainties for all experiments, and we fully correlate the uncertainties among detectors, oscillation channels, and beam polarizations as appropriate. As our primary performance indicator, we use the achievable precision in the measurement of the CP-violating phase δ_CP. We find that a neutrino factory is the only instrument that can measure δ_CP with a precision similar to that of its quark-sector counterpart. All neutrino beams operating at peak energies ≳2 GeV are quite robust with respect to systematic uncertainties, whereas beta-beams and T2HK in particular suffer from large cross-section uncertainties in the quasi-elastic regime, combined with their inability to measure the appearance-signal cross sections at the near detector. A noteworthy exception is the combination of a γ = 100 beta-beam with an SPL-based superbeam, in which all relevant cross sections can be measured in a self-consistent way; this combination provides a performance second only to the neutrino factory. For other superbeam experiments, such as LBNO and the setups studied in the context of the LBNE reconfiguration effort, statistics turns out to be the bottleneck. In almost all cases the near detector is not critical for controlling systematics, since the combined fit of appearance and disappearance data already constrains the impact of systematics to be small, provided that the three-active-flavor oscillation framework is valid.

  3. Systematic uncertainties associated with the cosmological analysis of the first Pan-STARRS1 type Ia supernova sample

    SciTech Connect

    Scolnic, D.; Riess, A.; Brout, D.; Rodney, S.; Rest, A.; Huber, M. E.; Tonry, J. L.; Foley, R. J.; Chornock, R.; Berger, E.; Soderberg, A. M.; Stubbs, C. W.; Kirshner, R. P.; Challis, P.; Czekala, I.; Drout, M.; Narayan, G.; Smartt, S. J.; Botticella, M. T.; Schlafly, E.; and others

    2014-11-01

    We probe the systematic uncertainties from the 113 Type Ia supernovae (SN Ia) in the Pan-STARRS1 (PS1) sample along with 197 SN Ia from a combination of low-redshift surveys. The companion paper by Rest et al. describes the photometric measurements and cosmological inferences from the PS1 sample. The largest systematic uncertainty stems from the photometric calibration of the PS1 and low-z samples. We increase the sample of observed Calspec standards used to define the PS1 calibration system from 7 to 10. The PS1 and SDSS-II calibration systems are compared and discrepancies up to ∼0.02 mag are recovered. We find that uncertainties in the proper way to treat intrinsic colors and reddening produce differences in the recovered value of w of up to 3%. We estimate masses of the host galaxies of PS1 supernovae and detect an insignificant difference of 0.037 ± 0.031 mag in the distance residuals of the full sample between host galaxies with high and low masses. Assuming flatness and including systematic uncertainties in our analysis of the SNe measurements alone, we find w = −1.120 +0.360/−0.206 (stat) +0.269/−0.291 (sys). With additional constraints from baryon acoustic oscillation (BAO), cosmic microwave background (CMB; Planck) and H₀ measurements, we find w = −1.166 +0.072/−0.069 and Ω_m = 0.280 +0.013/−0.012 (statistical and systematic errors added in quadrature). The significance of the inconsistency with w = −1 depends on whether we use Planck or Wilkinson Microwave Anisotropy Probe (WMAP) measurements of the CMB: w(BAO+H₀+SN+WMAP) = −1.124 +0.083/−0.065.

  4. Accounting for uncertainty in systematic bias in exposure estimates used in relative risk regression

    SciTech Connect

    Gilbert, E.S.

    1995-12-01

    In many epidemiologic studies addressing exposure-response relationships, sources of error that lead to systematic bias in exposure measurements are known to be present, but there is uncertainty in the magnitude and nature of that bias. Two approaches that allow this uncertainty to be reflected in confidence limits and other statistical inferences were developed; both are applicable to cohort and case-control studies. The first approach is based on a numerical approximation to the likelihood ratio statistic, and the second uses computer simulations based on the score statistic. These approaches were applied to data from a cohort study of workers at the Hanford site (1944-86) exposed occupationally to external radiation; to combined data on workers exposed at Hanford, Oak Ridge National Laboratory, and the Rocky Flats Weapons plant; and to artificial data sets created to examine the effects of varying sample size and the magnitude of the risk estimate. For the worker data, sampling uncertainty dominated, and accounting for uncertainty in systematic bias did not greatly modify the confidence limits. However, with increased sample size, accounting for these uncertainties became more important, and is recommended when there is interest in comparing or combining results from different studies.
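    The simulation-based approach is not detailed in the abstract, but its core idea, widening an interval by drawing the systematic bias from an uncertainty distribution alongside the sampling error, can be sketched with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical numbers (not from the study): slope of a relative-risk
# regression fitted to recorded doses, plus its sampling standard error.
beta_hat = 0.5
se_beta = 0.15

# Recorded doses are assumed biased by an uncertain multiplicative factor B
# (true_dose = recorded_dose / B, hence beta_true = beta_recorded * B).
# B is taken lognormal with geometric mean 1.0 and geometric SD 1.3.
n = 100_000
beta_draws = rng.normal(beta_hat, se_beta, n)    # sampling uncertainty
bias_draws = rng.lognormal(0.0, np.log(1.3), n)  # bias uncertainty
draws = beta_draws * bias_draws

lo, hi = np.percentile(draws, [2.5, 97.5])
print(f"95% interval with bias uncertainty: ({lo:.3f}, {hi:.3f})")
print(f"sampling-only interval: ({beta_hat - 1.96*se_beta:.3f}, "
      f"{beta_hat + 1.96*se_beta:.3f})")
```

As in the paper's worker-data example, when the sampling error dominates the bias uncertainty, the widened interval differs only modestly from the sampling-only one.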

  5. Uncertainties and Systematic Effects on the estimate of stellar masses in high z galaxies

    NASA Astrophysics Data System (ADS)

    Salimbeni, S.; Fontana, A.; Giallongo, E.; Grazian, A.; Menci, N.; Pentericci, L.; Santini, P.

    2009-05-01

    We discuss the uncertainties and systematic effects that enter estimates of the stellar masses of high-redshift galaxies from broad-band photometry, and how they propagate into the deduced galaxy stellar mass function. For this purpose we use the latest version of the GOODS-MUSIC catalog. In particular, we discuss the impact of different synthetic models, of the assumed initial mass function and of the selection band. Using Charlot & Bruzual (2007) and Maraston (2005) models we find masses lower than those obtained from Bruzual & Charlot (2003) models. In addition, comparing these two mass determinations with that from the Bruzual & Charlot (2003) models, we find a slight trend as a function of the mass itself. As a consequence, the derived galaxy stellar mass functions show diverse shapes, and their slope depends on the assumed models. Despite these differences, the overall scenario is recovered in all cases. The masses obtained under the Chabrier initial mass function are on average 0.24 dex lower than those obtained under the Salpeter assumption, at all redshifts, shifting the galaxy stellar mass function by the same amount. Finally, using a 4.5 μm-selected sample instead of a Ks-selected one, we add a new population of highly absorbed, dusty galaxies at z ≈ 2-3 with relatively low masses, yielding stronger constraints on the slope of the galaxy stellar mass function at lower masses.

  6. Systematic study of the uncertainties in fitting the cosmic positron data by AMS-02

    SciTech Connect

    Yuan, Qiang; Bi, Xiao-Jun E-mail: bixj@ihep.ac.cn

    2015-03-01

    The operation of AMS-02 opens a new era for the study of cosmic-ray physics, with unprecedentedly precise data that are comparable with laboratory measurements. The high-precision data allow a quantitative study of cosmic-ray physics and give strict constraints on the nature of cosmic-ray sources. However, the intrinsic errors of the theoretical models used to interpret the data now dominate over the errors in the data themselves. In the present work we give a systematic study of the uncertainties of the models used to explain the AMS-02 positron fraction data, which, together with the PAMELA and Fermi-LAT measurements, show the cosmic-ray e⁺e⁻ excesses. The excesses can be attributed to contributions from extra e⁺e⁻ sources, such as pulsars or dark matter annihilation. The possible systematic uncertainties of the theoretical models considered include the cosmic-ray propagation, the treatment of the low-energy data, the solar modulation, the pp interaction models, the nuclei injection spectrum and so on. We find that in general a spectral hardening of the primary electron injection spectrum above ∼50-100 GeV is favored by the data. Furthermore, the present model uncertainties may lead to a factor of ∼2 enlargement of the determined parameter regions of the extra source, such as the dark matter mass, annihilation rate and so on.

  7. A new approach to handle additive and multiplicative uncertainties in the measurement for H∞ LPV filtering

    NASA Astrophysics Data System (ADS)

    Lacerda, Márcio J.; Tognetti, Eduardo S.; Oliveira, Ricardo C. L. F.; Peres, Pedro L. D.

    2016-04-01

    This paper presents a general framework to cope with full-order H∞ linear parameter-varying (LPV) filter design subject to inexactly measured parameters. The main novelty is the ability to handle additive and multiplicative uncertainties in the measurements, for both continuous- and discrete-time LPV systems, in a unified approach. By conveniently modelling the scheduling parameters and the uncertainties affecting the measurements, the H∞ filter design problem can be expressed in terms of robust matrix inequalities that become linear when two scalar parameters are fixed. Therefore, the proposed conditions can be efficiently solved through linear matrix inequality relaxations based on polynomial solutions. Numerical examples are presented to illustrate the improved efficiency of the proposed approach when compared to other methods and, more importantly, its capability to deal with scenarios where the available strategies in the literature cannot be used.

  8. Single-Ion Atomic Clock with 3×10⁻¹⁸ Systematic Uncertainty.

    PubMed

    Huntemann, N; Sanner, C; Lipphardt, B; Tamm, Chr; Peik, E

    2016-02-12

    We experimentally investigate an optical frequency standard based on the ²S₁/₂(F=0) → ²F₇/₂(F=3) electric octupole (E3) transition of a single trapped ¹⁷¹Yb⁺ ion. For the spectroscopy of this strongly forbidden transition, we utilize a Ramsey-type excitation scheme that provides immunity to probe-induced frequency shifts. The cancellation of these shifts is controlled by interleaved single-pulse Rabi spectroscopy, which reduces the related relative frequency uncertainty to 1.1×10⁻¹⁸. To determine the frequency shift due to thermal radiation emitted by the ion's environment, we measure the static scalar differential polarizability of the E3 transition as 0.888(16)×10⁻⁴⁰ J m²/V² and a dynamic correction η(300 K) = −0.0015(7). This reduces the uncertainty due to thermal radiation to 1.8×10⁻¹⁸. The residual motion of the ion yields the largest contribution (2.1×10⁻¹⁸) to the total systematic relative uncertainty of the clock of 3.2×10⁻¹⁸.
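    Uncertainty budgets like this one combine independent contributions in quadrature. The sketch below sums only the three terms quoted in the abstract (in fractional-frequency units of 1e-18); the result falls short of the quoted 3.2e-18 total because the full budget contains further entries not listed here.

```python
import math

# Contributions quoted in the abstract, in units of 1e-18
probe_shift = 1.1   # probe-induced shift cancellation
thermal = 1.8       # blackbody (thermal radiation) shift
motion = 2.1        # residual ion motion

# Quadrature sum of the listed (independent) terms
partial = math.sqrt(probe_shift**2 + thermal**2 + motion**2)
print(f"quadrature sum of listed terms: {partial:.2f}e-18")
```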

  9. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.

  10. A review of sources of systematic errors and uncertainties in observations and simulations at 183 GHz

    NASA Astrophysics Data System (ADS)

    Brogniez, Helene; English, Stephen; Mahfouf, Jean-Francois; Behrendt, Andreas; Berg, Wesley; Boukabara, Sid; Buehler, Stefan Alexander; Chambon, Philippe; Gambacorta, Antonia; Geer, Alan; Ingram, William; Kursinski, E. Robert; Matricardi, Marco; Odintsova, Tatyana A.; Payne, Vivienne H.; Thorne, Peter W.; Tretyakov, Mikhail Yu.; Wang, Junhong

    2016-05-01

    Several recent studies have observed systematic differences between measurements in the 183.31 GHz water vapor line by space-borne sounders and calculations using radiative transfer models, with inputs from either radiosondes (radiosonde observations, RAOBs) or short-range forecasts by numerical weather prediction (NWP) models. This paper discusses all the relevant categories of observation-based or model-based data, quantifies their uncertainties and separates biases that could be common to all causes from those attributable to a particular cause. Reference observations from radiosondes, Global Navigation Satellite System (GNSS) receivers, differential absorption lidar (DIAL) and Raman lidar are thus overviewed. Biases arising from their calibration procedures, NWP models and data assimilation, instrument biases and radiative transfer models (both the models themselves and the underlying spectroscopy) are presented and discussed. Although presently no single process in the comparisons seems capable of explaining the observed structure of bias, recommendations are made in order to better understand the causes.

  11. Systematic evaluation of an atomic clock at 2 × 10⁻¹⁸ total uncertainty.

    PubMed

    Nicholson, T L; Campbell, S L; Hutson, R B; Marti, G E; Bloom, B J; McNally, R L; Zhang, W; Barrett, M D; Safronova, M S; Strouse, G F; Tew, W L; Ye, J

    2015-04-21

    The pursuit of better atomic clocks has advanced many research areas, providing better quantum state control, new insights in quantum science, tighter limits on fundamental constant variation and improved tests of relativity. The record for the best stability and accuracy is currently held by optical lattice clocks. Here we take an important step towards realizing the full potential of a many-particle clock with a state-of-the-art stable laser. Our ⁸⁷Sr optical lattice clock now achieves fractional stability of 2.2 × 10⁻¹⁶ at 1 s. With this improved stability, we perform a new accuracy evaluation of our clock, reducing many systematic uncertainties that limited our previous measurements, such as those in the lattice ac Stark shift, the atoms' thermal environment and the atomic response to room-temperature blackbody radiation. Our combined measurements have reduced the total uncertainty of the JILA Sr clock to 2.1 × 10⁻¹⁸ in fractional frequency units.

  12. Systematic Uncertainties in Characterizing Cluster Outskirts: The Case of Abell 133

    NASA Astrophysics Data System (ADS)

    Paine, Jennie; Ogrean, Georgiana A.; Nulsen, Paul; Farrah, Duncan

    2016-01-01

    The outskirts of galaxy clusters have low surface brightness compared to the X-ray background, making accurate background subtraction particularly important for analyzing cluster spectra out to and beyond the virial radius. We analyze the thermodynamic properties of the intracluster medium (ICM) of Abell 133 and assess the extent to which uncertainties on background subtraction affect measured quantities. We implement two methods of analyzing the ICM spectra: one in which the blank-sky background is subtracted, and another in which the sky background is modeled. We find that the two methods are consistent within the 90% confidence ranges. We were able to measure the thermodynamic properties of the cluster up to R500. Even at R500, the systematic uncertainties associated with the sky background in the direction of A133 are small, despite the ICM signal constituting only ~25% of the total signal. This work was supported in part by the NSF REU and DoD ASSURE programs under NSF grant no. 1262851 and by the Smithsonian Institution. GAO acknowledges support by NASA through a Hubble Fellowship grant HST-HF2-51345.001-A awarded by the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Incorporated, under NASA contract NAS5-26555.

  13. Longitudinal Double-Spin Asymmetry Measurements in p+p and Their Limitation by Systematic Uncertainty in Relative Luminosity

    NASA Astrophysics Data System (ADS)

    Manion, Andrew

    2012-03-01

    We present longitudinal double-spin asymmetries A_LL in neutral pion (π⁰) production measured with the PHENIX detector at RHIC. This measurement has been shown to constrain the gluon spin contribution to the proton, ΔG. We also discuss the main systematic uncertainty, which originates from the relative luminosity of the different spin states in RHIC, and new approaches that may help identify the source of this uncertainty.
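    A commonly used yield-based form of the A_LL estimator makes the role of relative luminosity explicit; this is a sketch with hypothetical yields and polarizations, not the PHENIX analysis itself. N_same and N_opp are yields in same- and opposite-helicity bunch crossings, R is the relative luminosity L_same/L_opp, and P_b, P_y are the two beam polarizations.

```python
def a_ll(n_same, n_opp, rel_lumi, pol_b, pol_y):
    """Double-spin asymmetry from helicity-sorted yields.

    An error dR in the relative luminosity propagates directly into A_LL,
    which is why R is the dominant systematic when the asymmetry is small.
    """
    num = n_same - rel_lumi * n_opp
    den = n_same + rel_lumi * n_opp
    return num / den / (pol_b * pol_y)

# Illustrative (hypothetical) numbers: a 1e-3 error in R shifts A_LL by
# roughly 1.7e-3, comparable to the asymmetry itself
a_nominal = a_ll(1_000_000, 999_000, 1.000, 0.55, 0.55)
a_shifted = a_ll(1_000_000, 999_000, 1.001, 0.55, 0.55)
print(a_nominal, a_shifted)
```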

  14. Radiation therapy treatment plan optimization accounting for random and systematic patient setup uncertainties

    NASA Astrophysics Data System (ADS)

    Moore, Joseph Andrew

    2011-12-01

    External-beam radiotherapy is one of the primary methods for treating cancer. Typically, a radiotherapy treatment course consists of radiation delivered to the patient in multiple daily treatment fractions over 6-8 weeks. Each fraction requires the patient to be aligned with the planning image acquired before the treatment course. Unfortunately, patient alignment is not perfect, and residual errors in patient setup remain. The standard technique for dealing with setup errors is to expand the target volume by some margin to ensure the target receives the planned dose in the presence of those errors. This work develops an alternative to margins for accommodating setup errors: patient setup uncertainty is included directly in IMRT plan optimization. This probabilistic treatment planning (PTP) operates directly on the planning structure and develops a dose distribution robust to variations in the patient position. Two methods are presented. The first includes only random setup uncertainty in the planning process by convolving the fluence of each beam with a Gaussian model of the distribution of random setup errors. The second builds upon this by adding systematic uncertainty to the optimization by way of a joint optimization over multiple probable patient positions. To assess the benefit of PTP methods, a PTP plan and a margin-based plan were developed for each of the 28 patients in this study. Comparisons show that PTP plans generally reduce the dose to normal tissues while maintaining a similar dose to the target structure. Physician assessment indicates that PTP plans are generally preferred over margin-based plans. PTP methods show potential for improving patient outcomes by reducing the complications associated with treatment.
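    The first PTP method, convolving each beam's fluence with a Gaussian model of the random setup errors, can be sketched as follows. The grid spacing, setup-error sigma and the idealised open field are hypothetical; this illustrates the blurring operation only, not the dissertation's optimizer.

```python
import numpy as np

def gaussian_kernel(sigma_mm, spacing_mm, half_width=4):
    """1D Gaussian kernel sampled on the fluence grid, normalised to unit sum."""
    n = int(np.ceil(half_width * sigma_mm / spacing_mm))
    t = np.arange(-n, n + 1) * spacing_mm
    k = np.exp(-0.5 * (t / sigma_mm) ** 2)
    return k / k.sum()

def blur_fluence(fluence, sigma_mm, spacing_mm):
    """Expected fluence under random setup errors (separable 2D convolution)."""
    k = gaussian_kernel(sigma_mm, spacing_mm)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, fluence)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, out)
    return out

fluence = np.zeros((41, 41))
fluence[15:26, 15:26] = 1.0          # idealised open field on a 2.5 mm grid
blurred = blur_fluence(fluence, sigma_mm=3.0, spacing_mm=2.5)
print(blurred[20, 20], blurred[0, 0])  # centre stays near 1, far edge near 0
```

The blurred fluence is what the target "sees" on average over the course of treatment, which is why optimizing against it yields plans robust to random setup error.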

  15. Systematic construction of genuine-multipartite-entanglement criteria in continuous-variable systems using uncertainty relations

    NASA Astrophysics Data System (ADS)

    Toscano, F.; Saboia, A.; Avelar, A. T.; Walborn, S. P.

    2015-11-01

    A general procedure to construct criteria for identifying genuine multipartite continuous-variable entanglement is presented. It relies on the definition of adequate global operators describing the multipartite system, the positive partial transpose criterion of separability, and quantum-mechanical uncertainty relations. As a consequence, each criterion obtained consists of a single inequality that is readily computable and experimentally feasible. Violation of the inequality is a sufficient condition for genuine multipartite entanglement. Additionally, we show that the previous work of van Loock and Furusawa [P. van Loock and A. Furusawa, Phys. Rev. A 67, 052315 (2003), 10.1103/PhysRevA.67.052315] is a special case of our result.

  16. Determination of electron beam polarization using electron detector in Compton polarimeter with less than 1% statistical and systematic uncertainty

    SciTech Connect

    Narayan, Amrendra

    2015-05-01

    The Q-weak experiment aims to measure the weak charge of the proton with a precision of 4.2%. The proposed precision on the weak charge required a 2.5% measurement of the parity-violating asymmetry in elastic electron-proton scattering. Polarimetry was the largest experimental contribution to this uncertainty, and a new Compton polarimeter was installed in Hall C at Jefferson Lab to make the goal achievable. In this polarimeter the electron beam collides with green laser light in a low-gain Fabry-Perot cavity; the scattered electrons are detected in 4 planes of a novel diamond micro-strip detector, while the backscattered photons are detected in lead tungstate crystals. This diamond micro-strip detector is the first such device to be used as a tracking detector in a nuclear and particle physics experiment. The diamond detectors are read out using custom-built electronic modules that include a preamplifier, a pulse-shaping amplifier and a discriminator for each detector micro-strip. We use field-programmable gate array (FPGA) based general-purpose logic modules for event selection and histogramming. Extensive Monte Carlo simulations and data acquisition simulations were performed to estimate the systematic uncertainties. Additionally, the Møller and Compton polarimeters were cross-calibrated at low electron beam currents using a series of interleaved measurements. In this dissertation, we describe all the subsystems of the Compton polarimeter with emphasis on the electron detector. We focus on the FPGA-based data acquisition system built by the author and the data analysis methods implemented by the author. The simulations of the data acquisition and of the polarimeter that helped rigorously establish the systematic uncertainties are also elaborated, resulting in the first sub-1% measurement of low-energy (∼1 GeV) electron beam polarization with a Compton electron detector.
We have demonstrated that diamond based micro-strip detectors can be used for tracking in a

  17. Coordinate swapping in standard addition graphs for analytical chemistry: a simplified path for uncertainty calculation in linear and nonlinear plots.

    PubMed

    Meija, Juris; Pagliano, Enea; Mester, Zoltán

    2014-09-02

    Uncertainty of the result from the method of standard addition is often underestimated due to neglect of the covariance between the intercept and the slope. In order to simplify the data analysis from standard addition experiments, we propose x-y coordinate swapping in conventional linear regression. Unlike the ratio of the intercept and slope, which is the result of the traditional method of standard addition, the result of the inverse standard addition is obtained directly from the intercept of the swapped calibration line. Consequently, the uncertainty evaluation becomes markedly simpler. The method is also applicable to nonlinear curves, such as the quadratic model, without incurring any additional complexity.
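    A minimal numerical sketch of the coordinate swap, with invented data: the conventional result is the ratio intercept/slope of the y-on-x line (whose uncertainty needs the intercept-slope covariance), while the swapped x-on-y regression returns the concentration directly as the negative of its intercept.

```python
import numpy as np

# Hypothetical standard-additions data (illustrative values only)
added = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # x: added concentration
signal = np.array([2.1, 4.0, 6.2, 7.9, 10.1])  # y: instrument response

# Conventional SAM: result = intercept/slope of y vs x; the uncertainty of
# this ratio requires the covariance between intercept and slope
b, a = np.polyfit(added, signal, 1)
c_conventional = a / b

# Coordinate swapping: regress x on y; the analyte concentration is read
# directly as the negative intercept of the swapped line, so its standard
# uncertainty is simply the intercept's standard error (no covariance term)
b_s, a_s = np.polyfit(signal, added, 1)
c_swapped = -a_s
print(f"conventional: {c_conventional:.3f}, swapped: {c_swapped:.3f}")
```

With noisy data the two ordinary-least-squares directions give slightly different point estimates; the paper's point is that the swapped form makes the uncertainty evaluation markedly simpler, and the same trick extends to quadratic calibration models.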

  18. A Slippery Slope: Systematic Uncertainties in the Line Width Baryonic Tully-Fisher Relation

    NASA Astrophysics Data System (ADS)

    Bradford, Jeremy D.; Geha, Marla C.; van den Bosch, Frank C.

    2016-11-01

    The baryonic Tully-Fisher relation (BTFR) is both a valuable observational tool and a critical test of galaxy formation theory. We explore the systematic uncertainty in the slope and the scatter of the observed line-width BTFR utilizing homogeneously measured, unresolved H i observations for 930 isolated galaxies. We measure a fiducial relation of log₁₀ M_baryon = 3.24 log₁₀ V_rot + 3.21 with observed scatter of 0.25 dex over a baryonic mass range of 10^7.4 to 10^11.3 M_⊙, where V_rot is measured from 20% H i line widths. We then conservatively vary the definitions of M_baryon and V_rot, the sample definition and the linear fitting algorithm. We obtain slopes ranging from 2.64 to 3.53 and scatter measurements ranging from 0.14 to 0.41 dex, indicating a significant systematic uncertainty of 0.25 in the BTFR slope derived from unresolved H i line widths. We next compare our fiducial slope to literature measurements, where reported slopes range from 3.0 to 4.3 and scatter is either unmeasured, immeasurable, or as large as 0.4 dex. Measurements derived from unresolved H i line widths tend to produce slopes of 3.3, while measurements derived strictly from resolved asymptotic rotation velocities tend to produce slopes of 3.9. The single largest factor affecting the BTFR slope is the definition of rotation velocity. The sample definition, the mass range and the linear fitting algorithm also significantly affect the measured BTFR. We find that galaxies in our sample with V_rot < 100 km s⁻¹ are consistent with the line-width BTFR of more massive galaxies, but these galaxies drive most of the observed scatter. It is critical when comparing predictions to an observed BTFR that the rotation velocity definition, the sample selection and the fitting algorithm are similarly defined. We recommend direct statistical comparisons between data sets with commensurable properties as opposed to simply comparing BTFR power-law fits.
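    The sensitivity to the linear fitting algorithm can be illustrated with a mock sample. The slope, intercept and scatter below are the fiducial values quoted in the abstract, but the forward/inverse comparison is our own illustration, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Mock sample of 930 galaxies: log10 V_rot spread over a plausible range,
# masses drawn from log10 M = 3.24 log10 V + 3.21 with 0.25 dex scatter
logV = rng.uniform(1.3, 2.5, 930)
logM = 3.24 * logV + 3.21 + rng.normal(0, 0.25, 930)

# Forward fit (M on V) vs inverse fit (V on M): two common "linear fitting
# algorithms" that recover systematically different slopes from the same data
slope_fwd = np.polyfit(logV, logM, 1)[0]
slope_inv = 1.0 / np.polyfit(logM, logV, 1)[0]
print(f"forward: {slope_fwd:.2f}, inverse: {slope_inv:.2f}")
```

With scatter only in mass, the forward fit recovers the input slope while the inverse fit is biased steep; real samples with scatter in both variables show the same kind of algorithm-dependent spread the paper quantifies.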

  19. Multidimensional analysis of fast-spectrum material replacement measurements for systematic estimation of cross section uncertainties

    NASA Technical Reports Server (NTRS)

    Klann, P. G.; Lantz, E.; Mayo, W. T.

    1973-01-01

    A series of central core and core-reflector interface sample replacement experiments for 16 materials performed in the NASA heavy-metal-reflected, fast spectrum critical assembly (NCA) were analyzed in four and 13 groups using the GAM 2 cross-section set. The individual worths obtained by TDSN and DOT multidimensional transport theory calculations showed significant differences from the experimental results. These were attributed to cross-section uncertainties in the GAM 2 cross sections. Simultaneous analysis of the measured and calculated sample worths permitted separation of the worths into capture and scattering components which systematically provided fast spectrum averaged correction factors to the magnitudes of the GAM 2 absorption and scattering cross sections. Several Los Alamos clean critical assemblies containing Oy, Ta, and Mo as well as one of the NCA compositions were reanalyzed using the corrected cross sections. In all cases the eigenvalues were significantly improved and were recomputed to within 1 percent of the experimental eigenvalue. A comparable procedure may be used for ENDF cross sections when these are available.

  20. Systematic Analysis of Resolution and Uncertainties in Gravity Interpretation of Bathymetry Beneath Floating Ice

    NASA Astrophysics Data System (ADS)

    Cochran, J. R.; Tinto, K. J.; Elieff, S. H.; Bell, R. E.

    2011-12-01

    Airborne geophysical surveys in West Antarctica and Greenland carried out during Operation IceBridge (OIB) utilized the Sander Geophysics AIRGrav gravimeter, which collects high-quality data during low-altitude, draped flights. These data have been used to determine bathymetry beneath ice shelves and floating ice tongues (e.g., Tinto et al., 2010; Cochran et al., 2010). This paper systematically investigates uncertainties arising from survey, instrumental and geologic constraints in this type of study and the resulting resolution of the bathymetry model. Gravity line data are low-pass filtered with time-based filters to remove high-frequency noise. The spatial filter length depends on aircraft speed. For the parameters used in OIB (70-140 s filters and 270-290 knots), spatial filter half-wavelengths are ~5-10 km. The half-wavelength does not define a lower limit to the width of feature that can be detected, but shorter-wavelength features may appear wider with a lower amplitude. Resolution can be improved either by using a shorter filter or by flying slower. Both involve tradeoffs: a shorter filter admits more noise, and slower speeds result in less coverage. These filters are applied along tracks, rather than in a region surrounding a measurement. In areas of large gravity relief, tracks in different directions can sample a very different range of gravity values within the length of the filter. We show that this can lead to crossover mismatches of >5 mGal, complicating interpretation. For dense surveys, gridding the data and then sampling the grid at the measurement points can minimize this effect. Resolution is also affected by the elevation of survey flights. For a distributed mass, the gravity amplitude decreases with distance and short-wavelength components attenuate faster. This is not a serious issue for OIB, which flew draped flights <500 m above the ice surface, but is a serious factor for gravimeters that require a constant elevation above the highest
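    The quoted ~5-10 km half-wavelengths follow from simple arithmetic: a time-domain filter of length T applied at ground speed v spans a distance vT along track, half of which is the spatial half-wavelength. A quick check of the stated OIB parameters (70-140 s filters, 270-290 knots):

```python
KNOT_TO_MS = 0.514444  # 1 knot in metres per second

def half_wavelength_km(filter_length_s, speed_knots):
    """Spatial half-wavelength (km) of a time-domain low-pass filter
    applied along track at a given aircraft ground speed."""
    wavelength_m = speed_knots * KNOT_TO_MS * filter_length_s
    return wavelength_m / 2.0 / 1000.0

short = half_wavelength_km(70, 270)   # shortest filter, slowest speed
long_ = half_wavelength_km(140, 290)  # longest filter, fastest speed
```

    The two extremes come out near 4.9 km and 10.4 km, consistent with the ~5-10 km range quoted, and make the tradeoff explicit: halving either the filter length or the speed halves the half-wavelength.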

  1. Systematic uncertainties on Delta m2 from neutrino physics, using calorimetric energy reconstruction

    SciTech Connect

    Deborah A. Harris

    2003-08-13

    This report describes how uncertainties in neutrino interactions, particularly at neutrino energies of a few GeV, can contribute to uncertainties in measurements of neutrino oscillation parameters for experiments using calorimetric devices. Uncertainties studied include those on final state multiplicities, cross sections, electron-hadron calorimeter differences, and nuclear rescattering.

  2. Defense Additive Manufacturing: DOD Needs to Systematically Track Department-wide 3D Printing Efforts

    DTIC Science & Technology

    2015-10-01

    Clip Additively Manufactured • The Navy installed a 3D printer aboard the USS Essex to demonstrate the ability to additively develop and produce...desired result and vision to have the capability on the fleet. These officials stated that the Navy plans to install 3D printers on two additional...DEFENSE ADDITIVE MANUFACTURING DOD Needs to Systematically Track Department-wide 3D Printing Efforts Report to

  3. Jet energy measurement and its systematic uncertainty in proton-proton collisions at √s = 7 TeV with the ATLAS detector.

    PubMed

    Aad, G; Abajyan, T; Abbott, B; et al. (ATLAS Collaboration)
Urquijo, P; Usai, G; Usanova, A; Vacavant, L; Vacek, V; Vachon, B; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Valladolid Gallego, E; Vallecorsa, S; Valls Ferrer, J A; Van Berg, R; Van Der Deijl, P C; van der Geer, R; van der Graaf, H; Van Der Leeuw, R; van der Ster, D; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vaniachine, A; Vankov, P; Vannucci, F; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vassilakopoulos, V I; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veloso, F; Veneziano, S; Ventura, A; Ventura, D; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Vigne, R; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; Virzi, J; Vitells, O; Viti, M; Vivarelli, I; Vives Vaque, F; Vlachos, S; Vladoiu, D; Vlasak, M; Vogel, A; Vokac, P; Volpi, G; Volpi, M; Volpini, G; von der Schmitt, H; von Radziewski, H; von Toerne, E; Vorobel, V; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vu Anh, T; Vuillermet, R; Vukotic, I; Vykydal, Z; Wagner, W; Wagner, P; Wahrmund, S; Wakabayashi, J; Walch, S; Walder, J; Walker, R; Walkowiak, W; Wall, R; Waller, P; Walsh, B; Wang, C; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, X; Warburton, A; Ward, C P; Wardrope, D R; Warsinsky, M; Washbrook, A; Wasicki, C; Watanabe, I; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, A T; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Webster, J S; Weidberg, A R; Weigell, P; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wendland, D; Weng, Z; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Wessels, M; Wetter, J; Whalen, K; White, A; White, M J; White, R; White, S; Whiteson, D; 
Whittington, D; Wicke, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wijeratne, P A; Wildauer, A; Wildt, M A; Wilhelm, I; Wilkens, H G; Will, J Z; Williams, H H; Williams, S; Willis, W; Willocq, S; Wilson, J A; Wilson, A; Wingerter-Seez, I; Winkelmann, S; Winklmeier, F; Wittgen, M; Wittig, T; Wittkowski, J; Wollstadt, S J; Wolter, M W; Wolters, H; Wong, W C; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wraight, K; Wright, M; Wu, S L; Wu, X; Wu, Y; Wulf, E; Wyatt, T R; Wynne, B M; Xella, S; Xiao, M; Xu, C; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yamada, M; Yamaguchi, H; Yamaguchi, Y; Yamamoto, A; Yamamoto, K; Yamamoto, S; Yamamura, T; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, U K; Yang, Y; Yanush, S; Yao, L; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yen, A L; Yildirim, E; Yilmaz, M; Yoosoofmiya, R; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J; Yuan, L; Yurkewicz, A; Zabinski, B; Zaidan, R; Zaitsev, A M; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zaytsev, A; Zeitnitz, C; Zeman, M; Zemla, A; Zengel, K; Zenin, O; Ženiš, T; Zerwas, D; Zevi Della Porta, G; Zhang, D; Zhang, H; Zhang, J; Zhang, L; Zhang, X; Zhang, Z; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, L; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, R; Zimmermann, S; Zimmermann, S; Zinonos, Z; Ziolkowski, M; Zitoun, R; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zurzolo, G; Zutshi, V; Zwalinski, L

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of √s = 7 TeV corresponding to an integrated luminosity of 4.7 fb⁻¹. Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters R = 0.4 or R = 0.6, and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a Z boson, for 20 ≤ pT^jet < 1000 GeV and pseudorapidities |η| < 4.5. The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region (|η| < 1.2) for jets with 55 ≤ pT^jet < 500 GeV. For central jets at lower pT, the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for pT^jet > 1 TeV. The calibration of forward jets is derived from dijet pT balance measurements. The resulting uncertainty reaches its largest value of 6 % for low-pT jets at |η| = 4.5. Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a

  4. Jet energy measurement and its systematic uncertainty in proton–proton collisions at √s = 7 TeV with the ATLAS detector

    DOE PAGES

    Aad, G.

    2015-01-15

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data with a centre-of-mass energy of \(\sqrt{s}=7\) TeV corresponding to an integrated luminosity of \(4.7\,\text{fb}^{-1}\). Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-\(k_{t}\) algorithm with distance parameters \(R=0.4\) or \(R=0.6\), and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a \(Z\) boson, for \(20 \le p_{\mathrm{T}}^{\mathrm{jet}} < 1000~\mathrm{GeV}\) and pseudorapidities \(|\eta| < 4.5\). The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region (\(|\eta| < 1.2\)) for jets with \(55 \le p_{\mathrm{T}}^{\mathrm{jet}} < 500~\mathrm{GeV}\). For central jets at lower \(p_{\mathrm{T}}\), the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for \(p_{\mathrm{T}}^{\mathrm{jet}} > 1\) TeV. The calibration of forward jets is derived from dijet \(p_{\mathrm{T}}\) balance measurements. The resulting uncertainty reaches its largest value of 6 % for low-\(p_{\mathrm{T}}\) jets at \(|\eta| = 4.5\). In addition, JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed.

  6. Statistical and systematic uncertainties in the event reconstruction and S(1000) determination by the Pierre Auger surface detector

    SciTech Connect

    Ghia, Piera L.; /Gran Sasso

    2005-07-01

    We discuss the statistical and systematic uncertainties in the event reconstruction (core location and determination of S(1000), i.e. the signal at a distance of 1000 m from the shower core) by the Pierre Auger surface detector for showers with zenith angles less than 60 degrees. The reconstruction is based on a maximum likelihood fit in which the reference lateral distribution function is obtained from the experimental data. We also discuss S(1000) as a primary energy estimator.
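
    The core of such a reconstruction can be illustrated with a simplified fit. The sketch below fits a plain power-law lateral distribution function (LDF) to hypothetical station signals by least squares on logarithms; the actual Auger analysis uses a data-driven LDF shape inside a maximum-likelihood fit, so the functional form and every number here are illustrative assumptions.

```python
import numpy as np

# Hypothetical station signals (VEM) at distances r (m) from the shower core.
r = np.array([600.0, 800.0, 1200.0, 1600.0])
s = np.array([120.0, 55.0, 18.0, 8.0])

# Simplified power-law LDF: log S = log S(1000) + beta * log(r / 1000).
# (The Auger reconstruction uses a data-driven LDF and a maximum-likelihood
# fit; ordinary least squares on logs is only an illustration.)
x = np.log(r / 1000.0)
y = np.log(s)
beta, log_s1000 = np.polyfit(x, y, 1)  # slope first, then intercept
s1000 = np.exp(log_s1000)
print(f"beta = {beta:.2f}, S(1000) = {s1000:.1f} VEM")
```

    The interpolated S(1000) falls between the signals of the stations bracketing 1000 m, which is why it is a comparatively robust energy estimator.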

  7. Systematic uncertainty reduction strategies for developing streamflow forecasts utilizing multiple climate models and hydrologic models

    NASA Astrophysics Data System (ADS)

    Singh, Harminder; Sankarasubramanian, A.

    2014-02-01

    Recent studies show that multimodel combinations improve hydroclimatic predictions by reducing model uncertainty. Given that climate forecasts are available from multiple climate models, which could be ingested with multiple watershed models, what is the best strategy to reduce the uncertainty in streamflow forecasts? To address this question, we consider three possible strategies: (1) reduce the input uncertainty first by combining climate models and then use the multimodel climate forecasts with multiple watershed models (MM-P), (2) ingest the individual climate forecasts (without multimodel combination) with various watershed models and then combine the streamflow predictions that arise from all possible combinations of climate and watershed models (MM-Q), and (3) combine the streamflow forecasts obtained from multiple watershed models based on strategy (1) to develop a single streamflow prediction that reduces uncertainty in both climate forecasts and watershed models (MM-PQ). For this purpose, we consider synthetic schemes that generate streamflow and climate forecasts, to compare the performance of the three strategies against the true streamflow generated by a given hydrologic model. Results from the synthetic study show that reducing input uncertainty first (MM-P) by combining climate forecasts yields a smaller error in predicting the true streamflow than multimodel streamflow forecasts obtained by combining the streamflow forecasts from all possible combinations of the individual climate models with the various hydrologic models (MM-Q). Since the true hydrologic model structure is unknown, it is desirable to consider MM-PQ as an alternate choice that reduces both input uncertainty and hydrologic model uncertainty. Application to two watersheds in North Carolina also indicates that reducing the input uncertainty first is critical before reducing the hydrologic model uncertainty.
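
    The difference between the MM-P and MM-Q strategies can be sketched with a toy experiment. Everything below is synthetic and assumed (a threshold runoff response, two unbiased but noisy climate forecasts); it only illustrates why combining the climate inputs before a nonlinear watershed model can beat combining the resulting streamflows.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000

# Synthetic truth: a climate forcing and a threshold runoff response
# (all numbers are illustrative, not taken from the paper).
p_true = rng.normal(100.0, 10.0, n)          # true climate forcing
runoff = lambda p: np.maximum(p - 100.0, 0)  # "true" hydrologic response
q_true = runoff(p_true)

# Two climate models give unbiased but noisy forecasts of the forcing.
p1 = p_true + rng.normal(0.0, 8.0, n)
p2 = p_true + rng.normal(0.0, 8.0, n)

# MM-P: combine the climate forecasts first, then drive the watershed model.
q_mmp = runoff(0.5 * (p1 + p2))

# MM-Q: drive the watershed model with each forecast, then combine streamflows.
q_mmq = 0.5 * (runoff(p1) + runoff(p2))

rmse = lambda q: float(np.sqrt(np.mean((q - q_true) ** 2)))
print(f"MM-P RMSE = {rmse(q_mmp):.2f}, MM-Q RMSE = {rmse(q_mmq):.2f}")
```

    Because the response is nonlinear, averaging the inputs first reduces both the noise and the Jensen-type bias that the threshold introduces; with a purely linear watershed model the two strategies would coincide. A single (true) watershed model is used here for clarity, whereas the paper also combines several watershed models (MM-PQ).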

  8. Reducing model uncertainty effects in flexible manipulators through the addition of passive damping

    NASA Technical Reports Server (NTRS)

    Alberts, T. E.

    1987-01-01

    An important issue in the control of practical systems is the effect of model uncertainty on closed loop performance. This is of particular concern when flexible structures are to be controlled, because states associated with higher frequency vibration modes are truncated in order to make the control problem tractable. Digital simulations of a single-link manipulator system are employed to demonstrate that passive damping added to the flexible member reduces adverse effects associated with model uncertainty. A controller was designed based on a model including only one flexible mode. This controller was applied to larger order systems to evaluate the effects of modal truncation. Simulations using a Linear Quadratic Regulator (LQR) design assuming full state feedback illustrate the effect of control spillover. Simulations of a system using output feedback illustrate the destabilizing effect of observation spillover. The simulations reveal that the system with passive damping is less susceptible to these effects than the untreated case.
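
    The stabilizing role of passive damping can be seen directly in the poles of a truncated mode. The sketch below uses an illustrative modal frequency and damping ratio (not values from the paper): without treatment the truncated mode sits on the imaginary axis, so any spillover energy rings indefinitely, while a modest passive damping ratio moves its poles into the left half-plane so that spillover decays.

```python
import numpy as np

# Second-order state-space model of a single truncated vibration mode:
#   x1' = x2,  x2' = -w^2 x1 - 2 z w x2   (w: modal frequency, z: damping ratio)
w = 25.0  # rad/s, illustrative higher-frequency mode left out of the design model

def mode_poles(z):
    A = np.array([[0.0, 1.0], [-w**2, -2.0 * z * w]])
    return np.linalg.eigvals(A)

re_undamped = max(p.real for p in mode_poles(0.0))   # no passive treatment
re_damped = max(p.real for p in mode_poles(0.05))    # 5% passive damping (assumed)

print("max Re(pole), undamped:", re_undamped)  # 0: marginally stable
print("max Re(pole), damped:  ", re_damped)    # -z*w = -1.25: decaying
```

    In the full closed loop, spillover adds feedback paths into and out of such modes; starting from poles with negative real parts gives the system margin against that destabilizing coupling.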

  9. Systematic uncertainties in RF-based measurement of superconducting cavity quality factors

    SciTech Connect

    Holzbauer, J. P.; Pischalnikov, Yu.; Sergatskov, D. A.; Schappert, W.; Smith, S.

    2016-05-10

    Q0 determinations based on RF power measurements are subject to at least three potentially large systematic effects that have not been previously appreciated. Here, instrumental factors that can systematically bias RF-based measurements of Q0 are quantified and steps that can be taken to improve the determination of Q0 are discussed.
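
    For context, a minimal sketch of how Q0 is typically derived from a decay-time and power-ratio measurement, and how a systematic error in the RF power ratio propagates into Q0. The cavity frequency, decay time, and coupling factor below are assumed illustrative values, not measurements from this work.

```python
import math

f = 1.3e9                   # Hz, illustrative SRF cavity frequency (assumed)
omega = 2.0 * math.pi * f
tau = 1.2                   # s, measured stored-energy decay time (assumed)
beta = 1.0                  # coupling factor inferred from an RF power ratio

q_loaded = omega * tau      # loaded quality factor from the decay measurement
q0 = q_loaded * (1.0 + beta)

# A 10% systematic error in the RF power ratio used to infer beta propagates
# directly into Q0 (here a ~5% bias, since beta = 1):
q0_biased = q_loaded * (1.0 + 1.1 * beta)
print(q0, q0_biased / q0 - 1.0)
```

    The point of the sketch is that Q0 inherits errors from both the decay-time and the power-ratio measurements, which is why the instrumental factors discussed in the paper matter.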

  11. Systematic diffuse optical image errors resulting from uncertainty in the background optical properties

    NASA Astrophysics Data System (ADS)

    Cheng, Xuefeng; Boas, David A.

    1999-04-01

    We investigated the diffuse optical image errors resulting from systematic errors in the background scattering and absorption coefficients, Gaussian noise in the measurements, and the depth at which the image is reconstructed when using a 2D linear reconstruction algorithm for a 3D object. The fourth Born perturbation approach was used to generate reflectance measurements, and k-space tomography was used for the reconstruction. Our simulations using both single and dual wavelengths show large systematic errors in the absolute reconstructed absorption coefficients and corresponding hemoglobin concentrations, while the errors in the relative oxy- and deoxyhemoglobin concentrations are acceptable. The greatest difference arises from a systematic error in the depth at which an image is reconstructed. While an absolute reconstruction of the hemoglobin concentrations can deviate by 100% for a depth error of ±1 mm, the error in the relative concentrations is less than 5%. These results demonstrate that while quantitative diffuse optical tomography is difficult, images of the relative concentrations of oxy- and deoxyhemoglobin are accurate and robust. Other results, not presented, confirm that these findings hold for other linear reconstruction techniques (i.e. SVD and SIRT) as well as for transmission through slab geometries.
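
    The robustness of relative (ratio) quantities can be illustrated with the dual-wavelength unmixing step alone. In this linear sketch, a systematic scale error in the recovered absorption coefficients (standing in for a depth error) corrupts the absolute hemoglobin concentrations but leaves the oxygen-saturation ratio untouched; the extinction coefficients and wavelengths are hypothetical placeholders, not the paper's values.

```python
import numpy as np

# Hypothetical extinction coefficients [HbO2, HbR] (1/(mM*cm)) at two wavelengths.
E = np.array([[0.30, 1.10],   # "690 nm" (illustrative)
              [1.10, 0.78]])  # "830 nm" (illustrative)

true_c = np.array([0.060, 0.040])   # mM: [HbO2, HbR]
mu_a = E @ true_c                   # "measured" absorption coefficients

# A systematic error (e.g. a wrong reconstruction depth) scales both mu_a values.
mu_a_biased = 0.5 * mu_a

c_biased = np.linalg.solve(E, mu_a_biased)

abs_err = abs(c_biased[0] - true_c[0]) / true_c[0]   # absolute HbO2: 50% off
ratio_err = abs(c_biased[0] / c_biased.sum()
                - true_c[0] / true_c.sum())          # saturation: unchanged
print(abs_err, ratio_err)
```

    The real reconstruction problem is nonlinear in depth, so the paper's ~100% vs <5% numbers are not reproduced here; the sketch only shows why a common multiplicative bias cancels in concentration ratios.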

  12. Reactive greenhouse gas scenarios: Systematic exploration of uncertainties and the role of atmospheric chemistry

    NASA Astrophysics Data System (ADS)

    Prather, Michael J.; Holmes, Christopher D.; Hsu, Juno

    2012-05-01

    Knowledge of the atmospheric chemistry of reactive greenhouse gases is needed to accurately quantify the relationship between human activities and climate, and to incorporate uncertainty in our projections of greenhouse gas abundances. We present a method for estimating the fraction of greenhouse gases attributable to human activities, both currently and for future scenarios. Key variables used to calculate the atmospheric chemistry and budgets of major non-CO2 greenhouse gases are codified along with their uncertainties, and then used to project budgets and abundances under the new climate-change scenarios. This new approach uses our knowledge of changing abundances and lifetimes to estimate current total anthropogenic emissions, independently and possibly more accurately than inventory-based scenarios. We derive a present-day atmospheric lifetime for methane (CH4) of 9.1 ± 0.9 y and anthropogenic emissions of 352 ± 45 Tg/y (64% of total emissions). For N2O, corresponding values are 131 ± 10 y and 6.5 ± 1.3 TgN/y (41% of total); and for HFC-134a, the lifetime is 14.2 ± 1.5 y.
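
    The lifetime-based emission estimate rests on one-box budget arithmetic: total emissions equal the atmospheric burden divided by the lifetime, plus the burden growth rate. The sketch below reproduces the order of magnitude for CH4; the burden and growth rate are assumed illustrative values, while the 9.1 y lifetime and 64% anthropogenic fraction come from the abstract.

```python
# One-box budget arithmetic behind lifetime-based emission estimates:
#   total emissions = burden / lifetime + d(burden)/dt
burden_ch4 = 4990.0   # Tg CH4, assumed global burden (~1790 ppb x ~2.78 Tg/ppb)
lifetime = 9.1        # y, from the abstract
growth = 15.0         # Tg/y, assumed present-day burden growth rate

total_emissions = burden_ch4 / lifetime + growth
anthropogenic = 0.64 * total_emissions   # 64% of total, from the abstract

print(f"total = {total_emissions:.0f} Tg/y, anthropogenic = {anthropogenic:.0f} Tg/y")
```

    With these assumed inputs the anthropogenic estimate lands near the abstract's 352 ± 45 Tg/y, showing how abundance and lifetime alone pin down the emission total independently of inventories.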

  13. A new approach to systematic uncertainties and self-consistency in helium abundance determinations

    SciTech Connect

    Aver, Erik; Olive, Keith A.; Skillman, Evan D.

    2010-05-01

    Tests of big bang nucleosynthesis and early universe cosmology require precision measurements for helium abundance determinations. However, efforts to determine the primordial helium abundance via observations of metal-poor H II regions have been limited by significant uncertainties (compared with the value inferred from BBN theory using the CMB-determined value of the baryon density). This work builds upon previous work by providing an updated and extended program in evaluating these uncertainties. Procedural consistency is achieved by integrating the hydrogen-based reddening correction with the helium-based abundance calculation, i.e., all physical parameters are solved for simultaneously. We include new atomic data for helium recombination and collisional emission based upon recent work by Porter et al., and wavelength-dependent corrections to underlying absorption are investigated. The set of physical parameters has been expanded here to include the effects of neutral hydrogen collisional emission. It is noted that Hγ and Hδ allow better isolation of the collisional effects from the reddening. Because of a degeneracy between the solutions for density and temperature, the precision of the helium abundance determinations is limited. Also, at lower temperatures (T ≲ 13,000 K) the neutral hydrogen fraction is poorly constrained, resulting in a larger uncertainty in the helium abundances. Thus, the derived errors on the helium abundances for individual objects are larger than those typical of previous studies. Seven previously analyzed, "high quality" H II region spectra are used for a primordial helium abundance determination. The updated emissivities and neutral hydrogen correction generally raise the abundance. From a regression to zero metallicity, we find Y_p = 0.2561 ± 0.0108, in broad agreement with the WMAP result. Alternatively, a simple average of the data yields Y_p = 0.2566 ± 0.0028. Tests with synthetic data show a potential for distinct
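
    The final step, regression to zero metallicity, is a weighted linear fit of Y against O/H whose intercept is Y_p. A minimal sketch with hypothetical data points (not the seven spectra used in the paper):

```python
import numpy as np

# Hypothetical (O/H, Y) measurements for metal-poor H II regions, with errors;
# illustrative values only.
oh = np.array([3.0, 5.5, 7.0, 9.0, 11.0]) * 1e-5        # O/H abundance
y = np.array([0.2570, 0.2585, 0.2601, 0.2615, 0.2632])  # helium mass fraction
sigma = np.full_like(y, 0.0030)                         # per-object uncertainty

# Weighted linear regression Y = Y_p + slope * (O/H); the intercept Y_p is
# the extrapolation to zero metallicity.
w = 1.0 / sigma**2
A = np.vstack([np.ones_like(oh), oh]).T
cov = np.linalg.inv(A.T @ (w[:, None] * A))   # parameter covariance matrix
y_p, slope = cov @ A.T @ (w * y)
y_p_err = np.sqrt(cov[0, 0])

print(f"Y_p = {y_p:.4f} +/- {y_p_err:.4f}")
```

    The intercept uncertainty is driven by the per-object errors and the metallicity lever arm, which is why the paper's enlarged individual-object errors translate directly into a larger error on Y_p.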

  14. Clinical uncertainties, health service challenges, and ethical complexities of HIV "test-and-treat": a systematic review.

    PubMed

    Kulkarni, Sonali P; Shah, Kavita R; Sarma, Karthik V; Mahajan, Anish P

    2013-06-01

    Despite the HIV "test-and-treat" strategy's promise, questions about its clinical rationale, operational feasibility, and ethical appropriateness have led to vigorous debate in the global HIV community. We performed a systematic review of the literature published between January 2009 and May 2012 using PubMed, SCOPUS, Global Health, Web of Science, BIOSIS, Cochrane CENTRAL, EBSCO Africa-Wide Information, and EBSCO CINAHL Plus databases to summarize clinical uncertainties, health service challenges, and ethical complexities that may affect the test-and-treat strategy's success. A thoughtful approach to research and implementation to address clinical and health service questions and meaningful community engagement regarding ethical complexities may bring us closer to safe, feasible, and effective test-and-treat implementation.

  15. Modeling environmental impacts of urban expansion: a systematic method for dealing with uncertainties.

    PubMed

    Liu, Yi; Yang, Sheng; Chen, Jining

    2012-08-07

    In a rapidly transitioning China, urban land use has changed dramatically, both spatially and in terms of magnitude; these changes have significantly affected the natural environment. This paper reports the development of an Integrated Environmental Assessment of Urban Land Use Change (IEA-ULUC) model, which combines cellular automata, scenario analysis, and stochastic spatial sampling with the goal of exploring urban land-use change, related environmental impacts, and various uncertainties. By applying the IEA-ULUC model to a new urban development area in Dalian in northeastern China, the evolution of spatial patterns from 1986 to 2005 was examined to identify key driving forces affecting the changing trajectories of local land use. Using these results, future urban land use in the period 2005-2020 was projected for four scenarios of economic development and land-use planning regulation. A stochastic sampling process was implemented to generate industrial land distributions for each land expansion scenario. Finally, domestic and industrial water pollution loads to the ocean were estimated, and the environmental impacts of each scenario are discussed. The results showed that the four urban expansion scenarios could lead to considerable differences in environmental responses. In principle, urban expansion scenarios along the intercity transportation rail/roadways could have higher negative environmental impacts than cluster-developing scenarios, while faster economic growth could degrade the environment more severely than moderate growth.
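
    The stochastic-sampling step can be sketched as a simple Monte Carlo over uncertain industrial discharge intensities for two hypothetical expansion scenarios; every coefficient below is invented for illustration, not a value from the Dalian study.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stochastic sampling of chemical-oxygen-demand (COD) loads under two
# illustrative expansion scenarios (hypothetical areas and coefficients).
scenarios = {
    "corridor": {"industrial_km2": 60.0, "population": 9.0e5},
    "cluster":  {"industrial_km2": 45.0, "population": 9.0e5},
}
n = 10_000
loads = {}
for name, s in scenarios.items():
    intensity = rng.normal(8.0, 1.5, n)      # t/y of COD per km2 of industry
    domestic = s["population"] * 2.0e-3      # t/y per capita, held fixed
    loads[name] = s["industrial_km2"] * intensity + domestic
    print(f"{name}: {loads[name].mean():.0f} +/- {loads[name].std():.0f} t/y")
```

    Sampling the uncertain intensity, rather than fixing it, yields a load distribution per scenario, so scenario comparisons can be made with uncertainty bands rather than single numbers.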

  16. M dwarf metallicities and giant planet occurrence: Ironing out uncertainties and systematics

    SciTech Connect

    Gaidos, Eric; Mann, Andrew W.

    2014-08-10

    Comparisons between the planet populations around solar-type stars and those orbiting M dwarfs shed light on the possible dependence of planet formation and evolution on stellar mass. However, such analyses must control for other factors, in particular metallicity, a stellar parameter that strongly influences the occurrence of gas giant planets. We obtained infrared spectra of 121 M dwarf stars monitored by the California Planet Search and determined metallicities with an accuracy of 0.08 dex. The mean and standard deviation of the sample are –0.05 and 0.20 dex, respectively. We parameterized the metallicity dependence of the occurrence of giant planets on orbits with a period less than two years around solar-type stars and applied this to our M dwarf sample to estimate the expected number of giant planets. The number of detected planets (3) is lower than the predicted number (6.4), but the difference is not very significant (12% probability of finding as many or fewer planets). The three M dwarf planet hosts are not especially metal rich and the most likely value of the power-law index relating planet occurrence to metallicity is 1.06 dex per dex for M dwarfs compared to 1.80 for solar-type stars; this difference, however, is comparable to the uncertainties. Giant planet occurrence around both types of stars allows, but does not necessarily require, a mass dependence of ∼1 dex per dex. The actual planet-mass-metallicity relation may be complex, and elucidating it will require larger surveys like those to be conducted by ground-based infrared spectrographs and the Gaia space astrometry mission.
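
    The quoted 12% is the Poisson probability of detecting three or fewer planets when 6.4 are expected, which is easy to verify:

```python
from math import exp, factorial

def poisson_cdf(k, mu):
    """P(N <= k) for a Poisson-distributed count with mean mu."""
    return sum(mu**n / factorial(n) for n in range(k + 1)) * exp(-mu)

expected = 6.4   # giant planets predicted from the metallicity relation
observed = 3     # giant planets actually detected

p = poisson_cdf(observed, expected)
print(f"P(N <= {observed} | mu = {expected}) = {p:.3f}")  # about 0.12
```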

  17. No additional value of fusion techniques on anterior discectomy for neck pain: a systematic review.

    PubMed

    van Middelkoop, Marienke; Rubinstein, Sidney M; Ostelo, Raymond; van Tulder, Maurits W; Peul, Wilco; Koes, Bart W; Verhagen, Arianne P

    2012-11-01

    We aimed to assess the effects of additional fusion on surgical interventions to the cervical spine for patients with neck pain with or without radiculopathy or myelopathy by performing a systematic review. The search strategy outlined by the Cochrane Back Review Group (CBRG) was followed. The primary search was conducted in MEDLINE, EMBASE, CINAHL, CENTRAL and PEDro up to June 2011. Only randomised, controlled trials of adults with neck pain that evaluated at least one clinically relevant primary outcome measure (pain, functional status, recovery) were included. Two authors independently assessed the risk of bias by using the criteria recommended by the CBRG and extracted the data. Data were pooled using a random effects model. The quality of the evidence was rated using the GRADE method. In total, 10 randomised, controlled trials were identified comparing the addition of fusion to anterior decompression techniques, including 2 studies with a low risk of bias. Results revealed no clinically relevant differences in recovery: the pooled risk difference in the short-term follow-up was -0.06 (95% confidence interval -0.22 to 0.10) and -0.07 (95% confidence interval -0.14 to 0.00) in the long-term follow-up. Pooled risk differences for pain and return to work all demonstrated no differences. There is no additional benefit of fusion techniques applied within an anterior discectomy procedure on pain, recovery and return to work.
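
    The pooled risk differences quoted above come from inverse-variance weighting of per-trial risk differences. A minimal fixed-effect sketch with hypothetical trial counts (the review itself pooled ten trials with a random-effects model, which adds a between-trial variance term tau^2 to each trial's variance):

```python
import numpy as np

# Hypothetical per-trial recovery counts (fusion vs discectomy-alone arms);
# illustrative numbers only, not the ten trials in the review.
events_a = np.array([20, 35, 15])   # recovered, fusion arm
n_a = np.array([40, 70, 30])
events_b = np.array([24, 36, 18])   # recovered, comparison arm
n_b = np.array([40, 70, 30])

p_a, p_b = events_a / n_a, events_b / n_b
rd = p_a - p_b                                   # per-trial risk difference
var = p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b

# Fixed-effect inverse-variance pooling.
w = 1.0 / var
rd_pooled = np.sum(w * rd) / np.sum(w)
se = np.sqrt(1.0 / np.sum(w))
ci = (rd_pooled - 1.96 * se, rd_pooled + 1.96 * se)
print(f"pooled RD = {rd_pooled:.3f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

    As in the review's short-term result, a small negative pooled risk difference with a confidence interval spanning zero is read as "no clinically relevant difference".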

  18. Using Principal Component and Tidal Analysis as a Quality Metric for Detecting Systematic Heading Uncertainty in Long-Term Acoustic Doppler Current Profiler Data

    NASA Astrophysics Data System (ADS)

    Morley, M. G.; Mihaly, S. F.; Dewey, R. K.; Jeffries, M. A.

    2015-12-01

    Ocean Networks Canada (ONC) operates the NEPTUNE and VENUS cabled ocean observatories to collect data on physical, chemical, biological, and geological ocean conditions over multi-year time periods. Researchers can download real-time and historical data from a large variety of instruments to study complex earth and ocean processes from their home laboratories. Ensuring that users receive the most accurate data is a high priority at ONC, requiring quality assurance and quality control (QAQC) procedures to be developed for all data types. While some data types have relatively straightforward QAQC tests, such as scalar data range limits based on expected observed values or the measurement limits of the instrument, for other data types the QAQC tests are more comprehensive. Long time series of ocean currents from Acoustic Doppler Current Profilers (ADCPs), stitched together from multiple deployments over many years, are one such data type in which systematic biases are more difficult to identify and correct. Data specialists at ONC are working to quantify systematic compass heading uncertainty in long-term ADCP records at each of the major study sites using the internal compass, remotely operated vehicle bearings, and more analytical tools such as principal component analysis (PCA) to estimate the optimal instrument alignments. In addition to using PCA, some work has been done to estimate the main components of the current at each site using tidal harmonic analysis. This paper describes the key challenges and presents preliminary PCA and tidal analysis approaches used by ONC to improve long-term observatory current measurements.
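
    The PCA step can be sketched directly: the leading eigenvector of the covariance of the (u, v) velocity components gives the principal current axis, and comparing that axis across deployments exposes a heading offset. The synthetic currents below assume a dominant flow along a 30° axis; all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic depth-averaged currents: flow dominantly along a 30-degree axis
# (illustrative; real ONC records span years and many depth bins).
n = 5000
along = rng.normal(0.0, 0.20, n)    # m/s, principal (e.g. tidal) axis
cross = rng.normal(0.0, 0.05, n)    # m/s, minor axis
theta = np.deg2rad(30.0)
u = along * np.cos(theta) - cross * np.sin(theta)
v = along * np.sin(theta) + cross * np.cos(theta)

# PCA of the (u, v) scatter: the leading eigenvector of the covariance matrix
# gives the principal current direction (defined modulo 180 degrees).
cov = np.cov(np.vstack([u, v]))
vals, vecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
principal = vecs[:, np.argmax(vals)]
est_deg = np.degrees(np.arctan2(principal[1], principal[0])) % 180.0

print(f"estimated principal axis: {est_deg:.1f} deg")  # close to 30
```

    A systematic heading error would shift this estimated axis by the same angle in every record of a deployment, which is what makes PCA useful as a cross-deployment consistency check.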

  19. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  20. A multicenter study to quantify systematic variations and associated uncertainties in source positioning with commonly used HDR afterloaders and ring applicators for the treatment of cervical carcinomas

    SciTech Connect

    Awunor, O.; Berger, D.; Kirisits, C.

    2015-08-15

    Purpose: The reconstruction of radiation source position in the treatment planning system is a key part of the applicator reconstruction process in high dose rate (HDR) brachytherapy treatment of cervical carcinomas. The steep dose gradients, of as much as 12%/mm, associated with typical cervix treatments emphasize the importance of accurate and precise determination of source positions. However, a variety of methodologies with a range in associated measurement uncertainties, of up to ±2.5 mm, are currently employed by various centers to do this. In addition, a recent pilot study by Awunor et al. [“Direct reconstruction and associated uncertainties of {sup 192}Ir source dwell positions in ring applicators using gafchromic film in the treatment planning of HDR brachytherapy cervix patients,” Phys. Med. Biol. 58, 3207–3225 (2013)] reported source positional differences of up to 2.6 mm between ring sets of the same type and geometry. This suggests a need for a comprehensive study to assess and quantify systematic source position variations between commonly used ring applicators and HDR afterloaders across multiple centers. Methods: Eighty-six rings from 20 European brachytherapy centers were audited in the form of a postal audit with each center collecting the data independently. The data were collected by setting up the rings using a bespoke jig and irradiating gafchromic films at predetermined dwell positions using four afterloader types, MicroSelectron, Flexitron, GammaMed, and MultiSource, from three manufacturers, Nucletron, Varian, and Eckert & Ziegler BEBIG. Five different ring types in six sizes (Ø25–Ø35 mm) and two angles (45° and 60°) were used. Coordinates of irradiated positions relative to the ring center were determined and collated, and source position differences quantified by ring type, size, and angle. Results: The mean expanded measurement uncertainty (k = 2) along the direction of source travel was ±1.4 mm. The standard deviation
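
    For context, the quoted ±1.4 mm is an expanded uncertainty with coverage factor k = 2, i.e. independent standard-uncertainty components combined in quadrature and then doubled. A sketch with purely illustrative component values (not the audit's actual uncertainty budget):

```python
import math

def expanded_uncertainty(components_mm, k=2.0):
    """Combine independent standard uncertainty components in quadrature
    (GUM approach) and expand with coverage factor k (k = 2 ~ 95% coverage)."""
    u_c = math.sqrt(sum(u ** 2 for u in components_mm))
    return k * u_c

# Hypothetical component budget (mm): film digitisation, jig setup,
# ring geometry, source-position reproducibility -- illustrative values only.
U = expanded_uncertainty([0.3, 0.4, 0.4, 0.3])
</imports>```

    With these illustrative components the combined standard uncertainty is about 0.7 mm, giving an expanded value on the ±1.4 mm scale reported in the study.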

  1. Systematic Uncertainties in the Spectroscopic Measurements of Neutron-star Masses and Radii from Thermonuclear X-Ray Bursts. III. Absolute Flux Calibration

    NASA Astrophysics Data System (ADS)

    Güver, Tolga; Özel, Feryal; Marshall, Herman; Psaltis, Dimitrios; Guainazzi, Matteo; Díaz-Trigo, Maria

    2016-09-01

    Many techniques for measuring neutron star radii rely on absolute flux measurements in the X-rays. As a result, one of the fundamental uncertainties in these spectroscopic measurements arises from the absolute flux calibrations of the detectors being used. Using the stable X-ray burster GS 1826-238 and its simultaneous observations by Chandra HETG/ACIS-S and RXTE/PCA as well as by XMM-Newton EPIC-pn and RXTE/PCA, we quantify the degree of uncertainty in the flux calibration by assessing the differences between the measured fluxes during bursts. We find that the RXTE/PCA and the Chandra gratings measurements agree with each other within their formal uncertainties, increasing our confidence in these flux measurements. In contrast, XMM-Newton EPIC-pn measures 14.0 ± 0.3% less flux than the RXTE/PCA. This is consistent with the previously reported discrepancy between the flux measurements of EPIC-pn and the EPIC MOS1, MOS2, and ACIS-S detectors. We also show that any intrinsic time-dependent systematic uncertainty that may exist in the calibration of the satellites has already been implicitly taken into account in the neutron star radius measurements.

  2. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimating measurement uncertainty is often not trivial. Several strategies that have been developed for this purpose are described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.
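
    One common way to take uncertainty into account in compliance assessment is a guard band: a result is declared compliant only if it stays below the specification limit even after adding its expanded uncertainty. A minimal sketch (hypothetical function name, one-sided upper limit assumed):

```python
def complies(result, u_std, limit, k=2.0):
    """Guarded acceptance: declare compliance only if the result plus its
    expanded uncertainty (k * u_std) remains below the specification limit."""
    return result + k * u_std < limit
```

    With a limit of 50, a result of 48 with standard uncertainty 0.8 passes (48 + 1.6 < 50), while 49 with the same uncertainty does not.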

  3. SU-E-CAMPUS-J-05: Quantitative Investigation of Random and Systematic Uncertainties From Hardware and Software Components in the Frameless 6DBrainLAB ExacTrac System

    SciTech Connect

    Keeling, V; Jin, H; Hossain, S; Ahmad, S; Ali, I

    2014-06-15

    Purpose: To evaluate setup accuracy and quantify individual systematic and random errors for the various hardware and software components of the frameless 6D-BrainLAB ExacTrac system. Methods: 35 patients with cranial lesions, some with multiple isocenters (50 total lesions treated in 1, 3, or 5 fractions), were investigated. All patients were simulated with a rigid head-and-neck mask and the BrainLAB localizer. CT images were transferred to the IPLAN treatment planning system, where optimized plans were generated using a stereotactic reference frame based on the localizer. The patients were set up initially with the infrared (IR) positioning ExacTrac system. Stereoscopic X-ray images (XC: X-ray Correction) were registered to their corresponding digitally reconstructed radiographs, based on bony anatomy matching, to calculate 6D translational and rotational (lateral, longitudinal, vertical, pitch, roll, yaw) shifts. XC combines the systematic errors of the mask, localizer, image registration, frame, and IR. If shifts were below tolerance (0.7 mm translational and 1 degree rotational), treatment was initiated; otherwise corrections were applied and additional X-rays were acquired to verify patient position (XV: X-ray Verification). Statistical analysis was used to extract systematic and random errors of the different components of the 6D ExacTrac system and evaluate the cumulative setup accuracy. Results: Mask systematic errors (translational; rotational) were the largest and varied from one patient to another in the range (-15 to 4 mm; -2.5 to 2.5 degree), obtained from the mean of XC for each patient. Setup uncertainty in IR positioning (0.97, 2.47, 1.62 mm; 0.65, 0.84, 0.96 degree) was extracted from the standard deviation of XC. Combined systematic errors of the frame and localizer (0.32, -0.42, -1.21 mm; -0.27, 0.34, 0.26 degree) were extracted from the mean of means of the XC distributions. Final patient setup uncertainty was obtained from the standard deviations of XV (0.57, 0.77, 0.67 mm, 0
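
    The decomposition described above can be sketched as follows: per-patient means of the correction shifts carry the systematic component (their mean is the overall offset, their spread the group systematic error), while the within-patient scatter carries the random component. The function name is hypothetical and the sketch handles one axis at a time:

```python
import statistics

def setup_error_components(shifts_by_patient):
    """Separate systematic and random setup errors from per-patient shift
    lists (one axis, mm):
    - overall systematic offset = mean of per-patient means,
    - group systematic error (Sigma) = SD of per-patient means,
    - random error (sigma) = RMS of per-patient SDs."""
    means = [statistics.mean(s) for s in shifts_by_patient]
    sds = [statistics.stdev(s) for s in shifts_by_patient]
    overall_mean = statistics.mean(means)
    sigma_sys = statistics.stdev(means)
    sigma_rand = (sum(sd ** 2 for sd in sds) / len(sds)) ** 0.5
    return overall_mean, sigma_sys, sigma_rand
```

    Applied per component (lateral, longitudinal, vertical, pitch, roll, yaw), this yields tuples of the form reported in the abstract.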

  4. SYSTEMATIC UNCERTAINTIES IN THE SPECTROSCOPIC MEASUREMENTS OF NEUTRON-STAR MASSES AND RADII FROM THERMONUCLEAR X-RAY BURSTS. II. EDDINGTON LIMIT

    SciTech Connect

    Güver, Tolga; Özel, Feryal; Psaltis, Dimitrios

    2012-03-01

    Time-resolved X-ray spectroscopy of thermonuclear bursts observed from low-mass X-ray binaries offers a unique tool to measure neutron-star masses and radii. In this paper, we continue our systematic analysis of all the X-ray bursts observed with the Rossi X-ray Timing Explorer from X-ray binaries. We determine the events that show clear evidence for photospheric radius expansion and measure the Eddington limits for these accreting neutron stars using the bolometric fluxes attained at the touchdown moments of each X-ray burst. We employ a Bayesian technique to investigate the degree to which the Eddington limit for each source remains constant between bursts. We find that for sources with a large number of radius expansion bursts, systematic uncertainties are at a 5%-10% level. Moreover, in six sources with only pairs of Eddington-limited bursts, the distribution of fluxes is consistent with a ~10% fractional dispersion. This indicates that the spectroscopic measurements of neutron-star masses and radii using thermonuclear X-ray bursts can reach the level of accuracy required to distinguish between different neutron-star equations of state, provided that uncertainties related to the overall flux calibration of X-ray detectors are of comparable magnitude.

  5. Systematics of the family Plectopylidae in Vietnam with additional information on Chinese taxa (Gastropoda, Pulmonata, Stylommatophora)

    PubMed Central

    Páll-Gergely, Barna; Hunyadi, András; Ablett, Jonathan; Lương, Hào Văn; Naggs, Fred; Asami, Takahiro

    2015-01-01

    Abstract Vietnamese species from the family Plectopylidae are revised based on the type specimens of all known taxa, more than 600 historical non-type museum lots, and almost 200 newly-collected samples. Altogether more than 7000 specimens were investigated. The revision has revealed that species diversity of the Vietnamese Plectopylidae was previously overestimated. Overall, thirteen species names (anterides Gude, 1909, bavayi Gude, 1901, congesta Gude, 1898, fallax Gude, 1909, gouldingi Gude, 1909, hirsuta Möllendorff, 1901, jovia Mabille, 1887, moellendorffi Gude, 1901, persimilis Gude, 1901, pilsbryana Gude, 1901, soror Gude, 1908, tenuis Gude, 1901, verecunda Gude, 1909) were synonymised with other species. In addition to these, Gudeodiscus hemmeni sp. n. and Gudeodiscus messageri raheemi ssp. n. are described from north-western Vietnam. Sixteen species and two subspecies are recognized from Vietnam. The reproductive anatomy of eight taxa is described. Based on anatomical information, Halongella gen. n. is erected to include Plectopylis schlumbergeri and Plectopylis fruhstorferi. Additionally, the genus Gudeodiscus is subdivided into two subgenera (Gudeodiscus and Veludiscus subgen. n.) on the basis of the morphology of the reproductive anatomy and the radula. The Chinese Gudeodiscus phlyarius werneri Páll-Gergely, 2013 is moved to synonymy of Gudeodiscus phlyarius. A spermatophore was found in the organ situated next to the gametolytic sac in one specimen. This suggests that this organ in the Plectopylidae is a diverticulum. Statistically significant evidence is presented for the presence of calcareous hook-like granules inside the penis being associated with the absence of embryos in the uterus in four genera. This suggests that these probably play a role in mating periods before disappearing when embryos develop. Sicradiscus mansuyi is reported from China for the first time. PMID:25632253

  6. Impacts of nitrogen addition on plant biodiversity in mountain grasslands depend on dose, application duration and climate: a systematic review.

    PubMed

    Humbert, Jean-Yves; Dwyer, John M; Andrey, Aline; Arlettaz, Raphaël

    2016-01-01

    Although the influence of nitrogen (N) addition on grassland plant communities has been widely studied, it is still unclear whether observed patterns and underlying mechanisms are constant across biomes. In this systematic review, we use meta-analysis and metaregression to investigate the influence of N addition (here referring mostly to fertilization) upon the biodiversity of temperate mountain grasslands (including montane, subalpine and alpine zones). Forty-two studies met our criteria of inclusion, resulting in 134 measures of effect size. The main general responses of mountain grasslands to N addition were increases in phytomass and reductions in plant species richness, as observed in lowland grasslands. More specifically, the analysis reveals that negative effects on species richness were exacerbated by dose (ha⁻¹ year⁻¹) and duration of N application (years) in an additive manner. Thus, sustained application of low to moderate levels of N over time had effects similar to short-term application of high N doses. The climatic context also played an important role: the overall effects of N addition on plant species richness and diversity (Shannon index) were less pronounced in mountain grasslands experiencing cool rather than warm summers. Furthermore, the relative negative effect of N addition on species richness was more pronounced in managed communities and was strongly negatively related to N-induced increases in phytomass, that is, the greater the phytomass response to N addition, the greater the decline in richness. Altogether, this review not only establishes that plant biodiversity of mountain grasslands is negatively affected by N addition, but also demonstrates that several local management and abiotic factors interact with N addition to drive plant community changes.
This synthesis yields essential information for a more sustainable management of mountain grasslands, emphasizing the importance of preserving and restoring grasslands with both low

  7. Characterization factors for terrestrial acidification at the global scale: a systematic analysis of spatial variability and uncertainty.

    PubMed

    Roy, Pierre-Olivier; Azevedo, Ligia B; Margni, Manuele; van Zelm, Rosalie; Deschênes, Louise; Huijbregts, Mark A J

    2014-12-01

    Characterization factors (CFs) are used in life cycle assessment (LCA) to quantify the potential impact per unit of emission. CFs are obtained from a characterization model which assesses the environmental mechanisms along the cause-effect chain linking an emission to its potential damage on a given area of protection, such as loss in ecosystem quality. Until now, CFs for acidifying emissions did not cover the global scale and were only representative of the geographical scope of their characterization model. Consequently, current LCA practice implicitly assumes that all emissions from a global supply chain occur within the continent corresponding to the geographical scope of the characterization method. This paper provides worldwide 2° × 2.5° spatially explicit CFs, representing the change in relative loss of terrestrial vascular plant species due to an emission change of nitrogen oxides (NOx), ammonia (NH3) and sulfur dioxide (SO2). We found that spatial variability in the CFs is much larger than the statistical uncertainty (six orders of magnitude vs. two orders of magnitude). Spatial variability is mainly caused by the atmospheric fate factor and the soil sensitivity factor, while the ecological effect factor is the dominant contributor to the statistical uncertainty. The CFs provided in our study allow the worldwide spatially explicit evaluation of life cycle impacts related to acidifying emissions. This opens the door to evaluating the regional life cycle emissions of different products in a global economy.

  8. A Software Toolkit to Study Systematic Uncertainties of the Physics Models of the Geant4 Simulation Package

    SciTech Connect

    Genser, Krzysztof; Hatcher, Robert; Perdue, Gabriel; Wenzel, Hans; Yarba, Julia; Kelsey, Michael; Wright, Dennis H.

    2016-11-10

    The Geant4 toolkit is used to model interactions between particles and matter. Geant4 employs a set of validated physics models that span a wide range of interaction energies. These models are tuned to cover a large variety of possible applications. This raises the critical question of what uncertainties are associated with the Geant4 physics model, or group of models, involved in a simulation project. To address the challenge, we have designed and implemented a comprehensive, modular, user-friendly software toolkit that allows the variation of one or more parameters of one or more Geant4 physics models involved in simulation studies. It also enables analysis of multiple variants of the resulting physics observables of interest in order to estimate the uncertainties associated with the simulation model choices. Key functionalities of the toolkit are presented in this paper and are illustrated with selected results.
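
    The core loop of such a study can be sketched generically: vary one model parameter around its nominal value, rerun the simulation, and record the spread of the observable of interest. The simulate callable below is a stand-in for a Geant4 run, not the toolkit's actual API:

```python
def vary_parameter(simulate, nominal, rel_variations, n_events=1000):
    """Run a simulation for several variants of one physics-model parameter
    and report the spread of an observable. `simulate(value, n_events)` is
    a hypothetical stand-in for a full simulation run."""
    results = {}
    for rel in rel_variations:
        value = nominal * (1.0 + rel)        # e.g. -10%, nominal, +10%
        results[rel] = simulate(value, n_events)
    spread = max(results.values()) - min(results.values())
    return results, spread

# Toy stand-in: observable responds linearly to the parameter.
results, spread = vary_parameter(lambda p, n: 5.0 * p, 1.0, [-0.1, 0.0, 0.1])
```

    The spread across variants is then one ingredient of the model-choice uncertainty estimate; a real study would also compare entirely different physics models.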

  9. Impact of contacting study authors to obtain additional data for systematic reviews: diagnostic accuracy studies for hepatic fibrosis

    PubMed Central

    2014-01-01

    Background Seventeen of 172 included studies in a recent systematic review of blood tests for hepatic fibrosis or cirrhosis reported diagnostic accuracy results discordant from 2 × 2 tables, and 60 studies reported inadequate data to construct 2 × 2 tables. This study explores the yield of contacting authors of diagnostic accuracy studies and the impact on the systematic review findings. Methods Sixty-six corresponding authors were sent letters requesting additional information or clarification of data from 77 studies. Data received from the authors were synthesized with data included in the previous review, and diagnostic accuracy sensitivities, specificities, and positive and negative likelihood ratios were recalculated. Results Of the 66 authors, 68% were successfully contacted and 42% provided additional data for 29 out of 77 studies (38%). All authors who provided data did so by the third emailed request (ten authors provided data after one request). Authors of more recent studies were more likely to be located and provide data compared to authors of older studies. The effects of requests for additional data on the conclusions regarding the utility of blood tests to identify patients with clinically significant fibrosis or cirrhosis were generally small for ten out of 12 tests. Additional data resulted in reclassification (using median likelihood ratio estimates) from less useful to moderately useful or vice versa for the remaining two blood tests and enabled the calculation of an estimate for a third blood test for which previously the data had been insufficient to do so. We did not identify a clear pattern for the directional impact of additional data on estimates of diagnostic accuracy. Conclusions We successfully contacted and received results from 42% of authors who provided data for 38% of included studies. Contacting authors of studies evaluating the diagnostic accuracy of serum biomarkers for hepatic fibrosis and cirrhosis in hepatitis C patients
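
    The recalculated metrics are standard functions of the 2 × 2 table. A minimal sketch (the function name is hypothetical; tp/fp/fn/tn are the table cells):

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Recompute standard diagnostic accuracy metrics from a 2x2 table:
    sensitivity, specificity, positive and negative likelihood ratios."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)          # positive likelihood ratio
    lr_neg = (1 - sens) / spec          # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg
```

    For example, a table with 90 true positives, 20 false positives, 10 false negatives and 80 true negatives gives sensitivity 0.90, specificity 0.80, LR+ 4.5 and LR- 0.125.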

  10. The additional effect of orthotic devices on exercise therapy for patients with patellofemoral pain syndrome: a systematic review.

    PubMed

    Swart, Nynke M; van Linschoten, Robbart; Bierma-Zeinstra, Sita M A; van Middelkoop, Marienke

    2012-06-01

    The aim of this study was to determine the additional effect of orthotic devices over exercise therapy on pain and function in patients with patellofemoral pain syndrome (PFPS). A systematic literature search was conducted in MEDLINE, CINAHL, EMBASE, Cochrane and PEDro. Randomised controlled trials and controlled clinical trials of patients diagnosed with PFPS evaluating a clinically relevant outcome were included. Treatment had to include exercise therapy combined with orthotics, compared with an identical exercise programme with or without sham orthotics. Data were summarised using a best evidence synthesis. Eight trials fulfilled the inclusion criteria, of which three had a low risk of bias. There is moderate evidence for no additive effectiveness of knee braces over exercise therapy on pain (effect sizes (ES) varied from -0.14 to 0.04) and conflicting evidence on function (ES -0.33). There is moderate evidence for no difference between knee braces plus exercise therapy versus placebo knee braces plus exercise therapy on pain and function (ES -0.10 to 0.10). More studies of high methodological quality are needed to draw definitive conclusions.

  11. Adenomyomatosis of the gallbladder in childhood: A systematic review of the literature and an additional case report

    PubMed Central

    Parolini, Filippo; Indolfi, Giuseppe; Magne, Miguel Garcia; Salemme, Marianna; Cheli, Maurizio; Boroni, Giovanni; Alberti, Daniele

    2016-01-01

    AIM: To investigate the diagnostic and therapeutic assessment in children with adenomyomatosis of the gallbladder (AMG). METHODS: AMG is a degenerative disease characterized by a proliferation of the mucosal epithelium which deeply invaginates and extends into the thickened muscular layer of the gallbladder, causing intramural diverticula. Although AMG is found in up to 5% of cholecystectomy specimens in adult populations, this condition is extremely uncommon in childhood. The authors provide a detailed systematic review of the pediatric literature according to PRISMA guidelines, focusing on diagnostic and therapeutic assessment. An additional case of AMG is also presented. RESULTS: Five studies were finally included, encompassing 5 children with AMG. Analysis was extended to our additional 11-year-old patient, who presented with diffuse AMG and pancreatic acinar metaplasia of the gallbladder mucosa and was successfully managed with laparoscopic cholecystectomy. Mean age at presentation was 7.2 years. Unspecific abdominal pain was the commonest symptom. Abdominal ultrasound was performed on all patients, with a diagnostic accuracy of 100%. Five patients underwent cholecystectomy and were asymptomatic at follow-up. In the remaining patient, completely asymptomatic at diagnosis, a conservative approach with monthly monitoring via ultrasonography was undertaken. CONCLUSION: Considering the remote but possible degeneration leading to cancer and the feasibility of laparoscopic cholecystectomy even in small children, evidence suggests that elective laparoscopic cholecystectomy represents the treatment of choice. Pre-operative evaluation of the extrahepatic biliary tree anatomy with cholangio-MRI is strongly recommended. PMID:27170933

  12. Network planning under uncertainties

    NASA Astrophysics Data System (ADS)

    Ho, Kwok Shing; Cheung, Kwok Wai

    2008-11-01

    One of the main focuses of network planning is the optimization of the network resources required to build a network under a certain traffic demand projection. Traditionally, the inputs to this type of network planning problem are treated as deterministic. In reality, varying traffic requirements and fluctuations in network resources can cause uncertainties in the decision models. Failure to include these uncertainties in the network design process can severely affect the feasibility and economics of the network. Therefore, it is essential to find a solution that is insensitive to the uncertain conditions during the network planning process. As early as the 1960s, network planning problems with varying traffic requirements over time were studied. This kind of network planning problem is still being actively researched, especially for VPN network design. Another kind of network planning problem under uncertainty that has been studied actively in the past decade addresses fluctuations in network resources. One such hotly pursued research topic is survivable network planning. It considers the design of a network under the uncertainties brought by fluctuations in topology, to meet the requirement that the network remains intact for up to a certain number of faults occurring anywhere in the network. Recently, the authors proposed a new planning methodology called Generalized Survivable Network that tackles the network design problem under both varying traffic requirements and fluctuations of topology. Although all the above network planning problems handle various kinds of uncertainties, it is hard to find a generic framework, covering more general uncertainty conditions, that allows a more systematic way of solving the problems. With a unified framework, the seemingly diverse models and algorithms can be intimately related, and possibly more insights and improvements can be brought out for solving the problem. This motivates us to seek a

  13. Additive Synergism between Asbestos and Smoking in Lung Cancer Risk: A Systematic Review and Meta-Analysis

    PubMed Central

    Ngamwong, Yuwadee; Tangamornsuksan, Wimonchat; Lohitnavy, Ornrat; Chaiyakunapruk, Nathorn; Scholfield, C. Norman; Reisfeld, Brad; Lohitnavy, Manupat

    2015-01-01

    Smoking and asbestos exposure are important risks for lung cancer. Several epidemiological studies have linked asbestos exposure and smoking to lung cancer. To reconcile and unify these results, we conducted a systematic review and meta-analysis to provide a quantitative estimate of the increased risk of lung cancer associated with asbestos exposure and cigarette smoking and to classify their interaction. Five electronic databases were searched from inception to May, 2015 for observational studies on lung cancer. All case-control (N = 10) and cohort (N = 7) studies were included in the analysis. We calculated pooled odds ratios (ORs), relative risks (RRs) and 95% confidence intervals (CIs) using a random-effects model for the association of asbestos exposure and smoking with lung cancer. Lung cancer patients who were not exposed to asbestos and non-smoking (A-S-) were compared with: (i) asbestos-exposed and non-smoking (A+S-), (ii) non-exposure to asbestos and smoking (A-S+), and (iii) asbestos-exposed and smoking (A+S+). Our meta-analysis showed a significant difference in risk of developing lung cancer among asbestos exposed and/or smoking workers compared to controls (A-S-); odds ratios for the disease (95% CI) were (i) 1.70 (A+S-, 1.31–2.21), (ii) 5.65 (A-S+, 3.38–9.42), and (iii) 8.70 (A+S+, 5.8–13.10). The additive interaction index of synergy was 1.44 (95% CI = 1.26–1.77) and the multiplicative index = 0.91 (95% CI = 0.63–1.30). Corresponding values for cohort studies were 1.11 (95% CI = 1.00–1.28) and 0.51 (95% CI = 0.31–0.85). Our results point to an additive synergism for lung cancer with co-exposure of asbestos and cigarette smoking. Assessments of industrial health risks should take smoking and other airborne health risks into account when setting occupational asbestos exposure limits. PMID:26274395
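
    The two interaction indices reported above follow directly from the pooled odds ratios. A sketch using Rothman's synergy index S and the multiplicative interaction index V, with the case-control estimates from the abstract (the function name is hypothetical):

```python
def interaction_indices(or_both, or_a_only, or_b_only):
    """Rothman's additive synergy index S and the multiplicative
    interaction index V, from odds ratios relative to the doubly
    unexposed group (A-S-)."""
    s = (or_both - 1) / ((or_a_only - 1) + (or_b_only - 1))
    v = or_both / (or_a_only * or_b_only)
    return s, v

# Pooled case-control estimates from the abstract:
s, v = interaction_indices(8.70, 1.70, 5.65)   # S ~ 1.44, V ~ 0.91
```

    S > 1 indicates a more-than-additive joint effect, while V near 1 would indicate a purely multiplicative one, matching the abstract's conclusion of additive synergism.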

  15. Uncertainty Analysis of Seebeck Coefficient and Electrical Resistivity Characterization

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    In order to provide a complete description of a material's thermoelectric power factor, an uncertainty interval is required in addition to the measured nominal value. The uncertainty may contain sources of measurement error including systematic bias error and precision error of a statistical nature. The work focuses specifically on the popular ZEM-3 (Ulvac Technologies) measurement system, but the methods apply to any measurement system. The analysis accounts for sources of systematic error including sample preparation tolerance, measurement probe placement, thermocouple cold-finger effect, and measurement parameters, in addition to uncertainty of a statistical nature. Complete uncertainty analysis of a measurement system allows for more reliable comparison of measurement data between laboratories.
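
    As an illustration of how such component uncertainties propagate into the power factor PF = S²/ρ: for independent errors the relative uncertainties add in quadrature, with the Seebeck term doubled because S enters squared. The numbers below are hypothetical, not measurements from the ZEM-3:

```python
import math

def power_factor_uncertainty(seebeck, u_seebeck, rho, u_rho):
    """Propagate independent uncertainties into the thermoelectric power
    factor PF = S^2 / rho. Returns (pf, relative_uncertainty)."""
    pf = seebeck ** 2 / rho
    rel = math.sqrt((2.0 * u_seebeck / seebeck) ** 2 + (u_rho / rho) ** 2)
    return pf, rel

# Hypothetical measurement: S = 200 +/- 4 uV/K, rho = 1.0e-5 +/- 3.0e-7 ohm m
pf, rel = power_factor_uncertainty(200e-6, 4e-6, 1.0e-5, 3.0e-7)
```

    Here a 2% Seebeck uncertainty and a 3% resistivity uncertainty combine to a 5% relative uncertainty on the power factor, showing why the doubled Seebeck term usually dominates the budget.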

  16. Addition of Ezetimibe to statins for patients at high cardiovascular risk: Systematic review of patient-important outcomes.

    PubMed

    Fei, Yutong; Guyatt, Gordon Henry; Alexander, Paul Elias; El Dib, Regina; Siemieniuk, Reed A C; Vandvik, Per Olav; Nunnally, Mark E; Gomaa, Huda; Morgan, Rebecca L; Agarwal, Arnav; Zhang, Ying; Bhatnagar, Neera; Spencer, Frederick A

    2017-01-16

    Ezetimibe is widely used in combination with statins to reduce low-density lipoprotein. We sought to examine the impact of ezetimibe when added to statins on patient-important outcomes. Medline, EMBASE, CINAHL, and CENTRAL were searched through July 2016. Randomized controlled trials (RCTs) of ezetimibe combined with statins versus statins alone that followed patients for at least 6 months and reported on at least one of all-cause mortality, cardiovascular deaths, non-fatal myocardial infarctions (MI), and non-fatal strokes were included. Pairs of reviewers extracted study data and assessed risk of bias independently and in duplicate. Quality of evidence was assessed using the GRADE approach. We conducted a narrative review with complementary subgroup and sensitivity analyses. The IMPROVE-IT study enrolled 93% of all patients enrolled in the 8 included trials. Our analysis of the IMPROVE-IT study results showed that in patients at high risk of cardiovascular events, ezetimibe added to statins was associated with i) a likely reduction in non-fatal MI (17 fewer/1000 treated over 6 years, moderate certainty in evidence); ii) a possible reduction in non-fatal stroke (6 fewer/1000 treated over 6 years, low certainty); iii) no impact on myopathy (moderate certainty); iv) potentially no impact on all-cause mortality and cardiovascular death (both moderate certainty); and v) possibly no impact on cancer (low certainty). Addition of ezetimibe to moderate-dose statins is likely to result in 17 fewer MIs and possibly 6 fewer strokes/1000 treated over 6 years but is unlikely to reduce all-cause mortality or cardiovascular death. Patients who place a high value on a small absolute reduction in MI and are not averse to use of an additional medication over a long duration may opt for ezetimibe in addition to statin therapy. Our analysis revealed no increased specific harms associated with addition of ezetimibe to statins.

  17. Preventive zinc supplementation for children, and the effect of additional iron: a systematic review and meta-analysis

    PubMed Central

    Mayo-Wilson, Evan; Imdad, Aamer; Junior, Jean; Dean, Sohni; Bhutta, Zulfiqar A

    2014-01-01

    Objective Zinc deficiency is widespread, and preventive supplementation may have benefits in young children. Effects for children over 5 years of age, and effects when coadministered with other micronutrients, are uncertain. These are obstacles to scale-up. This review seeks to determine if preventive supplementation reduces mortality and morbidity for children aged 6 months to 12 years. Design Systematic review conducted with the Cochrane Developmental, Psychosocial and Learning Problems Group. Two reviewers independently assessed studies. Meta-analyses were performed for mortality, illness and side effects. Data sources We searched multiple databases, including CENTRAL and MEDLINE, in January 2013. Authors were contacted for missing information. Eligibility criteria for selecting studies Randomised trials of preventive zinc supplementation. Hospitalised children and children with chronic diseases were excluded. Results 80 randomised trials with 205 401 participants were included. There was a small but non-significant effect on all-cause mortality (risk ratio (RR) 0.95 (95% CI 0.86 to 1.05)). Supplementation may reduce the incidence of all-cause diarrhoea (RR 0.87 (0.85 to 0.89)), but there was evidence of reporting bias. There was no evidence of an effect on the incidence or prevalence of respiratory infections or malaria. There was moderate quality evidence of a very small effect on linear growth (standardised mean difference 0.09 (0.06 to 0.13)) and an increase in vomiting (RR 1.29 (1.14 to 1.46)). There was no evidence of an effect on iron status. Comparisons of zinc with and without iron cosupplementation, and direct comparisons of zinc plus iron versus zinc alone, favoured the cointervention for some outcomes and zinc alone for others. Effects may be larger for children over 1 year of age, but most differences were not significant. Conclusions Benefits of preventive zinc supplementation may outweigh any potentially adverse effects in areas where

  18. Deriving uncertainty factors for threshold chemical contaminants in drinking water.

    PubMed

    Ritter, Leonard; Totman, Céline; Krishnan, Kannan; Carrier, Richard; Vézina, Anne; Morisset, Véronique

    2007-10-01

    Uncertainty factors are used in the development of drinking-water guidelines to account for uncertainties in the database, including extrapolations of toxicity from animal studies and variability within humans, which result in some uncertainty about risk. The application of uncertainty factors is entrenched in toxicological risk assessment worldwide, but is not applied consistently. This report, prepared in collaboration with Health Canada, provides an assessment of the derivation of the uncertainty factor assumptions used in developing drinking-water quality guidelines for chemical contaminants. Assumptions used by Health Canada in the development of guidelines were compared to several other major regulatory jurisdictions. This assessment has revealed that uncertainty factor assumptions have been substantially influenced by historical practice. While the application of specific uncertainty factors appears to be well entrenched in regulatory practice, a well-documented and disciplined basis for the selection of these factors was not apparent in any of the literature supporting the default assumptions of Canada, the United States, Australia, or the World Health Organization. While there is a basic scheme used in most cases in developing drinking-water quality guidelines for nonthreshold contaminants by the jurisdictions included in this report, additional factors are sometimes included to account for other areas of uncertainty. These factors may include extrapolating subchronic data to anticipated chronic exposure, or use of a LOAEL instead of a NOAEL. The default value attributed to each uncertainty factor is generally a factor of 3 or 10; however, again, no comprehensive guidance to develop and apply these additional uncertainty factors was evident from the literature reviewed. 
A decision tree has been developed to provide guidance for the selection of appropriate uncertainty factors, to account for the range of uncertainty encountered in the risk assessment process.

  19. Model Uncertainty, Earthquake Hazard, and the WGCEP-2002 Forecast

    NASA Astrophysics Data System (ADS)

    Page, M. T.; Carlson, J. M.

    2005-12-01

    Model uncertainty is prevalent in Probabilistic Seismic Hazard Analysis (PSHA) because the true mechanism generating risk is unknown. While it is well-understood how to incorporate parameter uncertainty in PSHA, model uncertainty is more difficult to incorporate due to the high degree of dependence between different earthquake-recurrence models. We find that the method used by the 2002 Working Group on California Earthquake Probabilities (WG02) to combine the probability distributions given by multiple models has several adverse effects on their result. In particular, taking a linear combination of the various models ignores issues of model dependence and leads to large uncertainties in the final hazard estimate. Furthermore, choosing model weights based on data can systematically bias the final probability distribution. The weighting scheme of the WG02 report also depends upon an arbitrary ordering of models. In addition to analyzing current statistical problems, we present alternative methods for rigorously incorporating model uncertainty into PSHA.
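    To make the critiqued combination step concrete, here is a small illustrative sketch, with hypothetical rupture probabilities and expert weights (not values from the WG02 report): a linear (weighted) mixture of recurrence-model probabilities, together with the between-model variance that such a combination carries into the final hazard estimate.

```python
import numpy as np

# Hypothetical 30-year rupture probabilities from three recurrence models
# and expert weights (illustrative values only, not from the WG02 report).
probs = np.array([0.15, 0.25, 0.40])
weights = np.array([0.5, 0.3, 0.2])   # sums to 1

# Linear combination of the models, as in the critiqued weighting scheme.
mixture_mean = float(np.sum(weights * probs))

# Between-model variance: the spread that a linear combination of
# dependent models carries into the final hazard estimate.
between_model_var = float(np.sum(weights * (probs - mixture_mean) ** 2))
print(mixture_mean, between_model_var)
```

    With these assumed numbers the combined probability is 0.23, while the between-model term remains large, which is the kind of inflated final uncertainty the abstract describes.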

  20. Assessing uncertainties in land cover projections.

    PubMed

    Alexander, Peter; Prestele, Reinhard; Verburg, Peter H; Arneth, Almut; Baranzelli, Claudia; Batista E Silva, Filipe; Brown, Calum; Butler, Adam; Calvin, Katherine; Dendoncker, Nicolas; Doelman, Jonathan C; Dunford, Robert; Engström, Kerstin; Eitelberg, David; Fujimori, Shinichiro; Harrison, Paula A; Hasegawa, Tomoko; Havlik, Petr; Holzhauer, Sascha; Humpenöder, Florian; Jacobs-Crisioni, Chris; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Lavalle, Carlo; Lenton, Tim; Liu, Jiayi; Meiyappan, Prasanth; Popp, Alexander; Powell, Tom; Sands, Ronald D; Schaldach, Rüdiger; Stehfest, Elke; Steinbuks, Jevgenijs; Tabeau, Andrzej; van Meijl, Hans; Wise, Marshall A; Rounsevell, Mark D A

    2017-02-01

    Understanding uncertainties in land cover projections is critical to investigating land-based climate mitigation policies, assessing the potential of climate adaptation strategies and quantifying the impacts of land cover change on the climate system. Here, we identify and quantify uncertainties in global and European land cover projections over a diverse range of model types and scenarios, extending the analysis beyond the agro-economic models included in previous comparisons. The results from 75 simulations across 18 models are analysed and show a large range in land cover area projections, with the highest variability occurring in future cropland areas. We demonstrate systematic differences in land cover areas associated with the characteristics of the modelling approach, which are at least as great as the differences attributed to the scenario variations. The results lead us to conclude that a higher degree of uncertainty exists in land use projections than currently included in climate or earth system projections. To account for land use uncertainty, it is recommended to use a diverse set of models and approaches when assessing the potential impacts of land cover change on future climate. Additionally, further work is needed to better understand the assumptions driving land use model results and reveal the causes of uncertainty in more depth, to help reduce model uncertainty and improve the projections of land cover.

  1. Systematic Dissection of Coding Exons at Single Nucleotide Resolution Supports an Additional Role in Cell-Specific Transcriptional Regulation

    PubMed Central

    Kim, Mee J.; Findlay, Gregory M.; Martin, Beth; Zhao, Jingjing; Bell, Robert J. A.; Smith, Robin P.; Ku, Angel A.; Shendure, Jay; Ahituv, Nadav

    2014-01-01

    In addition to their protein coding function, exons can also serve as transcriptional enhancers. Mutations in these exonic-enhancers (eExons) could alter both protein function and transcription. However, the functional consequence of eExon mutations is not well known. Here, using massively parallel reporter assays, we dissect the enhancer activity of three liver eExons (SORL1 exon 17, TRAF3IP2 exon 2, PPARG exon 6) at single nucleotide resolution in the mouse liver. We find that both synonymous and non-synonymous mutations have similar effects on enhancer activity and many of the deleterious mutation clusters overlap known liver-associated transcription factor binding sites. Carrying a similar massively parallel reporter assay in HeLa cells with these three eExons found differences in their mutation profiles compared to the liver, suggesting that enhancers could have distinct operating profiles in different tissues. Our results demonstrate that eExon mutations could lead to multiple phenotypes by disrupting both the protein sequence and enhancer activity and that enhancers can have distinct mutation profiles in different cell types. PMID:25340400

  2. Systematic estimation of theoretical uncertainties in the calculation of the pion-photon transition form factor using light-cone sum rules

    NASA Astrophysics Data System (ADS)

    Mikhailov, S. V.; Pimikov, A. V.; Stefanis, N. G.

    2016-06-01

    We consider the calculation of the pion-photon transition form factor Fγ*γπ0(Q2) within light-cone sum rules focusing attention to the low-mid region of momenta. The central aim is to estimate the theoretical uncertainties which originate from a wide variety of sources related to (i) the relevance of next-to-next-to-leading order radiative corrections (ii) the influence of the twist-four and the twist-six term (iii) the sensitivity of the results on auxiliary parameters, like the Borel scale M2, (iv) the role of the phenomenological description of resonances, and (v) the significance of a small but finite virtuality of the quasireal photon. Predictions for Fγ*γπ0(Q2) are presented which include all these uncertainties and found to comply within the margin of experimental error with the existing data in the Q2 range between 1 and 5 GeV2 , thus justifying the reliability of the applied calculational scheme. This provides a solid basis for confronting theoretical predictions with forthcoming data bearing small statistical errors.

  3. Optimal test selection for prediction uncertainty reduction

    DOE PAGES

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

    Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
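    The constrained discrete optimization formulation described here can be illustrated with a toy enumeration; the per-test costs, budget, and uncertainty surrogate below are assumptions for illustration, not values or functions from the paper.

```python
import itertools

# Assumed per-test costs and budget (illustrative, not from the paper).
cost_cal, cost_val = 3.0, 5.0
budget = 30.0

def predicted_uncertainty(n_cal, n_val):
    # Assumed surrogate: prediction uncertainty shrinks with more tests
    # of each type, with validation tests weighted more heavily.
    return 1.0 / (1 + n_cal) + 2.0 / (1 + n_val)

# Enumerate feasible (calibration, validation) test counts under the
# budget and pick the mix minimizing the surrogate: a constrained
# discrete optimization over test selections.
best = min(
    ((n_cal, n_val)
     for n_cal, n_val in itertools.product(range(11), range(7))
     if n_cal * cost_cal + n_val * cost_val <= budget),
    key=lambda nv: predicted_uncertainty(*nv),
)
print(best)  # best (calibration, validation) mix under these assumptions
```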

  5. SU(2) uncertainty limits

    NASA Astrophysics Data System (ADS)

    Shabbir, Saroosh; Björk, Gunnar

    2016-05-01

    Although progress has been made recently in defining nontrivial uncertainty limits for the SU(2) group, a description of the intermediate states bound by these limits remains lacking. In this paper we enumerate possible uncertainty relations for the SU(2) group that involve all three observables and that are, moreover, invariant under SU(2) transformations. We demonstrate, however, that these relations, even taken as a group, do not provide sharp, saturable bounds. To find sharp bounds, we systematically calculate the variance of the SU(2) operators for all pure states belonging to the N = 2 and N = 3 polarization excitation manifolds (corresponding to spin 1 and spin 3/2). Lastly, and perhaps counter to expectation, we note that even pure states can reach the maximum uncertainty limit.

  6. A Low-Cost Environmental Monitoring System: How to Prevent Systematic Errors in the Design Phase through the Combined Use of Additive Manufacturing and Thermographic Techniques.

    PubMed

    Salamone, Francesco; Danza, Ludovico; Meroni, Italo; Pollastro, Maria Cristina

    2017-04-11

    nEMoS (nano Environmental Monitoring System) is a 3D-printed device built following the Do-It-Yourself (DIY) approach. It can be connected to the web and used to assess indoor environmental quality (IEQ). It is built from low-cost sensors connected to an Arduino microcontroller board. The device is assembled in a small case, and both the thermohygrometric sensors used to measure air temperature and relative humidity and the globe thermometer used to measure radiant temperature can be subject to thermal effects due to overheating of nearby components. A thermographic analysis was made to rule out this possibility. The paper shows how the pervasive technique of additive manufacturing can be combined with more traditional thermographic techniques to redesign the case and to verify the accuracy of the optimized system, in order to prevent instrumental systematic errors in terms of the difference between experimental and actual values of the above-mentioned environmental parameters.

  7. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    NASA Astrophysics Data System (ADS)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However, inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
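    As a rough illustration of the collocation idea (non-adaptive, and with an invented scalar model standing in for a flow simulation), the sketch below propagates an assumed Gaussian input uncertainty through the model with Gauss-Hermite quadrature rather than Monte Carlo sampling.

```python
import numpy as np

# Hypothetical scalar model standing in for a simulation output.
def model(p):
    return p ** 2 + 1.0

mu, sigma = 2.0, 0.1   # assumed mean and std of the uncertain input

# Probabilists' Gauss-Hermite rule: nodes/weights for the N(0,1) measure
# (weights sum to sqrt(2*pi), so normalize by that factor).
nodes, w = np.polynomial.hermite_e.hermegauss(5)
mean = float(np.sum(w * model(mu + sigma * nodes)) / np.sqrt(2.0 * np.pi))
print(mean)  # E[model(p)] for p ~ N(mu, sigma^2); exact value is 5.01
```

    Five collocation points reproduce the exact mean here because the toy model is polynomial; adaptive schemes refine the point set where the response is less smooth.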

  8. The challenges of uncertainty and interprofessional collaboration in palliative care for non-cancer patients in the community: A systematic review of views from patients, carers and health-care professionals

    PubMed Central

    Murtagh, Fliss EM

    2014-01-01

    Background: Primary care has the potential to play significant roles in providing effective palliative care for non-cancer patients. Aim: To identify, critically appraise and synthesise the existing evidence on views on the provision of palliative care for non-cancer patients by primary care providers and reveal any gaps in the evidence. Design: Standard systematic review and narrative synthesis. Data sources: MEDLINE, Embase, CINAHL, PsycINFO, Applied Social Science Abstract and the Cochrane library were searched in 2012. Reference searching, hand searching, expert consultations and grey literature searches complemented these. Papers with the views of patients/carers or professionals on primary palliative care provision to non-cancer patients in the community were included. The amended Hawker’s criteria were used for quality assessment of included studies. Results: A total of 30 studies were included and represent the views of 719 patients, 605 carers and over 400 professionals. In all, 27 studies are from the United Kingdom. Patients and carers expect primary care physicians to provide compassionate care, have appropriate knowledge and play central roles in providing care. The roles of professionals are unclear to patients, carers and professionals themselves. Uncertainty of illness trajectory and lack of collaboration between health-care professionals were identified as barriers to effective care. Conclusions: Effective interprofessional work to deal with uncertainty and maintain coordinated care is needed for better palliative care provision to non-cancer patients in the community. Research into and development of a best model for effective interdisciplinary work are needed. PMID:24821710

  9. Assessing uncertainty in physical constants

    NASA Astrophysics Data System (ADS)

    Henrion, Max; Fischhoff, Baruch

    1986-09-01

    Assessing the uncertainty due to possible systematic errors in a physical measurement unavoidably involves an element of subjective judgment. Examination of historical measurements and recommended values for the fundamental physical constants shows that the reported uncertainties have a consistent bias towards underestimating the actual errors. These findings are comparable to findings of persistent overconfidence in psychological research on the assessment of subjective probability distributions. Awareness of these biases could help in interpreting the precision of measurements, as well as provide a basis for improving the assessment of uncertainty in measurements.

  10. Exercise training alone or with the addition of activity counseling improves physical activity levels in COPD: a systematic review and meta-analysis of randomized controlled trials

    PubMed Central

    Lahham, Aroub; McDonald, Christine F; Holland, Anne E

    2016-01-01

    Background Physical inactivity is associated with poor outcomes in COPD, and as a result, interventions to improve physical activity (PA) are a current research focus. However, many trials have been small and inconclusive. Objective The aim of this systematic review and meta-analysis was to study the effects of randomized controlled trials (RCTs) targeting PA in COPD. Methods Databases (Physiotherapy Evidence Database [PEDro], Embase, MEDLINE, CINAHL and the Cochrane Central Register for Controlled Trials) were searched using the following keywords: “COPD”, “intervention” and “physical activity” from inception to May 20, 2016; published RCTs that aimed to increase PA in individuals with COPD were included. The PEDro scale was used to rate study quality. Standardized mean differences (effect sizes, ESs) with 95% confidence intervals (CIs) were determined. Effects of included interventions were also measured according to the minimal important difference (MID) in daily steps for COPD (599 daily steps). Results A total of 37 RCTs with 4,314 participants (mean forced expiratory volume in one second (FEV1) % predicted 50.5 [SD=10.4]) were identified. Interventions including exercise training (ET; n=3 studies, 103 participants) significantly increased PA levels in COPD compared to standard care (ES [95% CI]; 0.84 [0.44–1.25]). The addition of activity counseling to pulmonary rehabilitation (PR; n=4 studies, 140 participants) showed important effects on PA levels compared to PR alone (0.47 [0.02–0.92]), achieving significant increases that exceeded the MID for daily steps in COPD (mean difference [95% CI], 1,452 daily steps [549–2,356]). Reporting of methodological quality was poor in most included RCTs. Conclusion Interventions that included ET and PA counseling during PR were effective strategies to improve PA in COPD. PMID:27994451

  11. Simplified propagation of standard uncertainties

    SciTech Connect

    Shull, A.H.

    1997-06-09

    An essential part of any measurement control program is adequate knowledge of the uncertainties of the measurement system standards. Only with an estimate of the standards' uncertainties can one determine if the standard is adequate for its intended use or can one calculate the total uncertainty of the measurement process. Purchased standards usually have estimates of uncertainty on their certificates. However, when standards are prepared and characterized by a laboratory, variance propagation is required to estimate the uncertainty of the standard. Traditional variance propagation typically involves tedious use of partial derivatives, unfriendly software and the availability of statistical expertise. As a result, the uncertainty of prepared standards is often not determined or determined incorrectly. For situations meeting stated assumptions, easier shortcut methods of estimation are now available which eliminate the need for partial derivatives and require only a spreadsheet or calculator. A system of simplifying the calculations by dividing into subgroups of absolute and relative uncertainties is utilized. These methods also incorporate the International Standards Organization (ISO) concepts for combining systematic and random uncertainties as published in their Guide to the Expression of Measurement Uncertainty. Details of the simplified methods and examples of their use are included in the paper.
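    The subgroup shortcut described above can be sketched in a few lines; the standard's value and the individual uncertainty components below are hypothetical, and the final step is the usual GUM-style root-sum-of-squares combination rather than the paper's exact worksheet.

```python
import math

# Hypothetical prepared standard: 10 g with assumed uncertainty components.
mass_g = 10.000
rel_components = [0.0005, 0.0002]   # relative (multiplicative) terms, e.g. purity
abs_components_g = [0.001]          # absolute terms in grams, e.g. balance

# Combine each subgroup in quadrature (root-sum-of-squares), then merge:
# convert the relative subgroup to absolute units before the final combination.
rel_combined = math.sqrt(sum(u ** 2 for u in rel_components))
abs_combined = math.sqrt(sum(u ** 2 for u in abs_components_g))
u_total = math.sqrt((mass_g * rel_combined) ** 2 + abs_combined ** 2)
print(u_total)  # combined standard uncertainty of the prepared standard, in grams
```

    Grouping the relative terms separately is what removes the partial derivatives: for purely multiplicative factors, relative uncertainties combine in quadrature directly.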

  12. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  13. Uncertainty quantification in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Rizzi, Francesco

    This dissertation focuses on uncertainty quantification (UQ) in molecular dynamics (MD) simulations. The application of UQ to molecular dynamics is motivated by the broad uncertainty characterizing MD potential functions and by the complexity of the MD setting, where even small uncertainties can be amplified to yield large uncertainties in the model predictions. Two fundamental, distinct sources of uncertainty are investigated in this work, namely parametric uncertainty and intrinsic noise. Intrinsic noise is inherently present in the MD setting, due to fluctuations originating from thermal effects. Averaging methods can be exploited to reduce the fluctuations, but due to finite sampling, this effect cannot be completely filtered, thus yielding a residual uncertainty in the MD predictions. Parametric uncertainty, on the contrary, is introduced in the form of uncertain potential parameters, geometry, and/or boundary conditions. We address the UQ problem in both its main components, namely the forward propagation, which aims at characterizing how uncertainty in model parameters affects selected observables, and the inverse problem, which involves the estimation of target model parameters based on a set of observations. The dissertation highlights the challenges arising when parametric uncertainty and intrinsic noise combine to yield non-deterministic, noisy MD predictions of target macroscale observables. Two key probabilistic UQ methods, namely Polynomial Chaos (PC) expansions and Bayesian inference, are exploited to develop a framework that enables one to isolate the impact of parametric uncertainty on the MD predictions and, at the same time, properly quantify the effect of the intrinsic noise. Systematic applications to a suite of problems of increasing complexity lead to the observation that an uncertain PC representation built via Bayesian regression is the most suitable model for the representation of uncertain MD predictions of target observables in the
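    A minimal sketch of the forward-propagation side discussed above, with a toy one-parameter response plus synthetic intrinsic noise standing in for an MD observable (all values invented): fit a polynomial chaos surrogate in probabilists' Hermite polynomials by least squares.

```python
import numpy as np

# Toy stand-in for an MD observable: a smooth response of one uncertain
# parameter plus intrinsic noise (both invented for the example).
rng = np.random.default_rng(1)
xi = rng.standard_normal(2000)                       # standard-normal germ
y = np.sin(0.5 * xi) + 0.05 * rng.standard_normal(xi.size)

# Least-squares fit of a degree-5 probabilists' Hermite (PC) surrogate.
V = np.polynomial.hermite_e.hermevander(xi, 5)
coef, *_ = np.linalg.lstsq(V, y, rcond=None)

# Orthogonality of He_k under N(0,1) makes the surrogate's mean prediction
# simply the zeroth coefficient.
print(coef[0])
```

    The spread of the residual y minus the surrogate then estimates the intrinsic-noise contribution, which is the separation the dissertation formalizes with Bayesian regression.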

  14. Exploring Uncertainty with Projectile Launchers

    ERIC Educational Resources Information Center

    Orzel, Chad; Reich, Gary; Marr, Jonathan

    2012-01-01

    The proper choice of a measurement technique that minimizes systematic and random uncertainty is an essential part of experimental physics. These issues are difficult to teach in the introductory laboratory, though. Because most experiments involve only a single measurement technique, students are often unable to make a clear distinction between…

  15. Uncertainty quantification and error analysis

    SciTech Connect

    Higdon, Dave M; Anderson, Mark C; Habib, Salman; Klein, Richard; Berliner, Mark; Covey, Curt; Ghattas, Omar; Graziani, Carlo; Seager, Mark; Sefcik, Joseph; Stark, Philip

    2010-01-01

    UQ studies all sources of error and uncertainty, including: systematic and stochastic measurement error; ignorance; limitations of theoretical models; limitations of numerical representations of those models; limitations on the accuracy and reliability of computations, approximations, and algorithms; and human error. A more precise definition for UQ is suggested below.

  16. Strategy under uncertainty.

    PubMed

    Courtney, H; Kirkland, J; Viguerie, P

    1997-01-01

    At the heart of the traditional approach to strategy lies the assumption that by applying a set of powerful analytic tools, executives can predict the future of any business accurately enough to allow them to choose a clear strategic direction. But what happens when the environment is so uncertain that no amount of analysis will allow us to predict the future? What makes for a good strategy in highly uncertain business environments? The authors, consultants at McKinsey & Company, argue that uncertainty requires a new way of thinking about strategy. All too often, they say, executives take a binary view: either they underestimate uncertainty to come up with the forecasts required by their companies' planning or capital-budging processes, or they overestimate it, abandon all analysis, and go with their gut instinct. The authors outline a new approach that begins by making a crucial distinction among four discrete levels of uncertainty that any company might face. They then explain how a set of generic strategies--shaping the market, adapting to it, or reserving the right to play at a later time--can be used in each of the four levels. And they illustrate how these strategies can be implemented through a combination of three basic types of actions: big bets, options, and no-regrets moves. The framework can help managers determine which analytic tools can inform decision making under uncertainty--and which cannot. At a broader level, it offers executives a discipline for thinking rigorously and systematically about uncertainty and its implications for strategy.

  17. Uncertainty quantification of effective nuclear interactions

    SciTech Connect

    Pérez, R. Navarro; Amaro, J. E.; Arriola, E. Ruiz

    2016-03-02

    We give a brief review on the development of phenomenological NN interactions and the corresponding quantification of statistical uncertainties. We look into the uncertainty of effective interactions broadly used in mean field calculations through the Skyrme parameters and effective field theory counter-terms by estimating both statistical and systematic uncertainties stemming from the NN interaction. We also comment on the role played by different fitting strategies in the light of recent developments.

  18. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
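    The closing claim about linear models can be made concrete with a small sketch (matrix, data, and output functional all invented): for A x = b and a scalar output J = c^T x, one adjoint solve A^T lam = c yields the full sensitivity dJ/db = lam, with no perturbation runs needed.

```python
import numpy as np

# Linear model A x = b with a scalar output J = c^T x (all values invented).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
c = np.array([1.0, 1.0])

x = np.linalg.solve(A, b)
lam = np.linalg.solve(A.T, c)   # one adjoint solve gives dJ/db = lam

# Check the first component against a direct finite-difference perturbation.
eps = 1e-6
fd = (c @ np.linalg.solve(A, b + np.array([eps, 0.0])) - c @ x) / eps
print(lam, fd)
```

    One adjoint solve replaces one perturbed forward solve per input parameter, which is why the adjoint route scales so well for programs with many inputs.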

  19. Modeling Errors in Daily Precipitation Measurements: Additive or Multiplicative?

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Huffman, George J.; Adler, Robert F.; Tang, Ling; Sapiano, Matthew; Maggioni, Viviana; Wu, Huan

    2013-01-01

    The definition and quantification of uncertainty depend on the error model used. For uncertainties in precipitation measurements, two types of error models have been widely adopted: the additive error model and the multiplicative error model. This leads to incompatible specifications of uncertainties and impedes intercomparison and application. In this letter, we assess the suitability of both models for satellite-based daily precipitation measurements in an effort to clarify the uncertainty representation. Three criteria were employed to evaluate the applicability of either model: (1) better separation of the systematic and random errors; (2) applicability to the large range of variability in daily precipitation; and (3) better predictive skills. It is found that the multiplicative error model is a much better choice under all three criteria. It extracted the systematic errors more cleanly, was more consistent with the large variability of precipitation measurements, and produced superior predictions of the error characteristics. The additive error model had several weaknesses, such as nonconstant variance resulting from systematic errors leaking into random errors, and the lack of prediction capability. Therefore, the multiplicative error model is a better choice.
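    The distinction between the two error models can be sketched with synthetic data (the bias, noise level, and rain distribution below are all assumptions, not the paper's dataset): the additive model treats error as y = x + e, while the multiplicative model treats it as y = x * e, i.e. additive in log space.

```python
import numpy as np

# Synthetic "truth" and a multiplicative measurement process
# (bias of 0.8 times lognormal noise); all values are invented.
rng = np.random.default_rng(0)
truth = rng.gamma(shape=2.0, scale=5.0, size=10_000)
measured = truth * 0.8 * rng.lognormal(mean=0.0, sigma=0.3, size=truth.size)

add_residual = measured - truth                    # additive error model
mult_residual = np.log(measured) - np.log(truth)   # multiplicative (log-space)

# In log space the systematic part is the clean constant log(0.8) and the
# random part has near-constant spread (~0.3); the additive residual's
# variance instead grows with precipitation intensity.
print(mult_residual.mean(), mult_residual.std())
```

    When the measurement process really is multiplicative, as here, the log-space residuals recover both error components cleanly, mirroring the letter's first criterion.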

  20. Uncertainties in offsite consequence analysis

    SciTech Connect

    Young, M.L.; Harper, F.T.; Lui, C.H.

    1996-03-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the U.S. Nuclear Regulatory Commission and the European Commission began co-sponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables using a formal expert judgment elicitation and evaluation process. This paper focuses on the methods used in and results of this ongoing joint effort.

  1. The need for annual echocardiography to detect cabergoline-associated valvulopathy in patients with prolactinoma: a systematic review and additional clinical data.

    PubMed

    Caputo, Carmela; Prior, David; Inder, Warrick J

    2015-11-01

    Present recommendations by the US Food and Drug Administration advise that patients with prolactinoma treated with cabergoline should have an annual echocardiogram to screen for valvular heart disease. Here, we present new clinical data and a systematic review of the scientific literature showing that the prevalence of cabergoline-associated valvulopathy is very low. We prospectively assessed 40 patients with prolactinoma taking cabergoline. Cardiovascular examination before echocardiography detected an audible systolic murmur in 10% of cases (all were functional murmurs), and no clinically significant valvular lesion was shown on echocardiogram in the 90% of patients without a murmur. Our systematic review identified 21 studies that assessed the presence of valvular abnormalities in patients with prolactinoma treated with cabergoline. Including our new clinical data, only two (0.11%) of 1811 patients were confirmed to have cabergoline-associated valvulopathy (three [0.17%] if possible cases were included). The probability of clinically significant valvular heart disease is low in the absence of a murmur. On the basis of these findings, we challenge the present recommendations to do routine echocardiography in all patients taking cabergoline for prolactinoma every 12 months. We propose that such patients should be screened by a clinical cardiovascular examination and that echocardiogram should be reserved for those patients with an audible murmur, those treated for more than 5 years at a dose of more than 3 mg per week, or those who maintain cabergoline treatment after the age of 50 years.

  2. Orbital State Uncertainty Realism

    NASA Astrophysics Data System (ADS)

    Horwood, J.; Poore, A. B.

    2012-09-01

    Fundamental to the success of the space situational awareness (SSA) mission is the rigorous inclusion of uncertainty in the space surveillance network. The proper characterization of uncertainty in the orbital state of a space object is a common requirement for many SSA functions including tracking and data association, resolution of uncorrelated tracks (UCTs), conjunction analysis and probability of collision, sensor resource management, and anomaly detection. While tracking environments, such as air and missile defense, make extensive use of Gaussian and local linearity assumptions within algorithms for uncertainty management, space surveillance is inherently different due to long time gaps between updates, high misdetection rates, nonlinear and non-conservative dynamics, and non-Gaussian phenomena. The latter implies that "covariance realism" is not always sufficient. SSA also requires "uncertainty realism": the proper characterization of the state, the covariance, and all non-zero higher-order cumulants. In other words, a proper characterization of a space object's full state probability density function (PDF) is required. In order to provide a more statistically rigorous treatment of uncertainty in the space surveillance tracking environment and to better support the aforementioned SSA functions, a new class of multivariate PDFs is formulated which more accurately characterizes the uncertainty of a space object's state or orbit. The new distribution contains a parameter set controlling the higher-order cumulants, which gives the level sets a distinctive "banana" or "boomerang" shape and degenerates to a Gaussian in a suitable limit. Using the new class of PDFs within the general Bayesian nonlinear filter, the resulting filter prediction step (i.e., uncertainty propagation) is shown to have the same computational cost as the traditional unscented Kalman filter, with the former able to maintain a proper characterization of the uncertainty for up to ten

  3. Visualizing Flow of Uncertainty through Analytical Processes.

    PubMed

    Wu, Yingcai; Yuan, Guo-Xun; Ma, Kwan-Liu

    2012-12-01

    Uncertainty can arise in any stage of a visual analytics process, especially in data-intensive applications with a sequence of data transformations. Additionally, throughout the process of multidimensional, multivariate data analysis, uncertainty due to data transformation and integration may split, merge, increase, or decrease. This dynamic characteristic along with other features of uncertainty pose a great challenge to effective uncertainty-aware visualization. This paper presents a new framework for modeling uncertainty and characterizing the evolution of the uncertainty information through analytical processes. Based on the framework, we have designed a visual metaphor called uncertainty flow to visually and intuitively summarize how uncertainty information propagates over the whole analysis pipeline. Our system allows analysts to interact with and analyze the uncertainty information at different levels of detail. Three experiments were conducted to demonstrate the effectiveness and intuitiveness of our design.

  4. Improved Event Location Uncertainty Estimates

    DTIC Science & Technology

    2008-06-30

    ...rather than a model (such as Gaussian, spherical, or exponential) typically used in geostatistics, we define the robust variogram model as the median regression curve of the residual difference squares for station pairs... We develop methodologies that improve location uncertainties in the presence of correlated, systematic model errors and non-Gaussian measurement errors.

  5. Efficacy of additional psychosocial intervention in reducing low birth weight and preterm birth in teenage pregnancy: A systematic review and meta-analysis.

    PubMed

    Sukhato, Kanokporn; Wongrathanandha, Chathaya; Thakkinstian, Ammarin; Dellow, Alan; Horsuwansak, Pornpot; Anothaisintawee, Thunyarat

    2015-10-01

    This systematic review aimed to assess the efficacy of psychosocial interventions in reducing risk of low birth weight (LBW) and preterm birth (PTB) in teenage pregnancy. Relevant studies were identified from the Medline, Scopus, CINAHL, and CENTRAL databases. Randomized controlled trials investigating the effect of psychosocial interventions on risk of LBW and PTB, compared with routine antenatal care (ANC), were eligible. Relative risks (RR) of LBW and PTB were pooled using the inverse variance method. Mean differences of birth weight (BW) between intervention and control groups were pooled using the unstandardized mean difference (USMD). Five studies were included in the review. Compared with routine ANC, psychosocial interventions significantly reduced risk of LBW by 40% (95% CI: 8%, 62%) but not PTB (pooled RR = 0.67, 95% CI: 0.42, 1.05). Mean BW of the intervention group was significantly higher than that of the control group, with a USMD of 200.63 g (95% CI: 21.02, 380.25). Results of our study suggest that psychosocial interventions significantly reduced risk of LBW in teenage pregnancy.
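    The fixed-effect inverse-variance pooling of relative risks used in such meta-analyses can be sketched as follows; the three-study input is hypothetical, not the studies of this review. Each study's standard error is recovered from its 95% CI width on the log scale, and studies are weighted by inverse variance.

```python
import math

def pooled_rr(rrs, cis, z=1.96):
    """Fixed-effect inverse-variance pooling of relative risks.

    rrs: point estimates; cis: (lower, upper) 95% CI bounds per study.
    All arithmetic is on the log scale, weighting each study by 1/SE^2.
    """
    log_rrs = [math.log(r) for r in rrs]
    # Back out each study's standard error from its 95% CI width
    ses = [(math.log(u) - math.log(l)) / (2 * z) for l, u in cis]
    ws = [1.0 / se**2 for se in ses]
    pooled_log = sum(w * lr for w, lr in zip(ws, log_rrs)) / sum(ws)
    pooled_se = (1.0 / sum(ws)) ** 0.5
    rr = math.exp(pooled_log)
    ci = (math.exp(pooled_log - z * pooled_se),
          math.exp(pooled_log + z * pooled_se))
    return rr, ci

# Hypothetical three-study example (invented numbers)
rr, ci = pooled_rr([0.55, 0.70, 0.62],
                   [(0.35, 0.86), (0.45, 1.09), (0.40, 0.96)])
```

    With these invented inputs the pooled RR lands around 0.6 with a CI excluding 1, i.e. a "significant reduction" in the sense reported above.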

  6. Quantifying Mixed Uncertainties in Cyber Attacker Payoffs

    SciTech Connect

    Chatterjee, Samrat; Halappanavar, Mahantesh; Tipireddy, Ramakrishna; Oster, Matthew R.; Saha, Sudip

    2015-04-15

    Representation and propagation of uncertainty in cyber attacker payoffs is a key aspect of security games. Past research has primarily focused on representing the defender’s beliefs about attacker payoffs as point utility estimates. More recently, within the physical security domain, attacker payoff uncertainties have been represented as Uniform and Gaussian probability distributions, and intervals. Within cyber-settings, continuous probability distributions may still be appropriate for addressing statistical (aleatory) uncertainties where the defender may assume that the attacker’s payoffs differ over time. However, systematic (epistemic) uncertainties may exist, where the defender may not have sufficient knowledge or there is insufficient information about the attacker’s payoff generation mechanism. Such epistemic uncertainties are more suitably represented as probability boxes with intervals. In this study, we explore the mathematical treatment of such mixed payoff uncertainties.
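    A probability box of the kind described, epistemic uncertainty as an interval over distribution parameters, can be sketched by taking pointwise envelopes of empirical CDFs over the epistemic alternatives (the payoff distributions and numbers below are illustrative assumptions):

```python
import numpy as np

def payoff_pbox(samples_by_alternative, grid):
    """Probability box for an attacker payoff: compute an empirical CDF
    for each epistemic alternative, then take pointwise lower/upper
    envelopes over the alternatives."""
    cdfs = []
    for s in samples_by_alternative:
        s = np.sort(np.asarray(s))
        cdfs.append(np.searchsorted(s, grid, side="right") / len(s))
    cdfs = np.array(cdfs)
    return cdfs.min(axis=0), cdfs.max(axis=0)

rng = np.random.default_rng(1)
# Epistemic uncertainty: payoff mean known only to lie in [4, 6];
# aleatory uncertainty: Gaussian scatter around that mean
alternatives = [rng.normal(mu, 1.0, 5000) for mu in (4.0, 5.0, 6.0)]
grid = np.linspace(0.0, 10.0, 101)
lower, upper = payoff_pbox(alternatives, grid)
```

    The gap between the lower and upper CDF bounds encodes the epistemic component; a point distribution would collapse the box to a single curve.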

  7. Uncertainty Estimation in Intensity-Modulated Radiotherapy Absolute Dosimetry Verification

    SciTech Connect

    Sanchez-Doblado, Francisco (E-mail: paco@us.es); Hartmann, Guenther H.; Pena, Javier; Capote, Roberto; Paiusco, Marta; Rhein, Bernhard; Leal, Antonio; Lagares, Juan Ignacio

    2007-05-01

    Purpose: Intensity-modulated radiotherapy (IMRT) represents an important method for improving RT. The IMRT relative dosimetry checks are well established; however, open questions remain in reference dosimetry with ionization chambers (ICs). The main problem is the departure of the measurement conditions from the reference ones; thus, additional uncertainty is introduced into the dose determination. The goal of this study was to assess this effect systematically. Methods and Materials: Monte Carlo calculations and dosimetric measurements with five different detectors were performed for a number of representative IMRT cases, covering both step-and-shoot and dynamic delivery. Results: Using ICs with volumes of about 0.125 cm³ or less, good agreement was observed among the detectors in most of the situations studied. These results also agreed well with the Monte Carlo-calculated nonreference correction factors (c factors). Additionally, we found a general correlation between the IC position relative to a segment and the derived correction factor c, which can be used to estimate the expected overall uncertainty of the treatment. Conclusion: The increase of the reference dose relative standard uncertainty measured with ICs introduced by nonreference conditions when verifying an entire IMRT plan is about 1-1.5%, provided that appropriate small-volume chambers are used. The overall standard uncertainty of the measured IMRT dose amounts to about 2.3%, including the 0.5% of reproducibility and 1.5% of uncertainty associated with the beam calibration factor. Solid state detectors and large-volume chambers are not well suited to IMRT verification dosimetry because of the greater uncertainties. An action level of 5% is appropriate for IMRT verification. Greater discrepancies should lead to a review of the dosimetric procedure, including visual inspection of treatment segments and energy fluence.

  8. Calibration procedure for a laser triangulation scanner with uncertainty evaluation

    NASA Astrophysics Data System (ADS)

    Genta, Gianfranco; Minetola, Paolo; Barbato, Giulio

    2016-11-01

    Most low-cost 3D scanning devices available on the market today are sold without a user calibration procedure to correct measurement errors related to changes in environmental conditions. In addition, there is no specific international standard defining a procedure to check the performance of a 3D scanner over time. This paper details a thorough methodology to calibrate a 3D scanner and assess its measurement uncertainty. The proposed procedure is based on the use of a reference ball plate and is applied to a triangulation laser scanner. Experimental results show that the metrological performance of the instrument can be greatly improved by applying the calibration procedure, which corrects systematic errors and reduces the device's measurement uncertainty.

  9. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    NASA Technical Reports Server (NTRS)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design versus a design of well understood heritage equipment would be greater. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  11. Additions and corrections to the systematics of mayfly species assigned to the genus Callibaetis Eaton 1881 (Ephemeroptera: Baetidae) from South America.

    PubMed

    Cruz, Paulo Vilela; Salles, Frederico Falcão; Hamada, Neusa

    2017-02-13

    Due to historical taxonomic impediments, species of Callibaetis Eaton are difficult to identify. Recent studies have attempted to resolve this problem, although many species still lack complete descriptions; nymphs of several species remain undetermined; and type specimens are lost or poorly known. Given these hindrances, the aim of this study is to review some of the type specimens of Callibaetis from South America. This review provides a series of taxonomic additions and corrections supported by improved morphological evaluations, illustrations and photographs of Callibaetis camposi Navás, C. (Abaetetuba) capixaba Cruz, Salles & Hamada, C. gregarius Navás, C. (C.) guttatus Navás, C. jaffueli Navás, C. (C.) jocosus Navás, C. nigrivenosus Banks, C. (A.) pollens Needham & Murphy, C. (C.) radiatus Navás, C. (A.) sellacki (Weyenbergh), C. stictogaster Navás, C. (C.) viviparus Needham & Murphy, C. (C.) willineri Navás, and C. (C.) zonalis Navás. From among these species, C. stictogaster and C. jaffueli are revalidated; C. nigrivenosus and C. gregarius are designated as nomina dubia; C. (C.) fluminensis Cruz, Salles & Hamada is proposed as a junior subjective synonym of C. (C.) zonalis; and C. gloriosus Navás is proposed as a junior subjective synonym of C. (A.) sellacki (Weyenbergh). Lectotypes are designated for C. camposi, C. jaffueli, C. (C.) radiatus and C. stictogaster.

  12. The uncertainties in estimating measurement uncertainties

    SciTech Connect

    Clark, J.P.; Shull, A.H.

    1994-07-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs, or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper discusses the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from the calibration standards' matrix). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report and recommendations made for improving measurement uncertainties.

  13. Quantifying uncertainty in LCA-modelling of waste management systems

    SciTech Connect

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H.

    2012-12-15

    Highlights: • Uncertainty in LCA-modelling of waste management is significant. • Model, scenario and parameter uncertainties contribute. • A sequential procedure for quantifying uncertainty is proposed. • Application of the procedure is illustrated by a case-study. - Abstract: Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties; (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results; (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty; and (Step 4), as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
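    Two of the steps in such a sequence, Monte Carlo uncertainty propagation and a contribution analysis, can be sketched on a toy waste-LCA model; the model, parameter names, distributions, and numbers below are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy waste-LCA model (hypothetical): net GWP = incineration emissions
# minus credits for recovered energy and recycled metals, per kg waste.
def gwp(waste, ef_incin, energy_rec, ef_energy, metal_rec, ef_metal):
    return waste * (ef_incin - energy_rec * ef_energy - metal_rec * ef_metal)

# Uncertainty propagation: sample each input from its assumed distribution
n = 20000
params = {
    "ef_incin":   rng.normal(0.40, 0.04, n),   # kg CO2e / kg waste
    "energy_rec": rng.uniform(0.15, 0.25, n),  # kWh / kg waste
    "ef_energy":  rng.normal(0.50, 0.10, n),   # kg CO2e / kWh avoided
    "metal_rec":  rng.uniform(0.02, 0.04, n),  # kg metal / kg waste
    "ef_metal":   rng.normal(2.00, 0.30, n),   # kg CO2e / kg metal avoided
}
out = gwp(1.0, **params)

# Contribution analysis: squared rank correlation of each input with output
def contribution(x, y):
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1] ** 2

contrib = {k: contribution(v, out) for k, v in params.items()}
```

    With these assumed spreads the incineration emission factor dominates the output uncertainty, which is the kind of ranking the contribution step is meant to expose.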

  14. Simple uncertainty propagation for early design phase aircraft sizing

    NASA Astrophysics Data System (ADS)

    Lenz, Annelise

    Many designers and systems analysts are aware of the uncertainty inherent in their aircraft sizing studies; however, few incorporate methods to address and quantify this uncertainty. Many aircraft design studies use semi-empirical predictors based on a historical database and contain uncertainty -- a portion of which can be measured and quantified. In cases where historical information is not available, surrogate models built from higher-fidelity analyses often provide predictors for design studies where the computational cost of directly using the high-fidelity analyses is prohibitive. These surrogate models contain uncertainty, some of which is quantifiable. However, rather than quantifying this uncertainty, many designers merely include a safety factor or design margin in the constraints to account for the variability between the predicted and actual results. This can become problematic if a designer does not estimate the amount of variability correctly, which then can result in either an "over-designed" or "under-designed" aircraft. "Under-designed" and some "over-designed" aircraft will likely require design changes late in the process and will ultimately require more time and money to create; other "over-designed" aircraft concepts may not require design changes, but could end up being more costly than necessary. Including and propagating uncertainty early in the design phase so designers can quantify some of the errors in the predictors could help mitigate the extent of this additional cost. The method proposed here seeks to provide a systematic approach for characterizing a portion of the uncertainties that designers are aware of and propagating it throughout the design process in a procedure that is easy to understand and implement. 
Using Monte Carlo simulations that sample from quantified distributions will allow a systems analyst to use a carpet plot-like approach to make statements like: "The aircraft is 'P'% likely to weigh 'X' lbs or less, given the
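    A minimal sketch of such a Monte Carlo percentile statement, assuming a hypothetical power-law empty-weight predictor with a quantified relative error term (the coefficients below are invented for illustration, not from any sizing database):

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical empty-weight predictor with quantified regression error:
# W_empty = a * W_gross**b * (1 + eps), eps ~ N(0, sigma^2)
a, b, sigma = 1.2, 0.947, 0.05          # assumed, not from a real database
w_gross = 60000.0                        # gross weight (lbs)

eps = rng.normal(0.0, sigma, 100000)
w_empty = a * w_gross**b * (1.0 + eps)

# "The aircraft is P% likely to weigh X lbs or less":
p = 90
x_lbs = float(np.percentile(w_empty, p))
```

    Sweeping p (or the design variables) over a grid of such runs is what would populate the carpet-plot-like view described above.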

  15. Finite Frames and Graph Theoretic Uncertainty Principles

    NASA Astrophysics Data System (ADS)

    Koprowski, Paul J.

    The subject of analytical uncertainty principles is an important field within harmonic analysis, quantum physics, and electrical engineering. We explore uncertainty principles in the context of the graph Fourier transform, and we prove additive results analogous to the multiplicative version of the classical uncertainty principle. We establish additive uncertainty principles for finite Parseval frames. Lastly, we examine the feasibility region of simultaneous values of the norms of a graph differential operator acting on a function f ∈ ℓ²(G) and its graph Fourier transform.

  16. Generalized uncertainty relations

    NASA Astrophysics Data System (ADS)

    Herdegen, Andrzej; Ziobro, Piotr

    2017-04-01

    The standard uncertainty relations (UR) in quantum mechanics are typically used for unbounded operators (like the canonical pair). This implies the need to control domain problems. On the other hand, the use of (possibly bounded) functions of basic observables usually leads to more complex and less readily interpretable relations. In addition, UR may turn trivial for certain states if the commutator of observables is not proportional to a positive operator. In this letter we consider a generalization of standard UR resulting from the use of two, instead of one, vector states. The possibility to link these states to each other in various ways adds flexibility to UR, which may compensate for some of the above-mentioned drawbacks. We discuss applications of the general scheme, leading not only to technical improvements, but also to interesting new insight.

  17. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  18. Auditory Localisation Biases Increase with Sensory Uncertainty

    PubMed Central

    Garcia, Sara E.; Jones, Pete R.; Rubin, Gary S.; Nardini, Marko

    2017-01-01

    Psychophysical studies have frequently found that adults with normal hearing exhibit systematic errors (biases) in their auditory localisation judgments. Here we tested (i) whether systematic localisation errors could reflect reliance on prior knowledge, as has been proposed for other systematic perceptual biases, and (ii) whether auditory localisation biases can be reduced following training with accurate visual feedback. Twenty-four normal hearing participants were asked to localise the position of a noise burst along the azimuth before, during, and after training with visual feedback. Consistent with reliance on prior knowledge to reduce sensory uncertainty, we found that auditory localisation biases increased when auditory localisation uncertainty increased. Specifically, participants mis-localised auditory stimuli as being more eccentric than they were, and did so more when auditory uncertainty was greater. However, biases also increased with eccentricity, despite no corresponding increase in uncertainty, which is not readily explained by use of a simple prior favouring peripheral locations. Localisation biases decreased (improved) following training with visual feedback, but the reliability of the visual feedback stimulus did not change the effects of training. We suggest that further research is needed to identify alternative mechanisms, besides use of prior knowledge, that could account for increased perceptual biases under sensory uncertainty. PMID:28074913

  19. A Strategy for Uncertainty Visualization Design

    DTIC Science & Technology

    2009-10-01

    ...gives the design a clearer understanding. As an example application of the UVDS, it is applied to current research regarding uncertainty visualization for the Canadian... (TMs) aimed at creating foundational documents on the topic of uncertainty visualization which can be used in defence applications. In addition, the...

  20. Do uncertainty analyses reveal uncertainties? Using the introduction of DNA vaccines to aquaculture as a case.

    PubMed

    Gillund, Frøydis; Kjølberg, Kamilla A; von Krauss, Martin Krayer; Myhr, Anne I

    2008-12-15

    The Walker and Harremoës (W&H) uncertainty framework is a tool to systematically identify scientific uncertainty. We applied the W&H uncertainty framework to elicit scientists' judgements of potential sources of uncertainty associated with the use of DNA vaccination in aquaculture. DNA vaccination is considered a promising solution to combat pathological fish diseases. There is, however, a lack of knowledge regarding its ecological and social implications. Our findings indicate that scientists are open about and aware of a number of uncertainties associated with DNA vaccination, e.g. with regard to immune response, and degradation and distribution of the DNA plasmid after injection and environmental release, and consider most of these uncertainties to be adequately reduced through more research. We proceed to discuss our experience of using the W&H uncertainty framework. Some challenges related to the application of the framework were recognised, especially the respondents' unfamiliarity with the concepts used and their lack of experience in discussing qualitative aspects of uncertainties. As we see it, the W&H framework should be considered a useful tool to stimulate reflection on uncertainty and an important first step in a more extensive process of including and properly dealing with uncertainties in science and policymaking.

  1. An Approach of Uncertainty Evaluation for Thermal-Hydraulic Analysis

    SciTech Connect

    Katsunori Ogura; Hisashi Ninokata

    2002-07-01

    An approach to evaluate uncertainty systematically for thermal-hydraulic analysis programs is demonstrated. The approach is applied to the Peach Bottom Unit 2 Turbine Trip 2 Benchmark and is validated. (authors)

  2. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many shield thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield-thickness spatial grids.
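    The ray-count convergence idea can be sketched with a toy slab-shield model (the attenuation model, numbers, and cosine-weighted direction sampling are all assumptions of this illustration): the spread of repeated Monte Carlo estimates shrinks roughly as 1/√N as the number of rays grows, which is the quantity a cost-benefit analysis would trade against run time.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy shielding model (assumed): a slab of areal density t0 attenuates a
# ray at polar angle theta by exp(-t0 / (lam * cos(theta))).
t0, lam = 10.0, 30.0   # g/cm^2; hypothetical attenuation length

def transmitted_fraction(n_rays, rng):
    """Monte Carlo estimate over cosine-weighted hemisphere directions."""
    mu = np.sqrt(1.0 - rng.random(n_rays))    # cos(theta) in (0, 1]
    return float(np.mean(np.exp(-t0 / (lam * mu))))

# Convergence study: repeat each ray count and measure estimator spread
spread = {}
for n_rays in (100, 1000, 10000):
    estimates = [transmitted_fraction(n_rays, rng) for _ in range(50)]
    spread[n_rays] = float(np.std(estimates))
```

    Plotting spread against n_rays (or its cost) gives exactly the kind of uncertainty-versus-expense curve the cost-benefit analysis needs.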

  3. Identifying uncertainties in Arctic climate change projections

    NASA Astrophysics Data System (ADS)

    Hodson, Daniel L. R.; Keeley, Sarah P. E.; West, Alex; Ridley, Jeff; Hawkins, Ed; Hewitt, Helene T.

    2013-06-01

    Wide-ranging climate changes are expected in the Arctic by the end of the 21st century, but projections of the size of these changes vary widely across current global climate models. This variation represents a large source of uncertainty in our understanding of the evolution of Arctic climate. Here we systematically quantify and assess the model uncertainty in Arctic climate changes in two CO2 doubling experiments: a multimodel ensemble (CMIP3) and an ensemble constructed using a single model (HadCM3) with multiple parameter perturbations (THC-QUMP). These two ensembles allow us to assess the contribution that both structural and parameter variations across models make to the total uncertainty and to begin to attribute sources of uncertainty in projected changes. We find that parameter uncertainty is a major source of uncertainty in certain aspects of Arctic climate, but also that uncertainties in the mean climate state in the 20th century, most notably in the northward Atlantic ocean heat transport and Arctic sea ice volume, are a significant source of uncertainty for projections of future Arctic change. We suggest that better observational constraints on these quantities will lead to significant improvements in the precision of projections of future Arctic climate change.

  4. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface, as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and the typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of the total uncertainty in calculated DRF and identification of the properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%), depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing the measurement uncertainty for any property would increase the accuracy of DRF. Comparison of two radiative transfer models suggests that the contribution of modeling error is small compared to the total uncertainty, although comparable to the uncertainty arising from some individual properties.
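The propagation scheme this record describes, per-property uncertainty as sensitivity times measurement uncertainty, combined in quadrature under an independence assumption, can be sketched as follows. All sensitivity and uncertainty values below are illustrative placeholders, not the paper's numbers:

```python
import math

# Illustrative sensitivities: change in diurnally averaged DRF (W m^-2)
# per unit change in each aerosol/surface property (values are made up).
sensitivity = {
    "aerosol_optical_depth": 3.0,
    "single_scattering_albedo": 8.0,
    "asymmetry_parameter": 2.0,
    "surface_albedo": 1.5,
}

# Illustrative 1-sigma measurement uncertainties for each property.
meas_uncertainty = {
    "aerosol_optical_depth": 0.01,
    "single_scattering_albedo": 0.03,
    "asymmetry_parameter": 0.02,
    "surface_albedo": 0.02,
}

# Per-property DRF uncertainty = sensitivity * measurement uncertainty.
contrib = {k: sensitivity[k] * meas_uncertainty[k] for k in sensitivity}

# Total uncertainty, assuming independent errors (quadrature sum).
total = math.sqrt(sum(v ** 2 for v in contrib.values()))

largest = max(contrib, key=contrib.get)
print(f"largest contributor: {largest}")
print(f"total DRF uncertainty: {total:.3f} W m^-2")
```

With these made-up inputs the single scattering albedo dominates the quadrature sum, mirroring the abstract's conclusion that it is usually the largest contributor.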

  5. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    PubMed

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
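The combination of purity, mass-measurement and solvent-addition uncertainties described in this record is typically done GUM-style, by summing relative standard uncertainties in quadrature and applying a coverage factor. A minimal sketch with made-up numbers (the uncertainty magnitudes and concentration are illustrative only):

```python
import math

# Illustrative relative standard uncertainties (as fractions) for a
# solution standard prepared from a neat material; values are made up.
u_purity = 0.002   # neat-material purity (residual water/solvent/inorganics)
u_mass   = 0.0005  # mass measurement (balance, weighing technique)
u_volume = 0.001   # solvent addition (solution density, volumetric ware)

# Combined relative standard uncertainty, assuming independent sources.
u_combined = math.sqrt(u_purity ** 2 + u_mass ** 2 + u_volume ** 2)

# Expanded uncertainty at ~95% confidence (coverage factor k = 2), as is
# common on Certificates of Analysis.
U_expanded = 2 * u_combined

conc = 1.000  # certified concentration in mg/mL (illustrative)
print(f"c = {conc:.3f} +/- {conc * U_expanded:.4f} mg/mL (k=2)")
```

A vendor certificate that reports only one of these components, or omits the coverage factor, will understate the true uncertainty, which is exactly why the abstract stresses understanding which factors the vendor included.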

  6. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 3 2012-01-01 2012-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  7. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 3 2014-01-01 2014-01-01 false Uncertainty analyses. 436.24 Section 436.24 Energy... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... by conducting additional analyses using any standard engineering economics method such as...

  8. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  9. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  10. 10 CFR 436.24 - Uncertainty analyses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures for Life Cycle Cost Analyses § 436.24 Uncertainty analyses. If particular items of cost data or... impact of uncertainty on the calculation of life cycle cost effectiveness or the assignment of rank order... and probabilistic analysis. If additional analysis casts substantial doubt on the life cycle...

  11. Two basic Uncertainty Relations in Quantum Mechanics

    SciTech Connect

    Angelow, Andrey

    2011-04-07

    In the present article, we discuss two types of uncertainty relations in Quantum Mechanics: multiplicative and additive inequalities for two canonical observables. The multiplicative uncertainty relation was discovered by Heisenberg. A few years later (1930), Erwin Schroedinger generalized it and made it more precise than the original. The additive uncertainty relation is based on the three independent statistical moments in Quantum Mechanics: Cov(q,p), Var(q) and Var(p). We discuss the symmetry exhibited by both types of relations and the applicability of the additive form for the estimation of the total error.

  12. Two basic Uncertainty Relations in Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Angelow, Andrey

    2011-04-01

    In the present article, we discuss two types of uncertainty relations in Quantum Mechanics: multiplicative and additive inequalities for two canonical observables. The multiplicative uncertainty relation was discovered by Heisenberg. A few years later (1930), Erwin Schrödinger generalized it and made it more precise than the original. The additive uncertainty relation is based on the three independent statistical moments in Quantum Mechanics: Cov(q,p), Var(q) and Var(p). We discuss the symmetry exhibited by both types of relations and the applicability of the additive form for the estimation of the total error.
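The two forms discussed in this record can be written explicitly. Note this is a sketch: the multiplicative bound is the standard Schrödinger relation, while the additive bound shown is the one that follows from it via the arithmetic-geometric mean inequality (q and p assumed rescaled to common units so the sum is dimensionally meaningful); the paper's exact additive form may differ:

```latex
% Multiplicative (Heisenberg 1927, sharpened by Schroedinger 1930):
\operatorname{Var}(q)\,\operatorname{Var}(p)
  \;\ge\; \frac{\hbar^{2}}{4} + \operatorname{Cov}(q,p)^{2}

% Additive consequence, via a + b >= 2*sqrt(ab) for a, b >= 0:
\operatorname{Var}(q) + \operatorname{Var}(p)
  \;\ge\; 2\sqrt{\frac{\hbar^{2}}{4} + \operatorname{Cov}(q,p)^{2}}
```

The multiplicative form involves the three moments Var(q), Var(p) and Cov(q,p) exactly as the abstract states; setting Cov(q,p) = 0 recovers Heisenberg's original inequality.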

  13. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  14. Uncertainty in tsunami sediment transport modeling

    USGS Publications Warehouse

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  15. Uncertainty and Cognitive Control

    PubMed Central

    Mushtaq, Faisal; Bland, Amy R.; Schaefer, Alexandre

    2011-01-01

    A growing trend of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty are still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the “need for control”; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders. PMID:22007181

  16. Quantifying uncertainty in LCA-modelling of waste management systems.

    PubMed

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented, but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objectives of this paper are (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4), as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
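Steps 2 and 3 of the sequence described in this record can be sketched for a toy model. The model form, parameter names, distributions and all numbers below are assumptions for illustration, not the paper's case study:

```python
import random
import statistics

random.seed(42)

# Toy waste-LCA model (assumed form): net impact equals electricity
# consumption times a grid emission factor, minus an energy-recovery credit.
def impact(elec, grid, recov, credit):
    return elec * grid - recov * credit

# Input uncertainties as (mean, standard deviation) of normal distributions.
params = {
    "elec":   (120.0, 12.0),  # kWh consumed per functional unit
    "grid":   (0.5,   0.05),  # kg CO2-eq per kWh
    "recov":  (30.0,  6.0),   # kWh recovered per functional unit
    "credit": (0.4,   0.04),  # kg CO2-eq credited per kWh recovered
}

# Step 2: uncertainty propagation by Monte Carlo sampling.
N = 20_000
samples = [
    impact(**{k: random.gauss(mu, sd) for k, (mu, sd) in params.items()})
    for _ in range(N)
]
mean = statistics.mean(samples)
sd = statistics.stdev(samples)

# Step 3: contribution analysis -- vary one input at a time (others held
# at their means) and compare each output spread to the total spread.
means = {k: mu for k, (mu, _) in params.items()}
contrib = {}
for k, (mu, sd_k) in params.items():
    outs = []
    for _ in range(N):
        d = dict(means)
        d[k] = random.gauss(mu, sd_k)
        outs.append(impact(**d))
    contrib[k] = statistics.stdev(outs)

print(f"net impact = {mean:.1f} +/- {sd:.1f} kg CO2-eq")
for k, v in sorted(contrib.items(), key=lambda kv: -kv[1]):
    print(f"  {k}: +/-{v:.1f}")
```

Ranking `contrib` identifies the parameters whose uncertainties dominate the result, which is what lets the tiered approach propagate only the most influential ones.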

  17. Systematic reviews.

    PubMed

    Milner, Kerry A

    2015-01-01

    Systematic reviews are a type of literature review in which authors systematically search for, critically appraise, and synthesize evidence from several studies on the same topic (Grant & Booth, 2009). The precise and systematic method differentiates systematic reviews from traditional reviews (Khan, Kunz, Kleijnen, & Antes, 2003). In all types of systematic reviews, a quality assessment is done of the individual studies that meet inclusion criteria. These individual assessments are synthesized, and aggregated results are reported. Systematic reviews are considered the highest level of evidence in evidence-based health care because the reviewers strive to use transparent, rigorous methods that minimize bias.

  18. RUMINATIONS ON NDA MEASUREMENT UNCERTAINTY COMPARED TO DA UNCERTAINTY

    SciTech Connect

    Salaymeh, S.; Ashley, W.; Jeffcoat, R.

    2010-06-17

    It is difficult to overestimate the importance that physical measurements performed with nondestructive assay instruments play throughout the nuclear fuel cycle. They underpin decision making in many areas and support criticality safety, radiation protection, process control, safeguards, facility compliance, and waste measurements. No physical measurement is complete, or indeed meaningful, without a defensible and appropriate accompanying statement of uncertainties and how they combine to define the confidence in the results. The uncertainty budget should also be broken down in sufficient detail for the subsequent uses to which the nondestructive assay (NDA) results will be applied. Creating an uncertainty budget and estimating the total measurement uncertainty can often be an involved process, especially for non-routine situations, because data interpretation often involves complex algorithms and logic combined in a highly intertwined way, and the methods often call on a multitude of input data subject to human oversight. These characteristics can be confusing and pose a barrier to shared understanding between experts and data consumers. ASTM subcommittee C26-10 recognized this problem in the context of how to summarize and express precision and bias performance across the range of standards and guides it maintains. In order to create a unified approach consistent with modern practice and embracing the continuous-improvement philosophy, a consensus arose to prepare a procedure covering the estimation and reporting of uncertainties in nondestructive assay of nuclear materials. This paper outlines the needs analysis, objectives, and ongoing development efforts. In addition to emphasizing some of the unique challenges and opportunities facing the NDA community, we hope this article will encourage dialog and the sharing of best practice, and furthermore motivate developers to revisit the treatment of measurement uncertainty.

  19. Quantifying uncertainty from material inhomogeneity.

    SciTech Connect

    Battaile, Corbett Chandler; Emery, John M.; Brewer, Luke N.; Boyce, Brad Lee

    2009-09-01

    Most engineering materials are inherently inhomogeneous in their processing, internal structure, properties, and performance. Their properties are therefore statistical rather than deterministic. These inhomogeneities manifest across multiple length and time scales, leading to variabilities, i.e. statistical distributions, that are necessary to accurately describe each stage in the process-structure-properties hierarchy, and are ultimately the primary source of uncertainty in performance of the material and component. When localized events are responsible for component failure, or when component dimensions are on the order of microstructural features, this uncertainty is particularly important. For ultra-high reliability applications, the uncertainty is compounded by a lack of data describing the extremely rare events. Hands-on testing alone cannot supply sufficient data for this purpose. To date, there is no robust or coherent method to quantify this uncertainty so that it can be used in a predictive manner at the component length scale. The research presented in this report begins to address this lack of capability through a systematic study of the effects of microstructure on the strain concentration at a hole. To achieve the strain concentration, small circular holes (approximately 100 µm in diameter) were machined into brass tensile specimens using a femto-second laser. The brass was annealed at 450 °C, 600 °C, and 800 °C to produce three hole-to-grain size ratios of approximately 7, 1, and 1/7. Electron backscatter diffraction experiments were used to guide the construction of digital microstructures for finite element simulations of uniaxial tension. Digital image correlation experiments were used to qualitatively validate the numerical simulations. The simulations were performed iteratively to generate statistics describing the distribution of plastic strain at the hole in varying microstructural environments.
In both the experiments and simulations, the

  20. Uncertainty in environmental health impact assessment: quantitative methods and perspectives.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Vanni, Tazio; Foss, Anna M

    2013-01-01

    Environmental health impact assessment models are subject to great uncertainty due to the complex associations between environmental exposures and health. Quantifying the impact of uncertainty is important if the models are used to support health policy decisions. We conducted a systematic review to identify and appraise current methods used to quantify uncertainty in environmental health impact assessment. In the 19 studies meeting the inclusion criteria, several methods were identified. These were grouped into random sampling methods, second-order probability methods, Bayesian methods, fuzzy sets, and deterministic sensitivity analysis methods. All 19 studies addressed uncertainty in the parameter values, but only 5 of the studies also addressed uncertainty in the structure of the models. None of the articles reviewed considered conceptual sources of uncertainty associated with the framing assumptions or the conceptualisation of the model. Future research should attempt to broaden the way uncertainty is taken into account in environmental health impact assessments.

  1. Ground-based imaging remote sensing of ice clouds: uncertainties caused by sensor, method and atmosphere

    NASA Astrophysics Data System (ADS)

    Zinner, Tobias; Hausmann, Petra; Ewald, Florian; Bugliaro, Luca; Emde, Claudia; Mayer, Bernhard

    2016-09-01

    In this study a method is introduced for the retrieval of optical thickness and effective particle size of ice clouds over a wide range of optical thickness from ground-based transmitted radiance measurements. The low optical thickness of cirrus clouds and their complex microphysics present a challenge for cloud remote sensing. In transmittance, the relationship between optical depth and radiance is ambiguous. To resolve this ambiguity, the retrieval utilizes the spectral slope of radiance between 485 and 560 nm in addition to the commonly employed combination of a visible and a short-wave infrared wavelength. An extensive test of retrieval sensitivity was conducted using synthetic test spectra in which all parameters introducing uncertainty into the retrieval were varied systematically: ice crystal habit and aerosol properties, instrument noise, calibration uncertainty, and the interpolation in the lookup table required by the retrieval process. The most important source of error identified is the uncertainty due to the habit assumption: averaged over all test spectra, systematic biases in the effective radius retrieval of several micrometres can arise. The statistical uncertainties of any individual retrieval can easily exceed 10 µm. Optical thickness biases are mostly below 1, while statistical uncertainties are in the range of 1 to 2.5. For demonstration and comparison to satellite data, the retrieval is applied to observations by the Munich hyperspectral imager specMACS (spectrometer of the Munich Aerosol and Cloud Scanner) at the Schneefernerhaus observatory (2650 m a.s.l.) during the ACRIDICON-Zugspitze campaign in September and October 2012. Results are compared to MODIS and SEVIRI satellite-based cirrus retrievals (ACRIDICON - Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems; MODIS - Moderate Resolution Imaging Spectroradiometer; SEVIRI - Spinning Enhanced Visible and Infrared Imager). Considering the identified

  2. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits.

    PubMed

    Dankovic, D A; Naumann, B D; Maier, A; Dourson, M L; Levy, L S

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full lifetime exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties, typically related to exposure scenarios or to the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible.
The increased application of scientific data in the development of uncertainty factors for individual chemicals also has

  3. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
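The Monte Carlo approach this record proposes can be sketched for one simple flow-duration signature. The flow series, the multiplicative error model, and the 15% error magnitude below are assumptions for illustration, not the paper's catchment data:

```python
import random

random.seed(1)

# Toy daily flow series (m^3/s): an exponential recession on a baseflow.
flow = [2.0 + 8.0 * 0.95 ** t for t in range(365)]

def signature(q):
    # Flow-duration-curve signature: ratio of a high-flow quantile (Q5,
    # exceeded 5% of the time) to the median flow (Q50).
    s = sorted(q, reverse=True)
    return s[int(0.05 * len(s))] / s[int(0.50 * len(s))]

base = signature(flow)

# Monte Carlo: perturb each observation with a multiplicative error
# (15% at 1 sigma, a stand-in for rating-curve/measurement uncertainty,
# clipped to stay positive) and recompute the signature.
vals = []
for _ in range(5000):
    noisy = [f * max(random.gauss(1.0, 0.15), 0.05) for f in flow]
    vals.append(signature(noisy))
vals.sort()
lo = vals[int(0.025 * len(vals))]
hi = vals[int(0.975 * len(vals))]

print(f"Q5/Q50 = {base:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

Note that a purely systematic (single-multiplier) discharge error would cancel in a ratio signature like Q5/Q50; it is the independent per-observation errors sampled here that spread the signature, illustrating the abstract's point that signatures can be designed to be robust to some uncertainty sources but not others.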

  4. [Ethics, empiricism and uncertainty].

    PubMed

    Porz, R; Zimmermann, H; Exadaktylos, A K

    2011-01-01

    Accidents can lead to difficult boundary situations. Such situations often take place in emergency units. The medical team thus often and inevitably faces professional uncertainty in its decision-making. It is essential to communicate these uncertainties within the medical team, instead of downplaying or overriding existential hurdles in decision-making. Acknowledging uncertainties might lead to alert and prudent decisions, and uncertainty can thus have ethical value in decisions about treatment or the withdrawal of treatment. It need not be covered by evidence-based arguments, especially as some singular situations of individual tragedy cannot be grasped in terms of evidence-based medicine.

  5. The effectiveness of selected feed and water additives for reducing Salmonella spp. of public health importance in broiler chickens: a systematic review, meta-analysis, and meta-regression approach.

    PubMed

    Totton, Sarah C; Farrar, Ashley M; Wilkins, Wendy; Bucher, Oliver; Waddell, Lisa A; Wilhelm, Barbara J; McEwen, Scott A; Rajić, Andrijana

    2012-10-01

    Eating inappropriately prepared poultry meat is a major cause of foodborne salmonellosis. Our objectives were to determine the efficacy of feed and water additives (other than competitive exclusion and antimicrobials) in reducing Salmonella prevalence or concentration in broiler chickens using systematic review-meta-analysis, and to explore sources of heterogeneity found in the meta-analysis through meta-regression. Six electronic databases were searched (Current Contents (1999-2009), Agricola (1924-2009), MEDLINE (1860-2009), Scopus (1960-2009), Centre for Agricultural Bioscience (CAB) (1913-2009), and CAB Global Health (1971-2009)), five topic experts were contacted, and the bibliographies of review articles and a topic-relevant textbook were manually searched to identify all relevant research. Study inclusion criteria comprised English-language primary research investigating the effects of feed and water additives on Salmonella prevalence or concentration in broiler chickens. Data extraction and study methodological assessment were conducted by two reviewers independently using pretested forms. Seventy challenge studies (n=910 unique treatment-control comparisons), seven controlled studies (n=154), and one quasi-experiment (n=1) met the inclusion criteria. Compared to an assumed control-group prevalence of 44 per 1000 broilers, random-effects meta-analysis indicated that the prevalence of Salmonella cecal colonization was 15 per 1000 broilers in groups with prebiotics (fructooligosaccharide, lactose, whey, dried milk, lactulose, lactosucrose, sucrose, maltose, mannanoligosaccharide) added to feed or water; 10 per 1000 broilers with lactose added to feed or water; and 21 per 1000 with experimental chlorate product (ECP) added to feed or water. For ECP, the concentration of Salmonella in the ceca was decreased by 0.61 log(10)cfu/g in the treated group compared to the control group. Significant heterogeneity (Cochran's Q-statistic p≤0.10) was observed

  6. Saccade Adaptation and Visual Uncertainty

    PubMed Central

    Souto, David; Gegenfurtner, Karl R.; Schütz, Alexander C.

    2016-01-01

    Visual uncertainty may affect saccade adaptation in two complementary ways. First, an ideal adaptor should take into account the reliability of visual information when determining the amount of correction, predicting that increasing visual uncertainty should decrease adaptation rates. We tested this by comparing observers' direction discrimination and adaptation rates in an intra-saccadic-step paradigm. Second, clearly visible target steps may generate a slower adaptation rate, since the error can be attributed to an external cause instead of an internal change in the visuo-motor mapping that needs to be compensated. We tested this prediction by measuring saccade adaptation to different step sizes. Most remarkably, we found little correlation between estimates of visual uncertainty and adaptation rates, and no slower adaptation rates with more visible step sizes. Additionally, we show that for low-contrast targets, backward steps are perceived as stationary after the saccade, but that adaptation rates are independent of contrast. We suggest that the saccadic system uses different position signals for adapting dysmetric saccades and for generating a trans-saccadic stable visual percept, explaining why saccade adaptation is found to be independent of visual uncertainty. PMID:27252635

  7. Sensitivity and uncertainty investigations for Hiroshima dose estimates and the applicability of the Little Boy mockup measurements

    SciTech Connect

    Bartine, D.E.; Cacuci, D.G.

    1983-09-13

    This paper describes sources of uncertainty in the data used for calculating dose estimates for the Hiroshima explosion and details a methodology for systematically obtaining best estimates and reduced uncertainties for the radiation doses received. (ACR)

  8. Not Normal: the uncertainties of scientific measurements

    PubMed Central

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student’s t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply. PMID:28280557

  9. Not Normal: the uncertainties of scientific measurements

    NASA Astrophysics Data System (ADS)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.

  10. Electoral Knowledge and Uncertainty.

    ERIC Educational Resources Information Center

    Blood, R. Warwick; And Others

    Research indicates that the media play a role in shaping the information that voters have about election options. Knowledge of those options has been related to actual vote, but has not been shown to be strongly related to uncertainty. Uncertainty, however, does seem to motivate voters to engage in communication activities, some of which may…

  11. Entanglement and discord assisted entropic uncertainty relations under decoherence

    NASA Astrophysics Data System (ADS)

    Yao, ChunMei; Chen, ZhiHua; Ma, ZhiHao; Severini, Simone; Serafini, Alessio

    2014-09-01

    The uncertainty principle is a crucial aspect of quantum mechanics. It has been shown that quantum entanglement, as well as more general notions of correlations such as quantum discord, can relax or tighten the entropic uncertainty relation in the presence of an ancillary system. We explored the behaviour of entropic uncertainty relations for a system of two qubits, one of which is subject to several forms of independent quantum noise, in both Markovian and non-Markovian regimes. The uncertainties and their lower bounds, identified by the entropic uncertainty relations, increase under independent local unital Markovian noisy channels, but they may decrease under non-unital channels. The uncertainties (and their lower bounds) exhibit periodic oscillations due to correlation dynamics under independent non-Markovian reservoirs. In addition, we compare different entropic uncertainty relations in several special cases and find that discord-tightened entropic uncertainty relations offer in general a better estimate of the uncertainties in play.
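For reference, the memory-assisted entropic uncertainty relation that such analyses typically start from (the Berta et al. bound, stated here as background rather than as this record's result) reads:

```latex
\[
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{j,k} \bigl|\langle \psi_j \vert \phi_k \rangle\bigr|^2 ,
\]
```

where S(Q|B) and S(R|B) are the conditional von Neumann entropies of the outcomes of measuring observables Q and R on system A given the quantum memory B, c is the maximal overlap between the observables' eigenvectors, and a negative S(A|B) (signalling entanglement) tightens the bound.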

  12. Intolerance of Uncertainty

    PubMed Central

    Beier, Meghan L.

    2015-01-01

    Multiple sclerosis (MS) is a chronic and progressive neurologic condition that, by its nature, carries uncertainty as a hallmark characteristic. Although all patients face uncertainty, there is variability in how individuals cope with its presence. In other populations, the concept of “intolerance of uncertainty” has been conceptualized to explain this variability such that individuals who have difficulty tolerating the possibility of future occurrences may engage in thoughts or behaviors by which they attempt to exert control over that possibility or lessen the uncertainty but may, as a result, experience worse outcomes, particularly in terms of psychological well-being. This topical review introduces MS-focused researchers, clinicians, and patients to intolerance of uncertainty, integrates the concept with what is already understood about coping with MS, and suggests future steps for conceptual, assessment, and treatment-focused research that may benefit from integrating intolerance of uncertainty as a central feature. PMID:26300700

  13. Economic uncertainty and econophysics

    NASA Astrophysics Data System (ADS)

    Schinckus, Christophe

    2009-10-01

    The objective of this paper is to provide a methodological link between econophysics and economics. I will study a key notion of both fields: uncertainty, and the ways of thinking about it developed by the two disciplines. After having presented the main economic theories of uncertainty (provided by Knight, Keynes and Hayek), I show how this notion is paradoxically excluded from the economic field. In economics, uncertainty is totally reduced by an a priori Gaussian framework, in contrast to econophysics, which does not use a priori models because it works directly on data. Uncertainty is then not shaped by a specific model, and is partially and temporarily reduced as models improve. This way of thinking about uncertainty has echoes in the economic literature. By presenting econophysics as a Knightian method, and as a complementary approach to a Hayekian framework, this paper shows that econophysics can be methodologically justified from an economic point of view.

  14. Physical Uncertainty Bounds (PUB)

    SciTech Connect

    Vaughan, Diane Elizabeth; Preston, Dean L.

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  15. Uncertainty for Part Density Determination: An Update

    SciTech Connect

    Valdez, Mario Orlando

    2016-12-14

    Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these liquid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided; an original example is revised with the updated derivations; and an appendix is provided on uncertainty evaluation using Monte Carlo techniques, specifically the NIST Uncertainty Machine, as a viable alternative method.
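As a rough illustration of the Monte Carlo alternative mentioned here, the sketch below propagates hypothetical balance and density uncertainties through a textbook hydrostatic-weighing formula (Archimedes' principle with an air-buoyancy correction). The formula and all numerical values are illustrative assumptions, not taken from the report:

```python
import random
import statistics

def part_density(m_air, m_water, rho_water, rho_air):
    # Archimedes' principle with air-buoyancy correction (textbook form,
    # assumed here; not necessarily the report's exact model)
    return m_air / (m_air - m_water) * (rho_water - rho_air) + rho_air

random.seed(0)
N = 100_000
samples = [
    part_density(
        random.gauss(250.000, 0.002),    # apparent weight in air, g (hypothetical)
        random.gauss(218.700, 0.002),    # apparent weight in water, g
        random.gauss(0.99820, 0.00002),  # water density, g/cm^3
        random.gauss(0.00120, 0.00001),  # air density, g/cm^3
    )
    for _ in range(N)
]
rho_hat = statistics.mean(samples)   # best estimate of part density
u_rho = statistics.stdev(samples)    # standard uncertainty from the MC spread
```

The spread of the simulated densities directly yields the standard uncertainty, with no analytic sensitivity coefficients required; this is the same idea the NIST Uncertainty Machine automates.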

  16. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches.

  17. The Scientific Basis of Uncertainty Factors Used in Setting Occupational Exposure Limits

    PubMed Central

    Dankovic, D. A.; Naumann, B. D.; Maier, A.; Dourson, M. L.; Levy, L. S.

    2015-01-01

    The uncertainty factor concept is integrated into health risk assessments for all aspects of public health practice, including by most organizations that derive occupational exposure limits. The use of uncertainty factors is predicated on the assumption that a sufficient reduction in exposure from those at the boundary for the onset of adverse effects will yield a safe exposure level for at least the great majority of the exposed population, including vulnerable subgroups. There are differences in the application of the uncertainty factor approach among groups that conduct occupational assessments; however, there are common areas of uncertainty which are considered by all or nearly all occupational exposure limit-setting organizations. Five key uncertainties that are often examined include interspecies variability in response when extrapolating from animal studies to humans, response variability in humans, uncertainty in estimating a no-effect level from a dose where effects were observed, extrapolation from shorter duration studies to a full life-time exposure, and other insufficiencies in the overall health effects database indicating that the most sensitive adverse effect may not have been evaluated. In addition, a modifying factor is used by some organizations to account for other remaining uncertainties—typically related to exposure scenarios or accounting for the interplay among the five areas noted above. Consideration of uncertainties in occupational exposure limit derivation is a systematic process whereby the factors applied are not arbitrary, although they are mathematically imprecise. As the scientific basis for uncertainty factor application has improved, default uncertainty factors are now used only in the absence of chemical-specific data, and the trend is to replace them with chemical-specific adjustment factors whenever possible. The increased application of scientific data in the development of uncertainty factors for individual chemicals also
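The arithmetic behind the uncertainty factor approach is a simple division of a point of departure by the product of the individual factors. A minimal sketch with hypothetical values (these are illustrative only, not recommendations of any OEL-setting body):

```python
# Hypothetical point of departure, e.g. a NOAEL from an animal
# inhalation study, in mg/m^3
pod_mg_m3 = 50.0

# One factor per area of uncertainty named in the abstract (invented values)
uncertainty_factors = {
    "interspecies": 3.0,    # animal-to-human extrapolation
    "intraspecies": 10.0,   # response variability among humans
    "loael_to_noael": 1.0,  # point of departure is already a no-effect level
    "duration": 2.0,        # shorter-duration study to lifetime exposure
    "database": 1.0,        # completeness of the health-effects database
}

composite_uf = 1.0
for uf in uncertainty_factors.values():
    composite_uf *= uf

oel = pod_mg_m3 / composite_uf  # candidate exposure limit, mg/m^3
```

Replacing a default entry with a chemical-specific adjustment factor, when data allow, changes only the corresponding dictionary value; the rest of the calculation is unchanged.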

  18. Impact of model defect and experimental uncertainties on evaluated output

    NASA Astrophysics Data System (ADS)

    Neudecker, D.; Capote, R.; Leeb, H.

    2013-09-01

    One of the current major problems in nuclear data evaluation is the unreasonably small evaluated uncertainties often obtained. These small uncertainties are partly attributed to missing correlations of experimental uncertainties as well as to deficiencies of the model employed for the prior information. In this article, both uncertainty sources are included in an evaluation of 55Mn cross-sections for incident neutrons. Their impact on the evaluated output is studied using a prior obtained by the Full Bayesian Evaluation Technique and a prior obtained by the nuclear model program EMPIRE. It is shown analytically and by means of an evaluation that unreasonably small evaluated uncertainties can be obtained not only if correlated systematic uncertainties of the experiment are neglected but also if prior uncertainties are smaller or about the same magnitude as the experimental ones. Furthermore, it is shown that including model defect uncertainties in the evaluation of 55Mn leads to larger evaluated uncertainties for channels where the model is deficient. It is concluded that including correlated experimental uncertainties is equally important as model defect uncertainties, if the model calculations deviate significantly from the measurements.

  19. Assessing MODIS Macrophysical Cloud Property Uncertainties

    NASA Astrophysics Data System (ADS)

    Maddux, B. C.; Ackerman, S. A.; Frey, R.; Holz, R.

    2013-12-01

    Cloud, being multifarious and ephemeral, is difficult to observe and quantify in a systematic way. Even basic terminology used to describe cloud observations is fraught with ambiguity in the scientific literature. Any observational technique, method, or platform will contain inherent and unavoidable measurement uncertainties. Quantifying these uncertainties in cloud observations is a complex task that requires an understanding of all aspects of the measurement. We will use cloud observations obtained from the Moderate Resolution Imaging Spectroradiometer (MODIS) to obtain metrics of the uncertainty of its cloud observations. Our uncertainty analyses will contain two main components: 1) an assessment of bias and uncertainty with respect to active measurements from CALIPSO, and 2) a relative uncertainty within the MODIS cloud climatologies themselves. Our method will link uncertainty to the physical observation and its environmental/scene characteristics. Our aim is to create statistical uncertainties, based on the cloud observational values, satellite view geometry, surface type, etc., for cloud amount and cloud-top pressure. The MODIS instruments on the NASA Terra and Aqua satellites provide observations over a broad spectral range (36 bands between 0.415 and 14.235 micron) and high spatial resolution (250 m for two bands, 500 m for five bands, 1000 m for 29 bands), which the MODIS cloud mask algorithm (MOD35) utilizes to provide clear/cloud determinations over a wide array of surface types, solar illuminations and view geometries. For this study we use the standard MODIS products, MOD03, MOD06 and MOD35, all of which were obtained from the NASA Level 1 and Atmosphere Archive and Distribution System.

  20. Comparison of methods for the estimation of measurement uncertainty for an analytical method for sulphonamides.

    PubMed

    Dabalus Islam, M; Schweikert Turcu, M; Cannavan, A

    2008-12-01

    A simple and inexpensive liquid chromatographic method for the determination of seven sulphonamides in animal tissues was validated. The measurement uncertainty of the method was estimated using two approaches: a 'top-down' approach based on in-house validation data, which used either repeatability data or intra-laboratory reproducibility; and a 'bottom-up' approach, which included repeatability data from spiking experiments. The decision limits (CCalpha) applied in the European Union were calculated for comparison. The bottom-up approach was used to identify critical steps in the analytical procedure, which comprised extraction, concentration, hexane-wash and HPLC-UV analysis. Six replicates of porcine kidney were fortified at the maximum residue limit (100 microg kg(-1)) at three different stages of the analytical procedure, extraction, evaporation, and final wash/HPLC analysis, to provide repeatability data for each step. The uncertainties of the gravimetric and volumetric measurements were estimated and integrated in the calculation of the total combined uncertainties by the bottom-up approach. Estimates for systematic error components were included in both approaches. Combined uncertainty estimates for the seven compounds using the 'top-down' approach ranged from 7.9 to 12.5% (using reproducibility) and from 5.4 to 9.5% (using repeatability data) and from 5.1 to 9.0% using the bottom-up approach. CCalpha values ranged from 105.6 to 108.5 microg kg(-1). The major contributor to the combined uncertainty for each analyte was identified as the extraction step. Since there was no statistical difference between the uncertainty values obtained by either approach, the analyst would be justified in applying the 'top-down' estimation using method validation data, rather than performing additional experiments to obtain uncertainty data.
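The "bottom-up" combination described here amounts to adding the relative standard uncertainties of the individual steps in quadrature. A minimal sketch with hypothetical per-step values (the step names follow the abstract; the numbers are invented):

```python
import math

# Relative standard uncertainty (%) contributed by each analytical step
steps = {
    "extraction": 6.5,
    "evaporation": 2.0,
    "wash_hplc": 1.5,
    "gravimetric": 0.3,
    "volumetric": 0.5,
}

# Bottom-up combined uncertainty: root sum of squares of the components
u_combined = math.sqrt(sum(u ** 2 for u in steps.values()))

# Largest single contributor (the extraction step, as in the study)
dominant = max(steps, key=steps.get)

# Expanded uncertainty at roughly 95% coverage (coverage factor k = 2)
U_expanded = 2.0 * u_combined
```

Because the components add in quadrature, the largest term dominates: here the extraction step alone accounts for most of the combined value, which is why it is the natural target for method improvement.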

  1. Fragility, uncertainty, and healthcare.

    PubMed

    Rogers, Wendy A; Walker, Mary J

    2016-02-01

    Medicine seeks to overcome one of the most fundamental fragilities of being human, the fragility of good health. No matter how robust our current state of health, we are inevitably susceptible to future illness and disease, while current disease serves to remind us of various frailties inherent in the human condition. This article examines the relationship between fragility and uncertainty with regard to health, and argues that there are reasons to accept rather than deny at least some forms of uncertainty. In situations of current ill health, both patients and doctors seek to manage this fragility through diagnoses that explain suffering and provide some certainty about prognosis as well as treatment. However, both diagnosis and prognosis are inevitably uncertain to some degree, leading to questions about how much uncertainty health professionals should disclose, and how to manage when diagnosis is elusive, leaving patients in uncertainty. We argue that patients can benefit when they are able to acknowledge, and appropriately accept, some uncertainty. Healthy people may seek to protect the fragility of their good health by undertaking preventative measures including various tests and screenings. However, these attempts to secure oneself against the onset of biological fragility can cause harm by creating rather than eliminating uncertainty. Finally, we argue that there are good reasons for accepting the fragility of health, along with the associated uncertainties.

  2. Path planning under spatial uncertainty.

    PubMed

    Wiener, Jan M; Lafon, Matthieu; Berthoz, Alain

    2008-04-01

    In this article, we present experiments studying path planning under spatial uncertainty. In the main experiment, the participants' task was to navigate the shortest possible path to find an object hidden in one of four places and to bring it to the final destination. The probability of finding the object (the probability matrix) was different for each of the four places and varied between conditions. Given such uncertainty about the object's location, planning a single path is not sufficient. Participants had to generate multiple consecutive plans (metaplans)--for example: if the object is found in A, proceed to the destination; if the object is not found, proceed to B; and so on. The optimal solution depends on the specific probability matrix. In each condition, participants learned a different probability matrix and were then asked to report the optimal metaplan. Results demonstrate effective integration of the probabilistic information about the object's location during planning. We present a hierarchical planning scheme that could account for participants' behavior, as well as for systematic errors and differences between conditions.
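A metaplan of this kind can be scored by its expected travel distance: visit the hiding places in some order and detour to the destination as soon as the object is found. The sketch below (hypothetical layout and probability matrix, not the experiment's actual geometry) brute-forces the optimal search order:

```python
import math
from itertools import permutations

# Hypothetical layout: start, destination, and four hiding places A-D
start, dest = (0.0, 0.0), (10.0, 0.0)
places = {"A": (2.0, 3.0), "B": (4.0, -2.0), "C": (7.0, 3.0), "D": (8.0, -1.0)}
prob = {"A": 0.5, "B": 0.2, "C": 0.2, "D": 0.1}  # hypothetical probability matrix

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def expected_cost(order):
    """Expected travel distance of a metaplan: visit places in `order`,
    heading to `dest` as soon as the object is found."""
    cost, walked, pos = 0.0, 0.0, start
    for name in order:
        walked += dist(pos, places[name])
        pos = places[name]
        # Object is found here with probability prob[name]
        cost += prob[name] * (walked + dist(pos, dest))
    return cost

best = min(permutations(places), key=expected_cost)  # optimal metaplan
```

With four places there are only 4! = 24 orderings, so exhaustive search is trivial; the optimal order generally balances visiting high-probability places early against the detour cost of reaching them.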

  3. Uncertainty of upland soil carbon sink estimate for Finland

    NASA Astrophysics Data System (ADS)

    Lehtonen, Aleksi; Heikkinen, Juha

    2016-04-01

    Changes in the soil carbon stock of Finnish upland soils were quantified using forest inventory data, forest statistics, biomass models, litter turnover rates, and the Yasso07 soil model. Uncertainty in the estimated stock changes was assessed by combining model and sampling errors associated with the various data sources into variance-covariance matrices that allowed computationally efficient error propagation in the context of Yasso07 simulations. In sensitivity analysis, we found that the uncertainty increased drastically as a result of adding random year-to-year variation to the litter input. Such variation is smoothed out when using periodic inventory data with constant biomass models and turnover rates. Model errors (biomass, litter, understorey vegetation) and the systematic error of total drain had a marginal effect on the uncertainty regarding soil carbon stock change. Most of the uncertainty appears to be related to uncaptured annual variation in litter amounts. This is due to the fact that variation in the slopes of litter input trends dictates the uncertainty of soil carbon stock change. If we assume that there is annual variation only in foliage and fine root litter rates and that this variation is less than 10% from year to year, then we can claim that Finnish upland forest soils have accumulated carbon during the first Kyoto period (2008-2012). The results of the study underline the superiority of permanent sample plots over temporary ones when soil model litter input trends are estimated from forest inventory data. In addition, we found that the use of IPCC guidelines leads to underestimation of the uncertainty of soil carbon stock change. This underestimation results from the guidance to remove inter-annual variation from the model inputs, here illustrated with constant litter life spans. Model assumptions and model input estimation should be evaluated critically when GHG-inventory results are used for policy planning.

  4. Mutually Exclusive Uncertainty Relations

    NASA Astrophysics Data System (ADS)

    Xiao, Yunlong; Jing, Naihuan

    2016-11-01

    The uncertainty principle is one of the characteristic properties of quantum theory, based on incompatibility. Apart from the incompatibility relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as to their multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds.

  5. Mutually Exclusive Uncertainty Relations.

    PubMed

    Xiao, Yunlong; Jing, Naihuan

    2016-11-08

    The uncertainty principle is one of the characteristic properties of quantum theory, based on incompatibility. Apart from the incompatibility relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as to their multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds.

  6. Mutually Exclusive Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan

    2016-01-01

    The uncertainty principle is one of the characteristic properties of quantum theory, based on incompatibility. Apart from the incompatibility relation of quantum states, mutual exclusiveness is another remarkable phenomenon in the information-theoretic foundation of quantum theory. We investigate the role of mutually exclusive physical states in the recent work on stronger uncertainty relations for all incompatible observables by Maccone and Pati, and generalize the weighted uncertainty relation to the product form as well as to their multi-observable analogues. The new bounds capture both incompatibility and mutual exclusiveness, and are tighter than the existing bounds. PMID:27824161

  7. Optimal Universal Uncertainty Relations

    PubMed Central

    Li, Tao; Xiao, Yunlong; Ma, Teng; Fei, Shao-Ming; Jing, Naihuan; Li-Jost, Xianqing; Wang, Zhi-Xi

    2016-01-01

    We study universal uncertainty relations and present a method called joint probability distribution diagram to improve the majorization bounds constructed independently in [Phys. Rev. Lett. 111, 230401 (2013)] and [J. Phys. A. 46, 272002 (2013)]. The results give rise to state independent uncertainty relations satisfied by any nonnegative Schur-concave functions. On the other hand, a remarkable recent result of entropic uncertainty relation is the direct-sum majorization relation. In this paper, we illustrate our bounds by showing how they provide a complement to that in [Phys. Rev. A. 89, 052115 (2014)]. PMID:27775010

  8. On the relationship between aerosol model uncertainty and radiative forcing uncertainty

    PubMed Central

    Reddington, Carly L.; Carslaw, Kenneth S.

    2016-01-01

    The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple “equifinal” models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model−observation agreement could give a misleading impression of model robustness. PMID:26848136

  9. Extrapolation, uncertainty factors, and the precautionary principle.

    PubMed

    Steel, Daniel

    2011-09-01

    This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards.

  10. Reducing Zero-point Systematics in Dark Energy Supernova Experiments

    SciTech Connect

    Faccioli, Lorenzo; Kim, Alex G; Miquel, Ramon; Bernstein, Gary; Bonissent, Alain; Brown, Matthew; Carithers, William; Christiansen, Jodi; Connolly, Natalia; Deustua, Susana; Gerdes, David; Gladney, Larry; Kushner, Gary; Linder, Eric; McKee, Shawn; Mostek, Nick; Shukla, Hemant; Stebbins, Albert; Stoughton, Chris; Tucker, David

    2011-04-01

    We study the effect of filter zero-point uncertainties on future supernova dark energy missions. Fitting for calibration parameters using simultaneous analysis of all Type Ia supernova standard candles achieves a significant improvement over more traditional fit methods. This conclusion is robust under diverse experimental configurations (number of observed supernovae, maximum survey redshift, inclusion of additional systematics). This approach to supernova fitting considerably eases otherwise stringent mission calibration requirements. As an example we simulate a space-based mission based on the proposed JDEM satellite; however, the method and conclusions are general and valid for any future supernova dark energy mission, ground- or space-based.

  11. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
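Monte Carlo propagation of the kind described can be sketched in a few lines: sample every elemental error source, recompute the derived quantity, and read the total uncertainty off the spread. The example below uses the standard isentropic Mach-pressure relation and illustrative numbers only (not the facility's actual data reduction), combining a shared systematic sensor offset with independent random scatter:

```python
import random
import statistics

def mach_from_pressures(p_total, p_static, gamma=1.4):
    # Isentropic relation: p0/p = (1 + (gamma-1)/2 * M^2)^(gamma/(gamma-1))
    ratio = (p_total / p_static) ** ((gamma - 1.0) / gamma)
    return ((ratio - 1.0) * 2.0 / (gamma - 1.0)) ** 0.5

random.seed(1)
machs = []
for _ in range(50_000):
    bias = random.gauss(0.0, 30.0)  # systematic offset shared by both sensors, Pa
    p0 = 180_000.0 + bias + random.gauss(0.0, 80.0)  # total pressure + random scatter
    ps = 50_000.0 + bias + random.gauss(0.0, 60.0)   # static pressure + random scatter
    machs.append(mach_from_pressures(p0, ps))

m_mean = statistics.mean(machs)   # average free-stream Mach number
u_mach = statistics.stdev(machs)  # total (random + systematic) uncertainty
```

Sampling the systematic offset once per trial and the random scatter independently per sensor is what lets the Monte Carlo separate (or, as here, combine) the random and systematic contributions that the paper reports individually.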

  12. Communicating scientific uncertainty.

    PubMed

    Fischhoff, Baruch; Davis, Alex L

    2014-09-16

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science.

  13. Uncertainty in chemistry.

    PubMed

    Menger, Fredric M

    2010-09-01

    It might come as a disappointment to some chemists, but just as there are uncertainties in physics and mathematics, there are some chemistry questions we may never know the answer to either, suggests Fredric M. Menger.

  14. Communicating scientific uncertainty

    PubMed Central

    Fischhoff, Baruch; Davis, Alex L.

    2014-01-01

    All science has uncertainty. Unless that uncertainty is communicated effectively, decision makers may put too much or too little faith in it. The information that needs to be communicated depends on the decisions that people face. Are they (i) looking for a signal (e.g., whether to evacuate before a hurricane), (ii) choosing among fixed options (e.g., which medical treatment is best), or (iii) learning to create options (e.g., how to regulate nanotechnology)? We examine these three classes of decisions in terms of how to characterize, assess, and convey the uncertainties relevant to each. We then offer a protocol for summarizing the many possible sources of uncertainty in standard terms, designed to impose a minimal burden on scientists, while gradually educating those whose decisions depend on their work. Its goals are better decisions, better science, and better support for science. PMID:25225390

  15. The Crucial Role of Error Correlation for Uncertainty Modeling of CFD-Based Aerodynamics Increments

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Walker, Eric L.

    2011-01-01

    The Ares I ascent aerodynamics database for Design Cycle 3 (DAC-3) was built from wind-tunnel test results and CFD solutions. The wind-tunnel results were used to build the baseline response surfaces for wind-tunnel Reynolds numbers at power-off conditions. The CFD solutions were used to build increments to account for Reynolds number effects. We calculate the validation errors for the primary CFD code results at wind-tunnel Reynolds number, power-off conditions and would like to be able to use those errors to predict the validation errors for the CFD increments. However, the validation errors are large compared to the increments. We suggest a way forward that is consistent with common practice in wind-tunnel testing, which is to assume that systematic errors in the measurement process and/or the environment will subtract out when increments are calculated, thus making increments more reliable, with smaller uncertainty, than absolute values of the aerodynamic coefficients. A similar practice has arisen for the use of CFD to generate aerodynamic database increments. The basis of this practice is the assumption of strong correlation of the systematic errors inherent in each of the results used to generate an increment. The assumption of strong correlation is the inferential link between the observed validation uncertainties at wind-tunnel Reynolds numbers and the uncertainties to be predicted for flight. In this paper, we suggest a way to estimate the correlation coefficient and demonstrate the approach using code-to-code differences that were obtained for quality control purposes during the Ares I CFD campaign. 
Finally, since we can expect the increments to be relatively small compared to the baseline response surface and to be typically of the order of the baseline uncertainty, we find that it is necessary to be able to show that the correlation coefficients are close to unity to avoid overinflating the overall database uncertainty with the addition of the increments.
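
The role of the correlation coefficient can be made concrete with a minimal sketch (hypothetical uncertainty values, not Ares I data): the uncertainty of an increment delta = C2 - C1 shrinks toward |u1 - u2| as the correlation rho of the systematic errors approaches unity.

```python
import numpy as np

# Illustrative standard uncertainties of the two absolute results used to
# form an increment (hypothetical values).
u1, u2 = 0.010, 0.012

def increment_uncertainty(u1, u2, rho):
    """Std. uncertainty of the increment delta = C2 - C1 when the
    systematic errors in C1 and C2 have correlation coefficient rho."""
    return np.sqrt(u1**2 + u2**2 - 2.0 * rho * u1 * u2)

for rho in (0.0, 0.5, 0.9, 0.99):
    print(f"rho = {rho:4.2f} -> u_increment = {increment_uncertainty(u1, u2, rho):.4f}")
```

At rho = 0 the increment is noisier than either input; near rho = 1 it is far cleaner, which is why demonstrating rho close to unity keeps the overall database uncertainty from being overinflated.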

  16. Conundrums with uncertainty factors.

    PubMed

    Cooke, Roger

    2010-03-01

    The practice of uncertainty factors as applied to noncancer endpoints in the IRIS database harkens back to traditional safety factors. In the era before risk quantification, these were used to build in a "margin of safety." As risk quantification takes hold, the safety factor methods yield to quantitative risk calculations to guarantee safety. Many authors believe that uncertainty factors can be given a probabilistic interpretation as ratios of response rates, and that the reference values computed according to the IRIS methodology can thus be converted to random variables whose distributions can be computed with Monte Carlo methods, based on the distributions of the uncertainty factors. Recent proposals from the National Research Council echo this view. Based on probabilistic arguments, several authors claim that the current practice of uncertainty factors is overprotective. When interpreted probabilistically, uncertainty factors entail very strong assumptions on the underlying response rates. For example, the factor for extrapolating from animal to human is the same whether the dosage is chronic or subchronic. Together with independence assumptions, these assumptions entail that the covariance matrix of the logged response rates is singular. In other words, the accumulated assumptions entail a log-linear dependence between the response rates. This in turn means that any uncertainty analysis based on these assumptions is ill-conditioned; it effectively computes uncertainty conditional on a set of zero probability. The practice of uncertainty factors is due for a thorough review. Two directions are briefly sketched, one based on standard regression models, and one based on nonparametric continuous Bayesian belief nets.
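
A minimal sketch of the probabilistic reinterpretation criticized above, under the common (and, per the abstract, questionable) assumptions of independent lognormal factors, with the traditional value of 10 treated as a 95th percentile; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Three uncertainty factors (e.g. animal-to-human, inter-individual,
# subchronic-to-chronic), each treated as a lognormal ratio with median 1
# and a sigma chosen so that the 95th percentile equals the traditional
# factor of 10 (illustrative assumption).
trad_p95 = 10.0
sigma = np.log(trad_p95) / 1.645     # 1.645 = z-score of the 95th percentile

factors = np.exp(rng.normal(0.0, sigma, (N, 3)))
combined = factors.prod(axis=1)      # product of independent factors

# The traditional composite factor would be 10 * 10 * 10 = 1000; the Monte
# Carlo 95th percentile of the product is far smaller, which underlies the
# "overprotective" claim discussed above.
p95_combined = np.percentile(combined, 95)
print(f"P95 of combined factor: {p95_combined:.0f}")
```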

  17. Uncertainty in QSAR predictions.

    PubMed

    Sahlin, Ullrika

    2013-03-01

    It is relevant to consider uncertainty in individual predictions when quantitative structure-activity (or property) relationships (QSARs) are used to support decisions of high societal concern. Successful communication of uncertainty in the integration of QSARs in chemical safety assessment under the EU Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) system can be facilitated by a common understanding of how to define, characterise, assess and evaluate uncertainty in QSAR predictions. A QSAR prediction is, compared to experimental estimates, subject to added uncertainty that comes from the use of a model instead of empirically-based estimates. A framework is provided to aid the distinction between different types of uncertainty in a QSAR prediction: quantitative, i.e. for regressions related to the error in a prediction and characterised by a predictive distribution; and qualitative, by expressing our confidence in the model for predicting a particular compound based on a quantitative measure of predictive reliability. It is possible to assess a quantitative (i.e. probabilistic) predictive distribution, given the supervised learning algorithm, the underlying QSAR data, a probability model for uncertainty and a statistical principle for inference. The integration of QSARs into risk assessment may be facilitated by the inclusion of the assessment of predictive error and predictive reliability into the "unambiguous algorithm", as outlined in the second OECD principle.

  18. Military veterans with mental health problems: a protocol for a systematic review to identify whether they have an additional risk of contact with criminal justice systems compared with other veterans groups

    PubMed Central

    2012-01-01

    Background There is concern that some veterans of armed forces, in particular those with mental health, drug or alcohol problems, experience difficulty returning to a civilian way of life and may subsequently come into contact with criminal justice services and imprisonment. The aim of this review is to examine whether military veterans with mental health problems, including substance use, have an additional risk of contact with criminal justice systems when compared with veterans who do not have such problems. The review will also seek to identify veterans’ views and experiences on their contact with criminal justice services, what contributed to or influenced their contact and whether there are any differences, including international and temporal, in incidence, contact type, veteran type, their presenting health needs and reported experiences. Methods/design In this review we will adopt a methodological model similar to that previously used by other researchers when reviewing intervention studies. The model, which we will use as a framework for conducting a review of observational and qualitative studies, consists of two parallel synthesis stages within the review process; one for quantitative research and the other for qualitative research. The third stage involves a cross-study synthesis, enabling a deeper understanding of the results of the quantitative synthesis. A range of electronic databases, including MEDLINE, PsycINFO, CINAHL, will be systematically searched, from 1939 to the present day, using a broad range of search terms that cover four key concepts: mental health, military veterans, substance misuse, and criminal justice. Studies will be screened against topic-specific inclusion/exclusion criteria and then against a smaller subset of design-specific inclusion/exclusion criteria. Data will be extracted for those studies that meet the inclusion criteria, and all eligible studies will be critically appraised. Included studies, both quantitative and

  19. Hard Constraints in Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2008-01-01

    This paper proposes a methodology for the analysis and design of systems subject to parametric uncertainty where design requirements are specified via hard inequality constraints. Hard constraints are those that must be satisfied for all parameter realizations within a given uncertainty model. Uncertainty models given by norm-bounded perturbations from a nominal parameter value, i.e., hyper-spheres, and by sets of independently bounded uncertain variables, i.e., hyper-rectangles, are the focus of this paper. These models, which are also quite practical, allow for a rigorous mathematical treatment within the proposed framework. Hard constraint feasibility is determined by sizing the largest uncertainty set for which the design requirements are satisfied. Analytically verifiable assessments of robustness are attained by comparing this set with the actual uncertainty model. Strategies that enable the comparison of the robustness characteristics of competing design alternatives, the description and approximation of the robust design space, and the systematic search for designs with improved robustness are also proposed. Since the problem formulation is generic and the tools derived only require standard optimization algorithms for their implementation, this methodology is applicable to a broad range of engineering problems.

  20. Transportable Optical Lattice Clock with 7 × 10⁻¹⁷ Uncertainty

    NASA Astrophysics Data System (ADS)

    Koller, S. B.; Grotti, J.; Vogt, St.; Al-Masoudi, A.; Dörscher, S.; Häfner, S.; Sterr, U.; Lisdat, Ch.

    2017-02-01

    We present a transportable optical clock (TOC) based on ⁸⁷Sr. Its complete characterization against a stationary lattice clock resulted in a systematic uncertainty of 7.4 × 10⁻¹⁷, which is currently limited by the statistics of the determination of the residual lattice light shift, and an instability of 1.3 × 10⁻¹⁵/√τ with an averaging time τ in seconds. Measurements confirm that the systematic uncertainty can be reduced to below the design goal of 1 × 10⁻¹⁷. To our knowledge, these are the best uncertainties and instabilities reported for any transportable clock to date. For autonomous operation, the TOC has been installed in an air-conditioned car trailer. It is suitable for chronometric leveling with submeter resolution as well as for intercontinental cross-linking of optical clocks, which is essential for a redefinition of the International System of Units (SI) second. In addition, the TOC will be used for high precision experiments for fundamental science that are commonly tied to precise frequency measurements, and its development is an important step toward space-borne optical clocks.

  1. Multi-thresholds for fault isolation in the presence of uncertainties.

    PubMed

    Touati, Youcef; Mellal, Mohamed Arezki; Benazzouz, Djamel

    2016-05-01

    Fault monitoring is an important task in mechatronics. It involves the detection and isolation of faults, which are performed using residuals. These residuals are numerical values that define intervals called thresholds: a fault is detected when the residuals exceed their thresholds. In addition, each considered fault must activate a unique set of residuals in order to be isolated. However, in the presence of uncertainties, false decisions can occur due to the low sensitivity of certain residuals to faults. In this paper, an efficient approach for making fault-isolation decisions in the presence of uncertainties is proposed. Based on the bond graph tool, the approach systematically generates the relations between residuals and faults. The generated relations allow the estimation of the minimum detectable and isolable fault values, which are then used to calculate the isolation thresholds for each residual.
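
Threshold-based isolation via residual signatures can be sketched as follows; the signature matrix and threshold values are hypothetical, and the bond-graph machinery that generates them in the paper is not reproduced here:

```python
# Each fault is assumed to activate a unique pattern of residuals
# (1 = residual exceeds its threshold). Hypothetical signature matrix:
SIGNATURES = {
    "fault_1": (1, 0, 1),
    "fault_2": (0, 1, 1),
    "fault_3": (1, 1, 0),
}

def isolate(residuals, thresholds):
    """Return the fault whose signature matches the set of residuals that
    exceed their thresholds, or None if there is no unique match."""
    observed = tuple(int(abs(r) > t) for r, t in zip(residuals, thresholds))
    matches = [f for f, sig in SIGNATURES.items() if sig == observed]
    return matches[0] if len(matches) == 1 else None

# Residuals 1 and 3 exceed their thresholds -> signature (1, 0, 1).
print(isolate([0.8, 0.1, 0.6], [0.5, 0.5, 0.5]))  # -> fault_1
```

Low residual sensitivity shrinks the margin between a residual and its threshold, which is exactly when the observed pattern can fail to match any signature and a false decision becomes possible.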

  2. Quantifying Uncertainty in Epidemiological Models

    SciTech Connect

    Ramanathan, Arvind; Jha, Sumit Kumar

    2012-01-01

    Modern epidemiology has made use of a number of mathematical models, including ordinary differential equation (ODE) based models and agent based models (ABMs) to describe the dynamics of how a disease may spread within a population and enable the rational design of strategies for intervention that effectively contain the spread of the disease. Although such predictions are of fundamental importance in preventing the next global pandemic, there is a significant gap in trusting the outcomes/predictions solely based on such models. Hence, there is a need to develop approaches such that mathematical models can be calibrated against historical data. In addition, there is a need to develop rigorous uncertainty quantification approaches that can provide insights into when a model will fail and characterize the confidence in the (possibly multiple) model outcomes/predictions, when such retrospective analysis cannot be performed. In this paper, we outline an approach to develop uncertainty quantification approaches for epidemiological models using formal methods and model checking. By specifying the outcomes expected from a model in a suitable spatio-temporal logic, we use probabilistic model checking methods to quantify the probability with which the epidemiological model satisfies the specification. We argue that statistical model checking methods can solve the uncertainty quantification problem for complex epidemiological models.
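
A toy version of the idea, with statistical model checking reduced to its simplest form: sample the uncertain parameters, run the model, and estimate the probability that a stated property holds. The SIR parameter ranges and the 30% peak-prevalence property are illustrative, and a real model checker would evaluate a spatio-temporal logic formula rather than a scalar predicate:

```python
import numpy as np

rng = np.random.default_rng(7)

def sir_peak_infected(beta, gamma, i0=0.01, dt=0.2, T=150.0):
    """Forward-Euler integration of the SIR ODEs; returns peak prevalence."""
    s, i = 1.0 - i0, i0
    peak = i
    for _ in range(int(T / dt)):
        ds = -beta * s * i
        di = beta * s * i - gamma * i
        s += ds * dt
        i += di * dt
        peak = max(peak, i)
    return peak

# Sample the uncertain parameters and estimate the probability that the
# property "peak prevalence stays below 30%" is satisfied.
n = 500
betas = rng.uniform(0.2, 0.5, n)
gammas = rng.uniform(0.1, 0.2, n)
satisfied = sum(sir_peak_infected(b, g) < 0.30 for b, g in zip(betas, gammas))
print(f"P(peak < 30%) ~= {satisfied / n:.2f}")
```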

  3. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  4. [Dealing with diagnostic uncertainty in general practice].

    PubMed

    Wübken, Magdalena; Oswald, Jana; Schneider, Antonius

    2013-01-01

    In general, the prevalence of diseases is low in primary care. Therefore, the positive predictive value of diagnostic tests is lower than in hospitals, where patients are highly selected. In addition, patients present with milder forms of disease, and many diseases might hide behind the initial symptom(s). These facts lead to a diagnostic uncertainty that is somewhat inherent to general practice. This narrative review discusses different sources of and reasons for uncertainty, and strategies for dealing with it, in the context of the current literature. Fear of uncertainty correlates with higher diagnostic activity. The attitude towards uncertainty correlates with the choice of medical speciality by vocational trainees or medical students. An intolerance of uncertainty, which still increases as medicine makes steady progress, might partly explain the growing shortage of general practitioners. The bio-psycho-social context appears to be important to diagnostic decision-making. The effects of intuition and heuristics are investigated by cognitive psychologists. It is still unclear whether these aspects are prone to bias or useful, which might depend on the context of medical decisions. Good communication is of great importance to share uncertainty with patients in a transparent way and to facilitate shared decision-making. Dealing with uncertainty should be seen as an important core component of general practice and needs to be investigated in more detail to improve the respective medical decisions.

  5. Quantum preparation uncertainty and lack of information

    NASA Astrophysics Data System (ADS)

    Rozpędek, Filip; Kaniewski, Jędrzej; Coles, Patrick J.; Wehner, Stephanie

    2017-02-01

    The quantum uncertainty principle famously predicts that there exist measurements that are inherently incompatible, in the sense that their outcomes cannot be predicted simultaneously. In contrast, no such uncertainty exists in the classical domain, where all uncertainty results from ignorance about the exact state of the physical system. Here, we critically examine the concept of preparation uncertainty and ask whether, similarly, in the quantum regime some of the uncertainty that we observe can also be understood as a lack of information (LOI), albeit a lack of quantum information. We answer this question affirmatively by showing that for the well-known measurements employed in BB84 quantum key distribution (Bennett and Brassard 1984 Int. Conf. on Computer System and Signal Processing), the amount of uncertainty can indeed be related to the amount of available information about additional registers determining the choice of the measurement. We proceed to show that for other measurements, too, the amount of uncertainty is in part connected to a LOI. Finally, we discuss the conceptual implications of our observation for the security of cryptographic protocols that make use of BB84 states.

  6. Estimating uncertainty of inference for validation

    SciTech Connect

    Booker, Jane M; Langenbrunner, James R; Hemez, Francois M; Ross, Timothy J

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10¹³-10¹⁴ neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  7. Interpreting uncertainty terms.

    PubMed

    Holtgraves, Thomas

    2014-08-01

    Uncertainty terms (e.g., some, possible, good, etc.) are words that do not have a fixed referent and hence are relatively ambiguous. A model is proposed that specifies how, from the hearer's perspective, recognition of facework as a potential motive for the use of an uncertainty term results in a calibration of the intended meaning of that term. Four experiments are reported that examine the impact of face threat, and the variables that affect it (e.g., power), on the manner in which a variety of uncertainty terms (probability terms, quantifiers, frequency terms, etc.) are interpreted. Overall, the results demonstrate that increased face threat in a situation will result in a more negative interpretation of an utterance containing an uncertainty term. That the interpretation of so many different types of uncertainty terms is affected in the same way suggests the operation of a fundamental principle of language use, one with important implications for the communication of risk, subjective experience, and so on.

  8. Uncertainty quantification in lattice QCD calculations for nuclear physics

    SciTech Connect

    Beane, Silas R.; Detmold, William; Orginos, Kostas; Savage, Martin J.

    2015-02-05

    The numerical technique of Lattice QCD holds the promise of connecting the nuclear forces, nuclei, the spectrum and structure of hadrons, and the properties of matter under extreme conditions with the underlying theory of the strong interactions, quantum chromodynamics. A distinguishing, and thus far unique, feature of this formulation is that all of the associated uncertainties, both statistical and systematic, can, in principle, be systematically reduced to any desired precision with sufficient computational and human resources. We review the sources of uncertainty inherent in Lattice QCD calculations for nuclear physics and discuss how each is quantified in current efforts.

  9. Quantifying reliability uncertainty : a proof of concept.

    SciTech Connect

    Diegert, Kathleen V.; Dvorack, Michael A.; Ringland, James T.; Mundt, Michael Joseph; Huzurbazar, Aparna; Lorio, John F.; Fatherley, Quinn; Anderson-Cook, Christine; Wilson, Alyson G.; Zurn, Rena M.

    2009-10-01

    This paper develops Classical and Bayesian methods for quantifying the uncertainty in reliability for a system of mixed series and parallel components for which both go/no-go and variables data are available. Classical methods focus on uncertainty due to sampling error. Bayesian methods can explore both sampling error and other knowledge-based uncertainties. To date, the reliability community has focused on qualitative statements about uncertainty because there was no consensus on how to quantify them. This paper provides a proof of concept that workable, meaningful quantification methods can be constructed. In addition, the application of the methods demonstrated that the results from the two fundamentally different approaches can be quite comparable. In both approaches, results are sensitive to the details of how one handles components for which no failures have been seen in relatively few tests.
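
The Bayesian half of such an analysis can be sketched with conjugate Beta posteriors for go/no-go data; the component counts and the uniform priors below are illustrative choices, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)

# Go/no-go test data for two components in series (hypothetical counts):
# (successes, trials). Component B has seen no failures in relatively few
# tests, the sensitive case noted in the abstract.
data = {"A": (48, 50), "B": (20, 20)}

# Independent Beta(1, 1) priors give Beta posteriors; a Monte Carlo sample
# of system reliability for the series configuration is R = R_A * R_B.
n = 100_000
samples = np.ones(n)
for succ, trials in data.values():
    samples *= rng.beta(1 + succ, 1 + (trials - succ), n)

lo, hi = np.percentile(samples, [5, 95])
print(f"System reliability: median {np.median(samples):.3f}, "
      f"90% interval [{lo:.3f}, {hi:.3f}]")
```

The width of the interval, not just the point estimate, is the quantified reliability uncertainty; changing component B's prior shows directly how zero-failure components drive the result.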

  10. Optimising uncertainty in physical sample preparation.

    PubMed

    Lyn, Jennifer A; Ramsey, Michael H; Damant, Andrew P; Wood, Roger

    2005-11-01

    Uncertainty associated with the result of a measurement can be dominated by the physical sample preparation stage of the measurement process. In view of this, the Optimised Uncertainty (OU) methodology has been further developed to allow the optimisation of the uncertainty from this source, in addition to that from the primary sampling and the subsequent chemical analysis. This new methodology for the optimisation of physical sample preparation uncertainty (u(prep), estimated as s(prep)) is applied, for the first time, to a case study of myclobutanil in retail strawberries. An increase in expenditure (+7865%) on the preparatory process was advised in order to reduce s(prep) by the recommended 69%. This reduction is desirable given the predicted overall saving, under optimised conditions, of 33,000 pounds sterling per batch. This new methodology has been shown to provide guidance on the appropriate distribution of resources between the three principal stages of a measurement process, including physical sample preparation.
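
The trade-off the OU methodology formalizes can be illustrated with a toy cost model (coefficients and functional forms are hypothetical, not the strawberry case-study values): measurement cost is taken to grow as 1/u² while the expected loss from incorrect batch decisions grows as u², which gives a closed-form optimum.

```python
import numpy as np

# Toy cost model: C(u) = a / u**2 + b * u**2, where u is the standard
# uncertainty, a scales the measurement cost, and b scales the expected
# financial loss from wrong decisions (both coefficients hypothetical).
# Minimizing analytically gives u_opt = (a / b) ** 0.25.
a, b = 4.0, 100.0
u = np.linspace(0.05, 2.0, 2000)
cost = a / u**2 + b * u**2

u_opt_numeric = u[np.argmin(cost)]
u_opt_analytic = (a / b) ** 0.25
print(f"optimal uncertainty: {u_opt_numeric:.3f} (analytic {u_opt_analytic:.3f})")
```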

  11. Geometric formulation of the uncertainty principle

    NASA Astrophysics Data System (ADS)

    Bosyk, G. M.; Osán, T. M.; Lamberti, P. W.; Portesi, M.

    2014-03-01

    A geometric approach to formulate the uncertainty principle between quantum observables acting on an N-dimensional Hilbert space is proposed. We consider the fidelity between a density operator associated with a quantum system and a projector associated with an observable, and interpret it as the probability of obtaining the outcome corresponding to that projector. We make use of fidelity-based metrics such as angle, Bures, and root infidelity to propose a measure of uncertainty. The triangle inequality allows us to derive a family of uncertainty relations. In the case of the angle metric, we recover the Landau-Pollak inequality for pure states and show, in a natural way, how to extend it to the case of mixed states in arbitrary dimension. In addition, we derive and compare alternative uncertainty relations when using other known fidelity-based metrics.

  12. Measurement uncertainty relations

    SciTech Connect

    Busch, Paul; Lahti, Pekka; Werner, Reinhard F.

    2014-04-15

    Measurement uncertainty relations are quantitative bounds on the errors in an approximate joint measurement of two observables. They can be seen as a generalization of the error/disturbance tradeoff first discussed heuristically by Heisenberg. Here we prove such relations for the case of two canonically conjugate observables like position and momentum, and establish a close connection with the more familiar preparation uncertainty relations constraining the sharpness of the distributions of the two observables in the same state. Both sets of relations are generalized to means of order α rather than the usual quadratic means, and we show that the optimal constants are the same for preparation and for measurement uncertainty. The constants are determined numerically and compared with some bounds in the literature. In both cases, the near-saturation of the inequalities entails that the state (resp. observable) is uniformly close to a minimizing one.
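
As we recall the standard forms (a sketch, not copied from the paper), the preparation relation for canonically conjugate observables and its measurement counterpart with quadratic error means share the same optimal constant:

```latex
% Preparation uncertainty: spreads of Q and P in a single state \rho
\sigma_\rho(Q)\,\sigma_\rho(P) \;\ge\; \frac{\hbar}{2}

% Measurement uncertainty: errors \Delta(Q), \Delta(P) of an approximate
% joint measurement, with the same optimal constant for quadratic means
\Delta(Q)\,\Delta(P) \;\ge\; \frac{\hbar}{2}
```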

  13. Damage-expectancy uncertainties for deeply buried targets. Technical report 1 Jan-10 Jul 89

    SciTech Connect

    Wright, S.C.

    1991-07-01

    This targeting uncertainties methodology quantifies the effect of systematic errors on the targeting prediction measure, damage expectancy (DE). The methodology, developed specifically to support the SEPW Phase 2 Study, evaluates and ranks the effect of uncertainties for applications involving a single buried target and a single earth penetrating weapon (EPW). This methodology models the penetration of the EPW, the ground shock environment, and the probability of damage to the underground target. Potential error sources treated in the methodology are impact velocity, geology, depth of burst, target hardness, ground shock prediction model, ground shock depth-of-effect, weapon aiming accuracy, distance damage prediction model, and target depth, location and size. The methodology makes minimal use of assumed distributions and Monte Carlo techniques; instead it uses three points to describe the inherent dispersion for all but one error source. The values are sorted and used to define a distribution for DE. In addition, an analysis of variance technique is used to calculate the relative contribution of each error source to the uncertainty in DE. The added benefit of ranking the error sources in terms of their contribution to the uncertainty in DE is the insight this provides into deeply buried targeting problems.

  14. Uncertainty and Dimensional Calibrations

    PubMed Central

    Doiron, Ted; Stoup, John

    1997-01-01

    The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules. Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty statement is to inform the reader of how sure the writer is that the answer is in a certain range. This report explains how we have implemented these rules for dimensional calibrations of nine different types of gages: gage blocks, gage wires, ring gages, gage balls, roundness standards, optical flats, indexing tables, angle blocks, and sieves. PMID:27805114

  15. A Certain Uncertainty

    NASA Astrophysics Data System (ADS)

    Silverman, Mark P.

    2014-07-01

    1. Tools of the trade; 2. The 'fundamental problem' of a practical physicist; 3. Mother of all randomness I: the random disintegration of matter; 4. Mother of all randomness II: the random creation of light; 5. A certain uncertainty; 6. Doing the numbers: nuclear physics and the stock market; 7. On target: uncertainties of projectile flight; 8. The guesses of groups; 9. The random flow of energy I: power to the people; 10. The random flow of energy II: warning from the weather underground; Index.

  16. The legacy of uncertainty

    NASA Technical Reports Server (NTRS)

    Brown, Laurie M.

    1993-01-01

    An historical account is given of the circumstances whereby the uncertainty relations were introduced into physics by Heisenberg. The criticisms of QED on measurement-theoretical grounds by Landau and Peierls are then discussed, as well as the response to them by Bohr and Rosenfeld. Finally, some examples are given of how the new freedom to advance radical proposals, in part the result of the revolution brought about by 'uncertainty,' was implemented in dealing with the new phenomena encountered in elementary particle physics in the 1930's.

  17. Uncertainty and Dimensional Calibrations.

    PubMed

    Doiron, Ted; Stoup, John

    1997-01-01

    The calculation of uncertainty for a measurement is an effort to set reasonable bounds for the measurement result according to standardized rules. Since every measurement produces only an estimate of the answer, the primary requisite of an uncertainty statement is to inform the reader of how sure the writer is that the answer is in a certain range. This report explains how we have implemented these rules for dimensional calibrations of nine different types of gages: gage blocks, gage wires, ring gages, gage balls, roundness standards, optical flats, indexing tables, angle blocks, and sieves.

  18. Weighted Uncertainty Relations

    PubMed Central

    Xiao, Yunlong; Jing, Naihuan; Li-Jost, Xianqing; Fei, Shao-Ming

    2016-01-01

    Recently, Maccone and Pati have given two stronger uncertainty relations based on the sum of variances and one of them is nontrivial when the quantum state is not an eigenstate of the sum of the observables. We derive a family of weighted uncertainty relations to provide an optimal lower bound for all situations and remove the restriction on the quantum state. Generalization to multi-observable cases is also given and an optimal lower bound for the weighted sum of the variances is obtained in general quantum situation. PMID:26984295
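
For orientation, the first of the Maccone-Pati relations referred to above can be sketched as follows (reproduced from memory; treat the exact form as indicative). For any state |ψ⟩ and any |ψ⊥⟩ orthogonal to it:

```latex
\sigma_A^2 + \sigma_B^2 \;\ge\; \pm\, i\,\langle [A,B] \rangle
  \;+\; \bigl|\langle \psi |\,(A \pm iB)\,| \psi^{\perp} \rangle\bigr|^2
```

The weighted relations of the paper replace the unweighted sum with λ-dependent combinations of the variances and then optimize the lower bound over λ > 0, which removes the eigenstate restriction.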

  19. Individuals’ Uncertainty about Future Social Security Benefits and Portfolio Choice

    PubMed Central

    Delavande, Adeline

    2013-01-01

    Summary Little is known about the degree to which individuals are uncertain about their future Social Security benefits, how this varies within the U.S. population, and whether this uncertainty influences financial decisions related to retirement planning. To illuminate these issues, we present empirical evidence from the Health and Retirement Study Internet Survey and document systematic variation in respondents’ uncertainty about their future Social Security benefits by individual characteristics. We find that respondents with higher levels of uncertainty about future benefits hold a smaller share of their wealth in stocks. PMID:23914049

  20. Quantifying Uncertainties in Land-Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2013-01-01

    Uncertainties in the retrievals of microwave land-surface emissivities are quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including the Special Sensor Microwave Imager, the Tropical Rainfall Measuring Mission Microwave Imager, and the Advanced Microwave Scanning Radiometer for Earth Observing System, are studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land-surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  1. Quantifying Uncertainties in Land Surface Microwave Emissivity Retrievals

    NASA Technical Reports Server (NTRS)

    Tian, Yudong; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Prigent, Catherine; Norouzi, Hamidreza; Aires, Filipe; Boukabara, Sid-Ahmed; Furuzawa, Fumie A.; Masunaga, Hirohiko

    2012-01-01

    Uncertainties in the retrievals of microwave land surface emissivities were quantified over two types of land surfaces: desert and tropical rainforest. Retrievals from satellite-based microwave imagers, including SSM/I, TMI and AMSR-E, were studied. Our results show that there are considerable differences between the retrievals from different sensors and from different groups over these two land surface types. In addition, the mean emissivity values show different spectral behavior across the frequencies. With the true emissivity assumed largely constant over both of the two sites throughout the study period, the differences are largely attributed to the systematic and random errors in the retrievals. Generally, these retrievals tend to agree better at lower frequencies than at higher ones, with systematic differences ranging 1%-4% (3-12 K) over desert and 1%-7% (3-20 K) over rainforest. The random errors within each retrieval dataset are in the range of 0.5%-2% (2-6 K). In particular, at 85.5/89.0 GHz, there are very large differences between the different retrieval datasets, and within each retrieval dataset itself. Further investigation reveals that these differences are most likely caused by rain/cloud contamination, which can lead to random errors up to 10-17 K under the most severe conditions.

  2. Measurement uncertainty of adsorption testing of desiccant materials

    SciTech Connect

    Bingham, C E; Pesaran, A A

    1988-12-01

    The technique of measurement uncertainty analysis as described in the current ANSI/ASME standard is applied to the testing of desiccant materials in SERI's Sorption Test Facility. This paper estimates the elemental precision and systematic errors in these tests and propagates them separately to obtain the resulting uncertainty of the test parameters, including relative humidity (±0.03) and sorption capacity (±0.002 g/g). Errors generated by instrument calibration, data acquisition, and data reduction are considered. Measurement parameters that would improve the uncertainty of the results are identified. Using the uncertainty in the moisture capacity of a desiccant, the design engineer can estimate the uncertainty in performance of a dehumidifier for desiccant cooling systems with confidence. 6 refs., 2 figs., 8 tabs.
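    The separate propagation of precision and systematic errors described above can be sketched as follows. This is a rough illustration in the spirit of the ANSI/ASME root-sum-square approach; the elemental error values and the coverage factor of 2 are illustrative assumptions, not numbers from the SERI tests.

```python
import math

def rss(values):
    """Root-sum-square combination of elemental error contributions."""
    return math.sqrt(sum(v * v for v in values))

# Elemental errors for a derived result R = f(x1, x2, ...), each already
# multiplied by its sensitivity coefficient dR/dxi (illustrative numbers).
precision_contributions = [0.010, 0.015, 0.008]   # random components
systematic_contributions = [0.020, 0.005]         # bias components

P = rss(precision_contributions)    # combined precision (random) error
B = rss(systematic_contributions)   # combined systematic (bias) error

# One common combination adds the two in quadrature; the coverage factor
# of 2 here stands in for the t-statistic at roughly 95% confidence.
U = rss([B, 2.0 * P])
print(f"precision={P:.4f}, systematic={B:.4f}, combined U={U:.4f}")
```

    Keeping P and B separate until the final combination, as the abstract describes, lets the reader see which class of error dominates and hence which improvements would pay off.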

  3. Uncertainties in Hauser-Feshbach Neutron Capture Calculations for Astrophysics

    SciTech Connect

    Bertolli, M. G.; Kawano, T.; Little, H.

    2014-06-15

    The calculation of neutron capture cross sections in a statistical Hauser-Feshbach method has proved successful in numerous astrophysical applications. Of increasing interest is the uncertainty associated with the calculated Maxwellian averaged cross sections (MACS). Aspects of a statistical model that introduce a large amount of uncertainty are the level density model, the γ-ray strength function parameters, and the placement of E_low, the cut-off energy below which the Hauser-Feshbach method is not applicable. Utilizing the Los Alamos statistical model code CoH3, we investigate the appropriate treatment of these sources of uncertainty via systematics of nuclei in a local region for which experimental or evaluated data are available. In order to show the impact of uncertainty analysis on nuclear data for astrophysical applications, these new uncertainties will be propagated through the nucleosynthesis code NuGrid.

  4. Avoiding climate change uncertainties in Strategic Environmental Assessment

    SciTech Connect

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick

    2013-11-15

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.

  5. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  6. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  7. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Little, M.P.; Muirhead, C.R.; Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harper, F.T.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  8. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.; Grupa, J.B.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  9. Uncertainty in NIST Force Measurements.

    PubMed

    Bartel, Tom

    2005-01-01

    This paper focuses upon the uncertainty of force calibration measurements at the National Institute of Standards and Technology (NIST). The uncertainty of the realization of force for the national deadweight force standards at NIST is discussed, as well as the uncertainties associated with NIST's voltage-ratio measuring instruments and with the characteristics of transducers being calibrated. The combined uncertainty is related to the uncertainty of dissemination for force transfer standards sent to NIST for calibration.

  10. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    SciTech Connect

    Langenbrunner, James R.; Booker, Jane M; Hemez, Francois M; Salazar, Issac F; Ross, Timothy J

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  11. Uncertainties in successive measurements

    NASA Astrophysics Data System (ADS)

    Distler, Jacques; Paban, Sonia

    2013-06-01

    When you measure an observable, A, in quantum mechanics, the state of the system changes. This, in turn, affects the quantum-mechanical uncertainty in some noncommuting observable, B. The standard uncertainty relation puts a lower bound on the uncertainty of B in the initial state. What is relevant for a subsequent measurement of B, however, is the uncertainty of B in the postmeasurement state. We re-examine this problem, both in the case where A has a pure point spectrum and in the case where A has a continuous spectrum. In the latter case, the need to include a finite detector resolution, as part of what it means to measure such an observable, has dramatic implications for the result of successive measurements. Ozawa [Phys. Rev. A 67, 042105 (2003)] proposed an inequality satisfied in the case of successive measurements. Among our results, we show that his inequality is ineffective (can never come close to being saturated). For the cases of interest, we compute a sharper lower bound.
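    The distinction the abstract draws, between the uncertainty of B in the initial state and in the post-measurement state, can be illustrated with a minimal spin-1/2 example. This is an illustration of the general point only, not the paper's calculation.

```python
import numpy as np

# Pauli matrices for the two noncommuting observables A = sigma_z, B = sigma_x.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def variance(op, psi):
    """Var(op) = <op^2> - <op>^2 in the normalized state psi."""
    mean = np.vdot(psi, op @ psi).real
    mean_sq = np.vdot(psi, op @ op @ psi).real
    return mean_sq - mean ** 2

# Initial state: a sigma_x eigenstate, so B = sigma_x is perfectly sharp.
psi0 = np.array([1, 1], dtype=complex) / np.sqrt(2)
print(variance(sx, psi0))   # 0.0: no uncertainty in B before measuring A

# A projective measurement of A = sigma_z yielding +1 collapses the state.
post = np.array([1, 0], dtype=complex)
print(variance(sx, post))   # 1.0: B is maximally uncertain afterwards
```

    The standard relation bounds the first number; what matters for a subsequent measurement of B is the second, which the measurement of A has changed.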

  12. Uncertainties in repository modeling

    SciTech Connect

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  13. Uncertainty in Computational Aerodynamics

    NASA Technical Reports Server (NTRS)

    Luckring, J. M.; Hemsch, M. J.; Morrison, J. H.

    2003-01-01

    An approach is presented to treat computational aerodynamics as a process, subject to the fundamental quality assurance principles of process control and process improvement. We consider several aspects affecting uncertainty for the computational aerodynamic process and present a set of stages to determine the level of management required to meet risk assumptions desired by the customer of the predictions.

  14. Food additives

    MedlinePlus

    ... or natural. Natural food additives include: Herbs or spices to add flavor to foods Vinegar for pickling ... Certain colors improve the appearance of foods. Many spices, as well as natural and man-made flavors, ...

  15. Quantification and Propagation of Nuclear Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Rising, Michael E.

    The use of several uncertainty quantification and propagation methodologies is investigated in the context of the prompt fission neutron spectrum (PFNS) uncertainties and its impact on critical reactor assemblies. First, the first-order, linear Kalman filter is used as a nuclear data evaluation and uncertainty quantification tool combining available PFNS experimental data and a modified version of the Los Alamos (LA) model. The experimental covariance matrices, not generally given in the EXFOR database, are computed using the GMA methodology used by the IAEA to establish more appropriate correlations within each experiment. Then, using systematics relating the LA model parameters across a suite of isotopes, the PFNS for both the uranium and plutonium actinides are evaluated leading to a new evaluation including cross-isotope correlations. Next, an alternative evaluation approach, the unified Monte Carlo (UMC) method, is studied for the evaluation of the PFNS for the n(0.5 MeV)+Pu-239 fission reaction and compared to the Kalman filter. The UMC approach to nuclear data evaluation is implemented in a variety of ways to test convergence toward the Kalman filter results and to determine the nonlinearities present in the LA model. Ultimately, the UMC approach is shown to be comparable to the Kalman filter for a realistic data evaluation of the PFNS and is capable of capturing the nonlinearities present in the LA model. Next, the impact that the PFNS uncertainties have on important critical assemblies is investigated. Using the PFNS covariance matrices in the ENDF/B-VII.1 nuclear data library, the uncertainties of the effective multiplication factor, leakage, and spectral indices of the Lady Godiva and Jezebel critical assemblies are quantified. Using principal component analysis on the PFNS covariance matrices results in needing only 2-3 principal components to retain the PFNS uncertainties. Then, using the polynomial chaos expansion (PCE) on the uncertain output
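    The principal-component truncation of the covariance matrices described above can be sketched as follows. The 5x5 covariance here is synthetic, standing in only as an illustration for the PFNS covariance matrices from ENDF/B-VII.1.

```python
import numpy as np

# Build a synthetic symmetric positive-definite covariance with one dominant mode.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
cov = A @ A.T + 5.0 * np.outer(np.ones(5), np.ones(5))

# Eigendecompose and sort eigenpairs by descending eigenvalue.
vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

# Retain only the top-k principal components: cov_k = V_k Lambda_k V_k^T.
k = 2
cov_k = (vecs[:, :k] * vals[:k]) @ vecs[:, :k].T

retained = vals[:k].sum() / vals.sum()
print(f"variance retained by {k} components: {retained:.1%}")
```

    When a few components capture nearly all the variance, as the abstract reports for the PFNS covariances, downstream propagation (e.g. through a polynomial chaos expansion) only needs those few uncertain dimensions.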

  16. Systematic errors in long baseline oscillation experiments

    SciTech Connect

    Harris, Deborah A.; /Fermilab

    2006-02-01

    This article gives a brief overview of long baseline neutrino experiments and their goals, and then describes the different kinds of systematic errors that are encountered in these experiments. Particular attention is paid to the uncertainties that come about because of imperfect knowledge of neutrino cross sections and more generally how neutrinos interact in nuclei. Near detectors are planned for most of these experiments, and the extent to which certain uncertainties can be reduced by the presence of near detectors is also discussed.

  17. The uncertainty of UTCI due to uncertainties in the determination of radiation fluxes derived from measured and observed meteorological data.

    PubMed

    Weihs, Philipp; Staiger, Henning; Tinz, Birger; Batchvarova, Ekaterina; Rieder, Harald; Vuilleumier, Laurent; Maturilli, Marion; Jendritzky, Gerd

    2012-05-01

    In the present study, we investigate the determination accuracy of the Universal Thermal Climate Index (UTCI). We study especially the UTCI uncertainties due to uncertainties in radiation fluxes, whose impacts on UTCI are evaluated via the mean radiant temperature (Tmrt). We assume "normal conditions", which means that usual meteorological information and data are available but no special additional measurements. First, the uncertainty arising only from the measurement uncertainties of the meteorological data is determined. Here, simulations show that uncertainties between 0.4 and 2 K due to the uncertainty of just one of the meteorological input parameters may be expected. We then analyse the determination accuracy when not all radiation data are available and modelling of the missing data is required. Since radiative transfer models require a lot of information that is usually not available, we concentrate only on the determination accuracy achievable with empirical models. The simulations show that uncertainties in the calculation of the diffuse irradiance may lead to Tmrt uncertainties of up to ±2.9 K. If long-wave radiation is missing, we may expect an uncertainty of ±2 K. If modelling of diffuse radiation and of long-wave radiation is used for the calculation of Tmrt, we may then expect a determination uncertainty of ±3 K. If all radiative fluxes are modelled based on synoptic observation, the uncertainty in Tmrt is ±5.9 K. Because Tmrt is only one of the four input data required in the calculation of UTCI, the uncertainty in UTCI due to the uncertainty in radiation fluxes is less than ±2 K. The UTCI uncertainties due to uncertainties of the four meteorological input values are not larger than the 6 K reference intervals of the UTCI scale, which means that UTCI may be wrong by at most one UTCI scale interval. This uncertainty may, however, be critical at the two temperature extremes, i.e. under extreme hot or extreme cold conditions.

  18. Estimating discharge measurement uncertainty using the interpolated variance estimator

    USGS Publications Warehouse

    Cohn, T.; Kiang, J.; Mason, R.

    2012-01-01

    Methods for quantifying the uncertainty in discharge measurements typically identify various sources of uncertainty and then estimate the uncertainty from each of these sources by applying the results of empirical or laboratory studies. If actual measurement conditions are not consistent with those encountered in the empirical or laboratory studies, these methods may give poor estimates of discharge uncertainty. This paper presents an alternative method for estimating discharge measurement uncertainty that uses statistical techniques and at-site observations. This Interpolated Variance Estimator (IVE) estimates uncertainty based on the data collected during the streamflow measurement and therefore reflects the conditions encountered at the site. The IVE has the additional advantage of capturing all sources of random uncertainty in the velocity and depth measurements. It can be applied to velocity-area discharge measurements that use a velocity meter to measure point velocities at multiple vertical sections in a channel cross section.
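    The velocity-area (midsection) setting the abstract describes can be sketched as follows, together with a crude at-site variance estimate built from interpolated deviations of each vertical's unit discharge. All station data are invented, and the actual IVE estimator in the paper may differ in detail; this only conveys the idea of deriving uncertainty from the measurement's own data rather than from laboratory studies.

```python
import numpy as np

station = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])      # m across the channel
depth   = np.array([0.0, 0.8, 1.2, 1.3, 0.9, 0.0])      # m
vel     = np.array([0.0, 0.35, 0.52, 0.55, 0.40, 0.0])  # m/s point velocities

# Midsection method: each vertical represents half the span to each neighbour.
width = np.empty_like(station)
width[1:-1] = (station[2:] - station[:-2]) / 2.0
width[0] = (station[1] - station[0]) / 2.0
width[-1] = (station[-1] - station[-2]) / 2.0

q = vel * depth * width          # unit discharge per vertical
Q = q.sum()                      # total discharge

# Interpolated-deviation variance (illustrative): compare each interior q_i
# with the value interpolated from its neighbours and pool the squared
# residuals as an at-site estimate of random measurement scatter.
resid = q[1:-1] - (q[:-2] + q[2:]) / 2.0
var_Q = np.sum(resid ** 2) / 2.0

print(f"Q = {Q:.3f} m^3/s, approx s.d. = {np.sqrt(var_Q):.3f} m^3/s")
```

    Because the residuals come from the measured verticals themselves, the estimate reflects the conditions actually encountered at the site, which is the key advantage the abstract claims over laboratory-derived error budgets.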

  19. Sources of uncertainty in flood inundation maps

    USGS Publications Warehouse

    Bales, J.D.; Wagner, C.R.

    2009-01-01

    Flood inundation maps typically have been used to depict inundated areas for floods having specific exceedance levels. The uncertainty associated with the inundation boundaries is seldom quantified, in part, because all of the sources of uncertainty are not recognized and because data available to quantify uncertainty seldom are available. Sources of uncertainty discussed in this paper include hydrologic data used for hydraulic model development and validation, topographic data, and the hydraulic model. The assumption of steady flow, which typically is made to produce inundation maps, has less of an effect on predicted inundation at lower flows than for higher flows because more time typically is required to inundate areas at high flows than at low flows. Difficulties with establishing reasonable cross sections that do not intersect and that represent water-surface slopes in tributaries contribute additional uncertainties in the hydraulic modelling. As a result, uncertainty in the flood inundation polygons simulated with a one-dimensional model increases with distance from the main channel.

  20. Declarative representation of uncertainty in mathematical models.

    PubMed

    Miller, Andrew K; Britten, Randall D; Nielsen, Poul M F

    2012-01-01

    An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML have provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.

  1. Uncertainty Modeling for Structural Control Analysis and Synthesis

    NASA Technical Reports Server (NTRS)

    Campbell, Mark E.; Crawley, Edward F.

    1996-01-01

    The development of an accurate model of uncertainties for the control of structures that undergo a change in operational environment, based solely on modeling and experimentation in the original environment is studied. The application used throughout this work is the development of an on-orbit uncertainty model based on ground modeling and experimentation. A ground based uncertainty model consisting of mean errors and bounds on critical structural parameters is developed. The uncertainty model is created using multiple data sets to observe all relevant uncertainties in the system. The Discrete Extended Kalman Filter is used as an identification/parameter estimation method for each data set, in addition to providing a covariance matrix which aids in the development of the uncertainty model. Once ground based modal uncertainties have been developed, they are localized to specific degrees of freedom in the form of mass and stiffness uncertainties. Two techniques are presented: a matrix method which develops the mass and stiffness uncertainties in a mathematical manner; and a sensitivity method which assumes a form for the mass and stiffness uncertainties in macroelements and scaling factors. This form allows the derivation of mass and stiffness uncertainties in a more physical manner. The mass and stiffness uncertainties of the ground based system are then mapped onto the on-orbit system, and projected to create an analogous on-orbit uncertainty model in the form of mean errors and bounds on critical parameters. The Middeck Active Control Experiment is introduced as experimental verification for the localization and projection methods developed. In addition, closed loop results from on-orbit operations of the experiment verify the use of the uncertainty model for control analysis and synthesis in space.

  2. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    NASA Astrophysics Data System (ADS)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
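    A quantile-based deviation metric in the spirit of the QFD described above can be sketched as follows. The paper's exact definition is not given in the abstract, so the form below (mean absolute deviation of scenario quantiles from a reference) is an assumption, and all flow series are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 5.0, size=1000)                 # reference flow series

# Alternative scenarios standing in for different model structures,
# forcing data, or parameter sets (purely illustrative perturbations).
scenarios = [obs * f + rng.normal(0.0, 1.0, obs.size)
             for f in (0.9, 1.0, 1.1, 1.25)]

quantiles = np.linspace(0.05, 0.95, 19)
obs_q = np.quantile(obs, quantiles)
sim_q = np.array([np.quantile(s, quantiles) for s in scenarios])

# Deviation at each quantile: spread of scenario flows around the reference,
# so uncertainty can be examined as a function of flow percentile.
qfd = np.abs(sim_q - obs_q).mean(axis=0)

for p, d in zip(quantiles[::6], qfd[::6]):
    print(f"quantile {p:.2f}: mean |deviation| = {d:.2f}")
```

    Grouping the scenarios by which component was varied (structure, inputs, parameters) would then attribute the deviation at each quantile to its source, which is the disaggregation the abstract describes.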

  3. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    SciTech Connect

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    additional administrative margin to account for gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.

  4. Incorporating climate change into systematic conservation planning

    USGS Publications Warehouse

    Groves, Craig R.; Game, Edward T.; Anderson, Mark G.; Cross, Molly; Enquist, Carolyn; Ferdana, Zach; Girvetz, Evan; Gondor, Anne; Hall, Kimberly R.; Higgins, Jonathan; Marshall, Rob; Popper, Ken; Schill, Steve; Shafer, Sarah L.

    2012-01-01

    The principles of systematic conservation planning are now widely used by governments and non-government organizations alike to develop biodiversity conservation plans for countries, states, regions, and ecoregions. Many of the species and ecosystems these plans were designed to conserve are now being affected by climate change, and there is a critical need to incorporate new and complementary approaches into these plans that will aid species and ecosystems in adjusting to potential climate change impacts. We propose five approaches to climate change adaptation that can be integrated into existing or new biodiversity conservation plans: (1) conserving the geophysical stage, (2) protecting climatic refugia, (3) enhancing regional connectivity, (4) sustaining ecosystem process and function, and (5) capitalizing on opportunities emerging in response to climate change. We discuss both key assumptions behind each approach and the trade-offs involved in using the approach for conservation planning. We also summarize additional data beyond those typically used in systematic conservation plans required to implement these approaches. A major strength of these approaches is that they are largely robust to the uncertainty in how climate impacts may manifest in any given region.

  5. Radar stage uncertainty

    USGS Publications Warehouse

    Fulford, J.M.; Davies, W.J.

    2005-01-01

    The U.S. Geological Survey is investigating the performance of radars used for stage (or water-level) measurement. This paper presents a comparison of estimated uncertainties and data for radar water-level measurements with float, bubbler, and wire weight water-level measurements. The radar sensor was also temperature-tested in a laboratory. The uncertainty estimates indicate that radar measurements are more accurate than uncorrected pressure sensors at higher water stages, but are less accurate than pressure sensors at low stages. Field data at two sites indicate that radar sensors may have a small negative bias. Comparison of field radar measurements with wire weight measurements found that the radar tends to measure slightly lower values as stage increases. Copyright ASCE 2005.

  6. How Uncertain is Uncertainty?

    NASA Astrophysics Data System (ADS)

    Vámos, Tibor

    The gist of the paper is the fundamentally uncertain nature of all kinds of uncertainty and, consequently, a critical epistemic review of historical and recent approaches, computational methods, and algorithms. The review follows the development of the notion from the beginnings of thinking, via the Aristotelian and Skeptic views, medieval nominalism, and the influential pioneering metaphors of ancient India and Persia, to the birth of modern mathematical disciplinary reasoning. Discussing the models of uncertainty, e.g. their statistical, other physical, and psychological background, we reach a pragmatic, model-related estimation perspective: a balanced application orientation for different problem areas. Data mining, game theories, and recent advances in approximation algorithms are discussed in this spirit of modest reasoning.

  7. Uncertainties in transpiration estimates.

    PubMed

    Coenders-Gerrits, A M J; van der Ent, R J; Bogaard, T A; Wang-Erlandsson, L; Hrachowitz, M; Savenije, H H G

    2014-02-13

    arising from S. Jasechko et al. Nature 496, 347-350 (2013); doi:10.1038/nature11983. How best to assess the respective importance of plant transpiration over evaporation from open waters, soils and short-term storage such as tree canopies and understories (interception) has long been debated. On the basis of data from lake catchments, Jasechko et al. conclude that transpiration accounts for 80-90% of total land evaporation globally (Fig. 1a). However, another choice of input data, together with more conservative accounting of the related uncertainties, reduces and widens the transpiration ratio estimate to 35-80%. Hence, climate models do not necessarily conflict with observations, but more measurements on the catchment scale are needed to reduce the uncertainty range. There is a Reply to this Brief Communications Arising by Jasechko, S. et al. Nature 506, http://dx.doi.org/10.1038/nature12926 (2014).

  8. Aggregating and Communicating Uncertainty.

    DTIC Science & Technology

    1980-04-01

    means for identifying and communicating uncertainty. Appendix A, Bibliography: 1. Ajzen, Icek; "Intuitive Theories of Events and the Effects..." "...to the criterion while disregarding valid but noncausal information." (Icek Ajzen, "Intuitive Theories of Events and the Effects of Base-Rate Information on Prediction...")

  9. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2°C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  10. Variants of Uncertainty

    DTIC Science & Technology

    1981-05-15

    Variants of Uncertainty. Daniel Kahneman, University of British Columbia; Amos Tversky, Stanford University. May 15, 1981. "...(Dennett, 1979) in which different parts have access to different data, assign them different weights and hold different views of the situation..." References: The Probable and the Provable. Oxford: Clarendon Press, 1977. Dennett, D.C. Brainstorms. Hassocks: Harvester, 1979. Donchin, E., Ritter, W. & McCallum, W.C.

  11. Calibration Under Uncertainty.

    SciTech Connect

    Swiler, Laura Painton; Trucano, Timothy Guy

    2005-03-01

    This report is a white paper summarizing the literature and different approaches to the problem of calibrating computer model parameters in the face of model uncertainty. Model calibration is often formulated as finding the parameters that minimize the squared difference between the model-computed data (the predicted data) and the actual experimental data. This approach does not allow for explicit treatment of uncertainty or error in the model itself: the model is considered the "true" deterministic representation of reality. While this approach does have utility, it is far from an accurate mathematical treatment of the true model calibration problem in which both the computed data and experimental data have error bars. This year, we examined methods to perform calibration accounting for the error in both the computer model and the data, as well as improving our understanding of its meaning for model predictability. We call this approach Calibration under Uncertainty (CUU). This talk presents our current thinking on CUU. We outline some current approaches in the literature, and discuss the Bayesian approach to CUU in detail.
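    The classical least-squares formulation that the report critiques can be illustrated with a toy one-parameter linear model; the model, data, and names below are invented for illustration and are not from the report:

```python
# Hedged sketch: calibrate a single parameter theta by minimizing the squared
# misfit between model predictions and "experimental" data, with no explicit
# treatment of model or data error. All numbers are invented.
def model(theta, x):
    return theta * x

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # toy experimental observations

def sse(theta):
    """Sum of squared errors between model and data."""
    return sum((model(theta, x) - y) ** 2 for x, y in zip(xs, ys))

# For this linear-in-theta model, the least-squares minimizer has a closed form:
theta_hat = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
print(round(theta_hat, 4))  # 1.99
```

    A treatment of the kind the report advocates would additionally place error models (or Bayesian priors) on both the simulation output and the observations, rather than treating the model as exact.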

  12. Uncertainty Quantification in Aeroelasticity

    NASA Astrophysics Data System (ADS)

    Beran, Philip; Stanford, Bret; Schrock, Christopher

    2017-01-01

    Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.

  13. Uncertainty quantification in nanomechanical measurements using the atomic force microscope.

    PubMed

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-11

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials.

  14. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $0.023 per pound of aluminum produced is projected for a 200 kA pot.

  15. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  16. Particle Dark Matter constraints: the effect of Galactic uncertainties

    NASA Astrophysics Data System (ADS)

    Benito, Maria; Bernal, Nicolás; Bozorgnia, Nassim; Calore, Francesca; Iocco, Fabio

    2017-02-01

    Collider, space, and Earth based experiments are now able to probe several extensions of the Standard Model of particle physics which provide viable dark matter candidates. Direct and indirect dark matter searches rely on inputs of astrophysical nature, such as the local dark matter density or the shape of the dark matter density profile in the target object. The determination of these quantities is highly affected by astrophysical uncertainties. The latter, especially those for our own Galaxy, are ill-known, and often not fully accounted for when analyzing the phenomenology of particle physics models. In this paper we present a systematic, quantitative estimate of how astrophysical uncertainties on Galactic quantities (such as the local galactocentric distance, circular velocity, or the morphology of the stellar disk and bulge) propagate to the determination of the phenomenology of particle physics models, thus eventually affecting the determination of new physics parameters. We present results in the context of two specific extensions of the Standard Model (the Singlet Scalar and the Inert Doublet) that we adopt as case studies for their simplicity in illustrating the magnitude and impact of such uncertainties on the parameter space of the particle physics model itself. Our findings point toward very relevant effects of current Galactic uncertainties on the determination of particle physics parameters, and urge a systematic estimate of such uncertainties in more complex scenarios, in order to achieve constraints on the determination of new physics that realistically include all known uncertainties.

  17. Pauli effects in uncertainty relations

    NASA Astrophysics Data System (ADS)

    Toranzo, I. V.; Sánchez-Moreno, P.; Esquivel, R. O.; Dehesa, J. S.

    2014-10-01

    In this Letter we analyze the effect of the spin dimensionality of a physical system in two mathematical formulations of the uncertainty principle: a generalized Heisenberg uncertainty relation valid for all antisymmetric N-fermion wavefunctions, and the Fisher-information-based uncertainty relation valid for all antisymmetric N-fermion wavefunctions of central potentials. The accuracy of these spin-modified uncertainty relations is examined for all atoms from Hydrogen to Lawrencium in a self-consistent framework.

  18. Infiltration under snow cover: Modeling approaches and predictive uncertainty

    NASA Astrophysics Data System (ADS)

    Meeks, Jessica; Moeck, Christian; Brunner, Philip; Hunkeler, Daniel

    2017-03-01

    method. Further, our study demonstrated that an uncertainty analysis of model predictions is easily accomplished due to the low computational demand of the models and efficient calibration software and is absolutely worth the additional investment. Lastly, development of a systematic instrumentation that evaluates the distributed, temporal evolution of snowpack drainage is vital for optimal understanding and management of cold-climate hydrologic systems.

  19. Dinosaur Systematics

    NASA Astrophysics Data System (ADS)

    Carpenter, Kenneth; Currie, Philip J.

    1992-07-01

    In recent years dinosaurs have captured the attention of the public at an unprecedented level. At the heart of this resurgence in popular interest is an increased level of research activity, much of which is innovative in the field of paleontology. For instance, whereas earlier paleontological studies emphasized basic morphologic description and taxonomic classification, modern studies attempt to examine the role and nature of dinosaurs as living animals. More than ever before, we understand how these extinct species functioned, behaved, interacted with each other and the environment, and evolved. Nevertheless, these studies rely on certain basic building blocks of knowledge, including facts about dinosaur anatomy and taxonomic relationships. One of the purposes of this volume is to unravel some of the problems surrounding dinosaur systematics and to increase our understanding of dinosaurs as a biological species. Dinosaur Systematics presents a current overview of dinosaur systematics using various examples to explore what is a species in a dinosaur, what separates genders in dinosaurs, what morphological changes occur with maturation of a species, and what morphological variations occur within a species.

  20. Two Evaluation Budgets for the Measurement Uncertainty of Glucose in Clinical Chemistry

    PubMed Central

    Zhang, Ling; Bi, Xiaoyun; Deng, Xiaoling

    2011-01-01

    Background: Measurement uncertainty characterizes the dispersion of the quantity values attributed to a measurand. Although this concept was introduced to medical laboratories some years ago, not all medical researchers are familiar with it. Therefore, the evaluation and expression of measurement uncertainty must be highlighted using a practical example. Methods: In accordance with the procedure for evaluating and expressing uncertainty, provided by the Joint Committee for Guides in Metrology (JCGM), we used plasma glucose (Glu) as an example and defined it as the measurand. We then analyzed the main sources of uncertainty, evaluated each component of uncertainty, and calculated the combined uncertainty and expanded uncertainty with 2 budgets for single measurements and continuous monitoring, respectively. Results: During the measurement of Glu, the main sources of uncertainty included imprecision, within-subject biological variance (BVw), calibrator uncertainty, and systematic bias. We evaluated the uncertainty of each component to be 1.26%, 1.91%, 5.70%, 0.42%, and -2.87% for within-run imprecision, between-day imprecision, BVw, calibrator uncertainty, and systematic bias, respectively. For a single specimen, the expanded uncertainty was 7.38% or 6.1±0.45 mmol/L (k=2); in continuous monitoring of Glu, the expanded uncertainty was 13.58% or 6.1±0.83 mmol/L (k=2). Conclusions: We have demonstrated the overall procedure for evaluating and reporting uncertainty with 2 different budgets. The uncertainty is not only related to the medical laboratory in which the measurement is undertaken, but is also associated with the calibrator uncertainty and the biological variation of the subject. Therefore, it is helpful in explaining the accuracy of test results. PMID:21779190
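    The budget arithmetic quoted above can be reproduced by combining the component uncertainties in quadrature and expanding with coverage factor k = 2. The sketch below treats the bias magnitude as one quadrature term (one common convention; the paper may handle bias differently), and the small differences from the quoted 7.38%/13.58% are rounding:

```python
import math

# Components from the abstract, in percent of the measured value.
within_run = 1.26    # within-run imprecision
between_day = 1.91   # between-day imprecision
bv_within = 5.70     # within-subject biological variance (BVw)
calibrator = 0.42    # calibrator uncertainty
bias = 2.87          # magnitude of the systematic bias

def expanded(components, k=2):
    """Root-sum-of-squares combination, expanded with coverage factor k."""
    return k * math.sqrt(sum(c * c for c in components))

# Single specimen: BVw does not enter (one draw, one occasion).
u_single = expanded([within_run, between_day, calibrator, bias])
# Continuous monitoring: BVw contributes as well.
u_monitor = expanded([within_run, between_day, bv_within, calibrator, bias])

print(f"{u_single:.2f}%")   # 7.39% (abstract: 7.38%)
print(f"{u_monitor:.2f}%")  # 13.59% (abstract: 13.58%)
```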

  1. Long-term changes in net radiation at the Earth's surface: uncertainties and implications

    NASA Astrophysics Data System (ADS)

    Sheffield, Justin; Coccia, Gabriele; Siemann, Amanda; Wood, Eric

    2014-05-01

    Net radiation at the earth's surface plays a key role in terrestrial water, energy and carbon fluxes, but there is large uncertainty in its variation over decadal time scales. Globally, surface and satellite measurements indicate global dimming in solar radiation over many regions since the mid-20th century and then brightening over recent decades due to changes in cloudiness and aerosols. Changes in longwave radiation are driven by long-term increases in greenhouse gases and inter-annual variations in short-lived constituents such as dust and black carbon. These increases are partially offset, however, by increases in surface temperature. Current estimates of these components of the net radiation balance from satellite remote sensing are inconsistent because of inhomogeneities from changes in satellites, sensor calibration, retrieval algorithms, and so on, in addition to systematic biases. Estimates from direct ground observations are hampered by sparse spatial networks and often short-term records, and estimates based on denser networks of meteorological data are affected by errors in empirical radiation models. Some of the largest uncertainties are in the characterization of the global distribution and temporal changes in surface shortwave albedo and infrared emissivity, especially in regions with seasonal and patchy snow cover. This paper presents comparisons of legacy satellite-derived datasets (e.g. ISCCP, GEWEX/SRB) and recently developed datasets based on updated algorithms and homogenized data sources (e.g. NASA Princeton-Measures, HIRS) in the context of long-term changes in the net radiation balance at the earth's surface. We compare these with ground observations and empirical estimates based on meteorological data from in-situ sources and reanalysis. In particular we focus on the uncertainties in the magnitude and variation in surface albedo and emissivity, and their contribution to uncertainties in net radiation. 
We discuss the implications of these

  2. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    NASA Technical Reports Server (NTRS)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

    A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1 sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2 sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
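    The sampling procedure can be sketched as follows; the one-line response function and the uncertainty factors are invented stand-ins for the stratospheric model and its 55 varied reaction rates:

```python
import math
import random

random.seed(0)

def toy_ozone_response(rates):
    # Hypothetical stand-in for the stratospheric model: the "perturbation"
    # here depends on a ratio of two rate constants.
    return rates[0] / rates[1]

nominal = [1.0, 1.0]
sigma_factors = [1.5, 1.8]  # assumed 1-sigma uncertainty *factors* per rate

# 2000 Monte Carlo cases, as in the study: each rate is multiplied by its
# uncertainty factor raised to a standard-normal power (lognormal sampling,
# the usual convention when uncertainties are quoted as factors).
samples = []
for _ in range(2000):
    perturbed = [r * f ** random.gauss(0.0, 1.0)
                 for r, f in zip(nominal, sigma_factors)]
    samples.append(toy_ozone_response(perturbed))

# Summarize the output spread as a 1-sigma multiplicative factor.
logs = [math.log(s) for s in samples]
mean = sum(logs) / len(logs)
sd = math.sqrt(sum((x - mean) ** 2 for x in logs) / (len(logs) - 1))
print(f"1-sigma uncertainty factor on the response: {math.exp(sd):.2f}")
```

    With a real model, the same loop explains the study's asymmetric high-side/low-side factors: the output distribution need not be symmetric even when the inputs are lognormal.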

  3. ENHANCED UNCERTAINTY ANALYSIS FOR SRS COMPOSITE ANALYSIS

    SciTech Connect

    Smith, F.; Phifer, M.

    2011-06-30

    The Composite Analysis (CA) performed for the Savannah River Site (SRS) in 2009 (SRS CA 2009) included a simplified uncertainty analysis. The uncertainty analysis in the CA (Smith et al. 2009b) was limited to considering at most five sources in a separate uncertainty calculation performed for each POA. To perform the uncertainty calculations in a reasonable amount of time, the analysis was limited to 400 realizations and 2,000 years of simulated transport time, and the time steps used for the uncertainty analysis were increased from those used in the CA base case analysis. As part of the CA maintenance plan, the Savannah River National Laboratory (SRNL) committed to improving the CA uncertainty/sensitivity analysis. The previous uncertainty analysis was constrained by the standard GoldSim licensing, which limits the user to running at most four Monte Carlo uncertainty calculations (also called realizations) simultaneously. Some of the limitations on the number of realizations that could be practically run and on the simulation time steps were removed by building a cluster of three HP ProLiant Windows servers with a total of 36 64-bit processors and by licensing the GoldSim DP-Plus distributed processing software. This allowed running as many as 35 realizations simultaneously (one processor is reserved as a master process that controls running the realizations). These enhancements to SRNL computing capabilities made it feasible to perform an uncertainty analysis using 1,000 realizations and the time steps employed in the base case CA calculations, with more sources, while simulating radionuclide transport for 10,000 years. In addition, an importance screening analysis was performed to identify the class of stochastic variables that have the most significant impact on model uncertainty. This analysis ran the uncertainty model separately, testing the response to variations in the following five sets of model parameters: (a) Kd values (72 parameters for the 36 CA elements in

  4. Sources of uncertainty in intuitive physics.

    PubMed

    Smith, Kevin A; Vul, Edward

    2013-01-01

    Recent work suggests that people predict how objects interact in a manner consistent with Newtonian physics, but with additional uncertainty. However, the sources of uncertainty have not been examined. In this study, we measure perceptual noise in initial conditions and stochasticity in the physical model used to make predictions. Participants predicted the trajectory of a moving object through occluded motion and bounces, and we compared their behavior to an ideal observer model. We found that human judgments cannot be captured by simple heuristics and must incorporate noisy dynamics. Moreover, these judgments are biased consistently with a prior expectation on object destinations, suggesting that people use simple expectations about outcomes to compensate for uncertainty about their physical models.

  5. Uncertainties in global ocean surface heat flux climatologies derived from ship observations

    SciTech Connect

    Gleckler, P.J.; Weare, B.C.

    1995-08-01

    A methodology to define uncertainties associated with ocean surface heat flux calculations has been developed and applied to a revised version of the Oberhuber global climatology, which utilizes a summary of the COADS surface observations. Systematic and random uncertainties in the net oceanic heat flux and each of its four components at individual grid points and for zonal averages have been estimated for each calendar month and the annual mean. The most important uncertainties of the 2° x 2° grid cell values of each of the heat fluxes are described. Annual mean net shortwave flux random uncertainties associated with errors in estimating cloud cover in the tropics yield total uncertainties which are greater than 25 W m⁻². In the northern latitudes, where the large number of observations substantially reduces the influence of these random errors, the systematic uncertainties in the utilized parameterization are largely responsible for total uncertainties in the shortwave fluxes which usually remain greater than 10 W m⁻². Systematic uncertainties dominate in the zonal means because spatial averaging has led to a further reduction of the random errors. The situation for the annual mean latent heat flux is somewhat different in that even for grid point values the contributions of the systematic uncertainties tend to be larger than those of the random uncertainties at almost all latitudes. Latent heat flux uncertainties are greater than 20 W m⁻² nearly everywhere south of 40°N, and in excess of 30 W m⁻² over broad areas of the subtropics, even those with large numbers of observations. Resulting zonal mean latent heat flux uncertainties are largest (~30 W m⁻²) in the middle latitudes and subtropics and smallest (~10-25 W m⁻²) near the equator and over the northernmost regions.
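    The contrast described here, random errors averaging down with the number of observations while systematic parameterization errors persist, can be illustrated with a toy quadrature combination (the numbers are illustrative, not taken from the climatology):

```python
import math

def total_uncertainty(u_random, u_systematic, n):
    """Combine a random and a systematic component in quadrature, with the
    random part reduced by averaging over n independent samples; the
    systematic part is unaffected by averaging."""
    return math.hypot(u_random / math.sqrt(n), u_systematic)

# Illustrative values in W/m^2: a single grid cell vs. a zonal average.
print(round(total_uncertainty(25.0, 10.0, 1), 1))    # 26.9: random dominates
print(round(total_uncertainty(25.0, 10.0, 100), 1))  # 10.3: systematic dominates
```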

  6. Picturing Data With Uncertainty

    NASA Technical Reports Server (NTRS)

    Kao, David; Love, Alison; Dungan, Jennifer L.; Pang, Alex

    2004-01-01

    NASA is in the business of creating maps for scientific purposes to represent important biophysical or geophysical quantities over space and time. For example, maps of surface temperature over the globe tell scientists where and when the Earth is heating up; regional maps of the greenness of vegetation tell scientists where and when plants are photosynthesizing. There is always uncertainty associated with each value in any such map due to various factors. When uncertainty is fully modeled, instead of a single value at each map location, there is a distribution expressing a set of possible outcomes at each location. We consider such distribution data as multi-valued data since it consists of a collection of values about a single variable. Thus, multi-valued data represent both the map and its uncertainty. We have been working on ways to visualize spatial multi-valued data sets effectively for fields with regularly spaced units or grid cells such as those in NASA's Earth science applications. A new way to display distributions at multiple grid locations is to project the distributions from an individual row, column or other user-selectable straight transect from the 2D domain. First, at each grid cell in a given slice (row, column or transect), we compute a smooth density estimate from the underlying data. Such a density estimate for the probability density function (PDF) is generally more useful than a histogram, which is a classic density estimate. Then, the collection of PDFs along a given slice are presented vertically above the slice and form a wall. To minimize occlusion of intersecting slices, the corresponding walls are positioned at the far edges of the boundary. The PDF wall depicts the shapes of the distributions very clearly since peaks represent the modes (or bumps) in the PDFs. We've defined roughness as the number of peaks in the distribution. Roughness is another useful piece of summary information for multimodal distributions. The uncertainty of the multi

  7. Satellite altitude determination uncertainties

    NASA Technical Reports Server (NTRS)

    Siry, J. W.

    1972-01-01

    Satellite altitude determination uncertainties are discussed from the standpoint of the GEOS-C satellite and from the longer range viewpoint afforded by the Geopause concept. Attention is focused on methods for short-arc tracking which are essentially geometric in nature. One method uses combinations of lasers and collocated cameras. The other method relies only on lasers, using three or more to obtain the position fix. Two typical locales are examined: the Caribbean area, and a region associated with tracking sites at Goddard, Bermuda and Canada which encompasses a portion of the Gulf Stream in which meanders develop.

  8. Analysis of automated highway system risks and uncertainties. Volume 5

    SciTech Connect

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.
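    One common way to realize the protocol described above is to treat each expert's subjective percentile assessments as points on an inverse CDF, interpolate between them, and sample the result inside the cost/benefit Monte Carlo. The percentile levels and cost values below are invented for illustration:

```python
import bisect
import random

# Hypothetical expert assessment of one key factor (e.g. roadway cost per
# lane-mile, in $k): the 10th, 50th, and 90th subjective percentiles.
percentiles = [0.10, 0.50, 0.90]
values = [1200.0, 2000.0, 3500.0]

def sample_value(u):
    """Inverse-CDF lookup with linear interpolation, clamped at the ends."""
    if u <= percentiles[0]:
        return values[0]
    if u >= percentiles[-1]:
        return values[-1]
    i = bisect.bisect_left(percentiles, u)
    p0, p1 = percentiles[i - 1], percentiles[i]
    v0, v1 = values[i - 1], values[i]
    return v0 + (v1 - v0) * (u - p0) / (p1 - p0)

random.seed(1)
draws = [sample_value(random.random()) for _ in range(10000)]
print(round(sum(draws) / len(draws)))  # mean of the induced distribution
```

    Repeating this for each uncertain factor, and propagating the draws through the cost and capacity models, yields the kind of cost/benefit probability distributions the volume reports for each deployment scenario.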

  9. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2011-09-01

    Verification and validation (V&V) are playing more important roles in quantifying uncertainties and realizing high-fidelity simulations in engineering system analyses, such as transients in a complex nuclear reactor system. Traditional V&V in reactor system analysis focused more on the validation part or did not differentiate verification from validation. The traditional approach to uncertainty quantification is based on a 'black box' approach. The simulation tool is treated as an unknown signal generator: a distribution of inputs according to assumed probability density functions is sent in, and the distribution of the outputs is measured and correlated back to the original input distribution. The 'black box' method mixes numerical errors with all other uncertainties. It is also not efficient for performing sensitivity analysis. Contrary to the 'black box' method, a more efficient sensitivity approach can take advantage of intimate knowledge of the simulation code. In these types of approaches, equations for the propagation of uncertainty are constructed and the sensitivities are directly solved for as variables in the simulation. This paper presents forward sensitivity analysis as a method to help uncertainty quantification. By including the time step and potentially the spatial step as special sensitivity parameters, the forward sensitivity method is extended as one method to quantify numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time-step and spatial-step sensitivity information reflects global numerical errors. The discretization errors can be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool to help uncertainty quantification.
By knowing the relative sensitivity of time and space steps with other interested physical parameters, the simulation is allowed
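
The forward-sensitivity construction described above can be illustrated with a toy sketch (ours, not the authors' code): for the scalar model y' = -p*y, the sensitivity s = dy/dp is propagated alongside the state by differentiating the forward-Euler update itself.

```python
# Forward sensitivity for y' = f(y, p) = -p*y, y(0) = 1: propagate s = dy/dp
# via the exact derivative of the forward-Euler update,
#   s_{n+1} = s_n + dt * (df/dy * s_n + df/dp) = s_n + dt * (-p*s_n - y_n).
def integrate_with_sensitivity(p, dt, t_end):
    y, s = 1.0, 0.0                      # y(0) = 1, dy/dp(0) = 0
    n_steps = int(round(t_end / dt))
    for _ in range(n_steps):
        y_next = y + dt * (-p * y)       # forward Euler for the state
        s = s + dt * (-p * s - y)        # sensitivity uses the old state
        y = y_next
    return y, s

y, s = integrate_with_sensitivity(p=0.5, dt=1e-4, t_end=2.0)
# exact solution for comparison: y = exp(-p*t), dy/dp = -t*exp(-p*t)
```

The same bookkeeping extends, as the abstract notes, to treating the time step itself as an additional sensitivity parameter, so that discretization error can be ranked against physical-parameter uncertainties.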

  10. Uncertainty bounds using sector theory

    NASA Technical Reports Server (NTRS)

    Waszak, Martin R.; Schmidt, David K.

    1989-01-01

    An approach based on sector-stability theory can furnish a description of the uncertainty associated with the frequency response of a model, given sector-bounds on the individual parameters of the model. The application of the sector-based approach to the formulation of useful uncertainty descriptions for linear, time-invariant multivariable systems is presently explored, and the approach is applied to two generic forms of parameter uncertainty in order to investigate its advantages and limitations. The results obtained show that sector-uncertainty bounds can be used to evaluate the impact of parameter uncertainties on the frequency response of the design model.

  11. Addressing uncertainty in adaptation planning for agriculture

    PubMed Central

    Vermeulen, Sonja J.; Challinor, Andrew J.; Thornton, Philip K.; Campbell, Bruce M.; Eriyagama, Nishadi; Vervoort, Joost M.; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J.; Hawkins, Ed; Smith, Daniel R.

    2013-01-01

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop–climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty. PMID:23674681

  12. Addressing uncertainty in adaptation planning for agriculture.

    PubMed

    Vermeulen, Sonja J; Challinor, Andrew J; Thornton, Philip K; Campbell, Bruce M; Eriyagama, Nishadi; Vervoort, Joost M; Kinyangi, James; Jarvis, Andy; Läderach, Peter; Ramirez-Villegas, Julian; Nicklin, Kathryn J; Hawkins, Ed; Smith, Daniel R

    2013-05-21

    We present a framework for prioritizing adaptation approaches at a range of timeframes. The framework is illustrated by four case studies from developing countries, each with associated characterization of uncertainty. Two cases on near-term adaptation planning in Sri Lanka and on stakeholder scenario exercises in East Africa show how the relative utility of capacity vs. impact approaches to adaptation planning differ with level of uncertainty and associated lead time. An additional two cases demonstrate that it is possible to identify uncertainties that are relevant to decision making in specific timeframes and circumstances. The case on coffee in Latin America identifies altitudinal thresholds at which incremental vs. transformative adaptation pathways are robust options. The final case uses three crop-climate simulation studies to demonstrate how uncertainty can be characterized at different time horizons to discriminate where robust adaptation options are possible. We find that impact approaches, which use predictive models, are increasingly useful over longer lead times and at higher levels of greenhouse gas emissions. We also find that extreme events are important in determining predictability across a broad range of timescales. The results demonstrate the potential for robust knowledge and actions in the face of uncertainty.

  13. Systematic Effects in Atomic Fountain Clocks

    NASA Astrophysics Data System (ADS)

    Gibble, Kurt

    2016-06-01

    We describe recent advances in the accuracies of atomic fountain clocks. New rigorous treatments of the previously large systematic uncertainties, distributed cavity phase, microwave lensing, and background gas collisions, enabled these advances. We also discuss background gas collisions of optical lattice and ion clocks and derive the smooth transition of the microwave lensing frequency shift to photon recoil shifts for large atomic wave packets.

  14. Uncertainties in landscape analysis and ecosystem service assessment.

    PubMed

    Hou, Y; Burkhard, B; Müller, F

    2013-09-01

Landscape analysis and ecosystem service assessment have drawn increasing attention in research and application at the landscape scale. Thanks to continuously emerging assessments as well as studies aiming at improving evaluation methods, policy makers and landscape managers have an increasing interest in integrating ecosystem services into their decisions. However, even plausible assessments carry numerous sources of uncertainty, which regrettably tend to be ignored or disregarded by the actors or researchers. In order to cope with uncertainties and make them more transparent for landscape managers, we demonstrate them by reviewing the literature, describing an example and proposing approaches for uncertainty analysis. Additionally, we conclude with potential actions to reduce the uncertainties accompanying landscape analysis and ecosystem service assessments. For landscape analysis, the fundamental origins of uncertainty are landscape complexity and methodological uncertainties. Concerning the uncertainty sources of ecosystem service assessments, the complexity of the natural system, respondents' preferences and technical problems play essential roles. By analyzing the assessment process, we find that initial data uncertainty pervades the whole assessment, and we argue that limited knowledge about the complexity of ecosystems is the focal origin of uncertainties. For analyzing uncertainties in assessments, we propose systems analysis, scenario simulation and the comparison method as promising strategies. To reduce uncertainties, we suggest actions that integrate continuous learning, expand respondent numbers and sources, consider representativeness, improve and standardize assessment methods, and optimize spatial and geobiophysical data.

  15. Teaching Quantum Uncertainty

    NASA Astrophysics Data System (ADS)

    Hobson, Art

    2011-10-01

    An earlier paper introduces quantum physics by means of four experiments: Young's double-slit interference experiment using (1) a light beam, (2) a low-intensity light beam with time-lapse photography, (3) an electron beam, and (4) a low-intensity electron beam with time-lapse photography. It is ironic that, although these experiments demonstrate most of the quantum fundamentals, conventional pedagogy stresses their difficult and paradoxical nature. These paradoxes (i.e., logical contradictions) vanish, and understanding becomes simpler, if one takes seriously the fact that quantum mechanics is the nonrelativistic limit of our most accurate physical theory, namely quantum field theory, and treats the Schroedinger wave function, as well as the electromagnetic field, as a quantized field. Both the Schroedinger field, or "matter field," and the EM field are made of "quanta": spatially extended but energetically discrete chunks or bundles of energy. Each quantum comes nonlocally from the entire space-filling field and interacts with macroscopic systems, such as the viewing screen, by collapsing into an atom instantaneously and randomly, in accordance with the probability amplitude specified by the field. Thus, uncertainty and nonlocality are inherent in quantum physics. This paper is about quantum uncertainty. A planned later paper will take up quantum nonlocality.

  16. Antarctic Photochemistry: Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Stewart, Richard W.; McConnell, Joseph R.

    1999-01-01

    Understanding the photochemistry of the Antarctic region is important for several reasons. Analysis of ice cores provides historical information on several species, such as hydrogen peroxide and sulfur-bearing compounds. The former can potentially provide information on the history of oxidants in the troposphere, and the latter may shed light on DMS-climate relationships. Extracting such information requires that we be able to model the photochemistry of the Antarctic troposphere and relate atmospheric concentrations to deposition rates and sequestration in the polar ice. This paper deals with one aspect of the uncertainty inherent in photochemical models of the high-latitude troposphere: that arising from imprecision in the kinetic data used in the calculations. Such uncertainties in Antarctic models tend to be larger than those in models of mid- to low-latitude clean air. One reason is the lower temperatures, which increase the imprecision of kinetic data that are assumed to be best characterized at 298 K. Another is the inclusion of a DMS oxidation scheme in the present model. Many of the rates in this scheme are less precisely known than the rates in the standard chemistry used in many stratospheric and tropospheric models.
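
The growth of kinetic-data imprecision at low temperature is commonly expressed through an uncertainty factor of the JPL-evaluation form f(T) = f(298) * exp(|g| * |1/T - 1/298|). The sketch below uses illustrative values of f(298) and g, not those of any specific reaction.

```python
import math

# JPL-style temperature-dependent uncertainty factor for a rate constant.
# f298 is the uncertainty factor at the 298 K reference temperature and g
# parameterizes how fast imprecision grows away from it; both values here
# are illustrative only.
def uncertainty_factor(T, f298=1.2, g=100.0):
    return f298 * math.exp(abs(g) * abs(1.0 / T - 1.0 / 298.0))

f_warm = uncertainty_factor(298.0)   # equals f298 at the reference temperature
f_cold = uncertainty_factor(240.0)   # noticeably larger at Antarctic temperatures
```

Evaluating the same reaction at 240 K instead of 298 K inflates its assumed rate-constant uncertainty, which is one mechanism by which Antarctic model uncertainties exceed those of warmer, low-latitude models.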

  17. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CERs) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBEs) of the masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions, at various stages of a project's lifecycle. The paper also discusses the long-term strategy of NASA Headquarters of publishing similar results, using a variety of cost-driving metrics, on an annual basis. It provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. The analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  18. Uncertainty in adaptive capacity

    NASA Astrophysics Data System (ADS)

    Adger, W. Neil; Vincent, Katharine

    2005-03-01

    The capacity to adapt is a critical element of the process of adaptation: it is the vector of resources that represent the asset base from which adaptation actions can be made. Adaptive capacity can in theory be identified and measured at various scales, from the individual to the nation. The assessment of uncertainty within such measures comes from the contested knowledge domain and the theories surrounding the nature of the determinants of adaptive capacity and the human action of adaptation. While generic adaptive capacity at the national level, for example, is often postulated as being dependent on health, governance and political rights, literacy, and economic well-being, the determinants of these variables at national levels are not widely understood. We outline the nature of this uncertainty for the major elements of adaptive capacity and illustrate these issues with the example of a social vulnerability index for countries in Africa. To cite this article: W.N. Adger, K. Vincent, C. R. Geoscience 337 (2005).

  19. Flight Departure Delay and Rerouting Under Uncertainty in En Route Convective Weather

    NASA Technical Reports Server (NTRS)

    Mukherjee, Avijit; Grabbe, Shon; Sridhar, Banavar

    2011-01-01

    Delays caused by uncertainty in weather forecasts can be reduced by improving traffic flow management decisions. This paper presents a methodology for traffic flow management under uncertainty in convective weather forecasts. An algorithm for assigning departure delays and reroutes to aircraft is presented. Departure delay and route assignment are executed at multiple stages, during which, updated weather forecasts and flight schedules are used. At each stage, weather forecasts up to a certain look-ahead time are treated as deterministic and flight scheduling is done to mitigate the impact of weather on four-dimensional flight trajectories. Uncertainty in weather forecasts during departure scheduling results in tactical airborne holding of flights. The amount of airborne holding depends on the accuracy of forecasts as well as the look-ahead time included in the departure scheduling. The weather forecast look-ahead time is varied systematically within the experiments performed in this paper to analyze its effect on flight delays. Based on the results, longer look-ahead times cause higher departure delays and additional flying time due to reroutes. However, the amount of airborne holding necessary to prevent weather incursions reduces when the forecast look-ahead times are higher. For the chosen day of traffic and weather, setting the look-ahead time to 90 minutes yields the lowest total delay cost.

  20. An Integrated Bayesian Uncertainty Estimator: fusion of Input, Parameter and Model Structural Uncertainty Estimation in Hydrologic Prediction System

    NASA Astrophysics Data System (ADS)

    Ajami, N. K.; Duan, Q.; Sorooshian, S.

    2005-12-01

    To date, single conceptual hydrologic models have often been applied to interpret physical processes within a watershed. Nevertheless, hydrologic models, regardless of their sophistication and complexity, are simplified representations of a complex, spatially distributed and highly nonlinear real-world system. Consequently, their hydrologic predictions contain considerable uncertainty from different sources, including hydrometeorological forcing inputs, boundary/initial conditions, model structure and model parameters, which need to be accounted for. Thus far, efforts have gone toward addressing these sources of uncertainty separately, making an implicit assumption that uncertainties from different sources are additive. Nevertheless, because of the nonlinear nature of hydrologic systems, it is not feasible to account for these uncertainties independently. Here we present the Integrated Bayesian Uncertainty Estimator (IBUNE), which accounts for total uncertainty from all major sources: input forcing, model structure and model parameters. This algorithm explores a multi-model framework to tackle model structural uncertainty, while using Bayesian rules to estimate parameter and input uncertainty within individual models. Three hydrologic models, the SACramento Soil Moisture Accounting (SAC-SMA) model, the Hydrologic model (HYMOD) and the Simple Water Balance (SWB) model, were considered within the IBUNE framework for this study. The results, presented for the Leaf River Basin, MS, indicate that IBUNE gives a better quantification of uncertainty through the hydrological modeling process, therefore providing more reliable and less biased predictions with realistic uncertainty boundaries.
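
A minimal sketch of the IBUNE idea, combining input, parameter/residual and model-structure uncertainty into one predictive distribution, might look as follows. The three toy linear "models" and the posterior model weights are invented for illustration; they are not SAC-SMA, HYMOD or SWB.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for three rainfall-runoff models (hypothetical, linear):
models = [lambda p: 0.6 * p, lambda p: 0.5 * p + 1.0, lambda p: 0.7 * p - 0.5]
weights = np.array([0.5, 0.3, 0.2])   # assumed posterior model weights

precip_obs = 10.0                     # observed forcing (arbitrary units)
n = 5000
predictions = np.empty(n)
for i in range(n):
    p = precip_obs * rng.lognormal(0.0, 0.1)             # input (forcing) error
    m = rng.choice(3, p=weights)                         # model-structure draw
    predictions[i] = models[m](p) + rng.normal(0.0, 0.2) # residual/parameter noise

lo, hi = np.percentile(predictions, [5, 95])  # total predictive uncertainty band
```

Because each Monte Carlo draw perturbs the forcing, picks a model structure, and adds residual noise jointly, the resulting band reflects the combined (non-additive) effect of all three sources rather than treating them independently.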

  1. Earthquake Loss Estimation Uncertainties

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessments produced in emergency mode by global systems following strong earthquakes. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of the shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. 
The paper analyzes the influence of uncertainties in strong event parameters determined by Alert Seismological Surveys and of the simulation models used at all stages, from estimating shaking intensity

  2. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed in a properly designed uncertainty game. We first investigate an uncertainty game between a freely falling observer and a static partner holding a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty in the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that the quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract the entanglement generated by the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
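
For reference, the "intrinsic limit -log2 c" is the bound of the memory-assisted entropic uncertainty relation, which in its usual form (our addition, following the standard statement) reads:

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad c = \max_{x,z} \bigl|\langle \psi_x | \phi_z \rangle\bigr|^2 ,
```

where X and Z are the two measurements on system A, B is the quantum memory, and c is the maximal overlap of the measurement bases. Since the conditional entropy S(A|B) is negative for sufficiently entangled states, the right-hand side can drop below -log2 c, which is the "violation" of the intrinsic limit described above.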

  3. DOD ELAP Lab Uncertainties

    DTIC Science & Technology

    2012-03-01

    [Extracted slide fragments] Training programs: IPV6, NLLAP, NEFAP. Accreditation for management-system certification bodies (ISO/IEC 17021) that certify to ISO 9001 (QMS), ISO 14001 (EMS), TS 16949 (US automotive), etc. DoD QSM 4.2 standard; ISO/IEC 17025:2005; each has uncertainty requirements. Presented at the 9th Annual DoD Environmental Monitoring and Data Quality (EDMQ) Workshop, held 26-29 March 2012 in La Jolla, CA.

  4. Uncertainty as Certainty

    NASA Astrophysics Data System (ADS)

    Petzinger, Tom

    I am trying to make money in the biotech industry from complexity science. And I am doing it with inspiration that I picked up on the edge of Appalachia, spending time with June Holley and ACEnet when I was a Wall Street Journal reporter. I took some of those ideas to Pittsburgh, in biotechnology, in a completely private setting with an economic development focus, but also with a mission to return profit to private capital. And we are doing that. I submit as a hypothesis, something we are figuring out in the post-industrial era, that business evolves. It is not the definition of business, but business critically involves the design of systems in which uncertainty is treated as a certainty. That is what I have seen and what I have tried to put into practice.

  5. Medical decisions under uncertainty.

    PubMed

    Carmi, A

    1993-01-01

    The court applies the criteria of the reasonable doctor and common practice in order to consider the behaviour of a defendant physician. The meaning of our demand that the doctor expects that his or her acts or omissions will bring about certain implications is that, according to the present circumstances and subject to the limited knowledge of the common practice, the course of certain events or situations in the future may be assumed in spite of the fog of uncertainty which surrounds us. The miracles and wonders of creation are concealed from us, and we are not aware of the way and the nature of our bodily functioning. Therefore, there seems to be no way to avoid mistakes, because in several cases the correct diagnosis cannot be determined even with the most advanced application of all information available. Doctors find it difficult to admit that they grope in the dark. They wish to form clear and accurate diagnoses for their patients. The fact that their profession is faced with innumerable and unavoidable risks and mistakes is hard to swallow, and many of them claim that in their everyday work this does not happen. They should not content themselves by changing their style. A radical metamorphosis is needed. They should not be tempted to formulate their diagnoses in 'neutral' statements in order to be on the safe side. Uncertainty should be accepted and acknowledged by the profession and by the public at large as a human phenomenon, as an integral part of any human decision, and as a clear characteristic of any legal or medical diagnosis.(ABSTRACT TRUNCATED AT 250 WORDS)

  6. Uncertainties in stellar ages provided by grid techniques

    NASA Astrophysics Data System (ADS)

    Prada Moroni, P. G.; Valle, G.; Dell'Omodarme, M.; Degl'Innocenti, S.

    2016-09-01

    The determination of the ages of single stars by means of grid-based techniques is a well-established method. We discuss the impact on these estimates of the uncertainties in several ingredients routinely adopted in stellar computations. The systematic bias in age determination caused by varying the assumed initial helium abundance, the mixing-length and convective core overshooting parameters, and the microscopic diffusion is quantified and compared with the statistical error owing to the current uncertainty in the observations. The typical observational uncertainty accounts for a 1 σ statistical relative error in age determination ranging on average from about -35 % to +42 %, depending on the mass. However, the relative error in age strongly depends on the evolutionary phase and can be higher than 120 % for stars near the zero-age main sequence, while it is typically about 20 % or lower in the advanced main-sequence phase. A variation of ± 1 in the helium-to-metal enrichment ratio induces a quite modest systematic bias in age estimates. The maximum bias due to the presence of convective core overshooting is -7 % for β = 0.2 and -13 % for β = 0.4. The main sources of bias are the uncertainty in the mixing-length value and the neglect of microscopic diffusion, each of which accounts for a bias comparable to the random error uncertainty.

  7. Sensitivity and Uncertainty Analysis to Burnup Estimates on ADS using the ACAB Code

    SciTech Connect

    Cabellos, O.; Sanz, J.; Rodriguez, A.; Gonzalez, E.; Embid, M.; Alvarez, F.; Reyes, S.

    2005-05-24

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important for both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and a Monte Carlo methodology to actinide burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable for assessing the need for experimental or systematic re-evaluation of some uncertain XSs for ADS.

  8. Supporting qualified database for uncertainty evaluation

    SciTech Connect

    Petruzzi, A.; Fiori, F.; Kovtonyuk, A.; D'Auria, F.

    2012-07-01

    Uncertainty evaluation constitutes a key feature of the BEPU (Best Estimate Plus Uncertainty) process. The uncertainty can be the result of a Monte Carlo type analysis involving input uncertainty parameters, or the outcome of a process involving the use of experimental data and connected code calculations. These uncertainty methods are discussed in several papers and guidelines (IAEA-SRS-52, OECD/NEA BEMUSE reports). The present paper aims at discussing the role and the depth of the analysis required for merging suitable experimental data on the one side with qualified code calculation results on the other. This aspect is mostly connected with the second approach for uncertainty mentioned above, but it can also be used in the framework of the first approach. Namely, the paper discusses the features and structure of a database that includes the following kinds of documents: 1. The 'RDS-facility' (Reference Data Set for the selected facility): this includes the description of the facility, the geometrical characterization of any component of the facility, the instrumentation, the data acquisition system, the evaluation of pressure losses, the physical properties of the materials, and the characterization of pumps, valves and heat losses; 2. The 'RDS-test' (Reference Data Set for the selected test of the facility): this includes the description of the main phenomena investigated during the test, the configuration of the facility for the selected test (possibly with a new evaluation of pressure and heat losses if needed), and the specific boundary and initial conditions; 3. The 'QR' (Qualification Report) of the code calculation results: this includes the description of the nodalization developed following a set of homogeneous techniques, the achievement of steady-state conditions, and the qualitative and quantitative analysis of the transient with the characterization of the Relevant Thermal-Hydraulic Aspects (RTA); 4. 
The 'EH' (Engineering Handbook) of the input nodalization

  9. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty, and that students often fail to experience uncertainty when uncertainty may be warranted. Yet few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth-grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. Students experienced uncertainty from more different sources and used more and

  10. The impact of uncertainty on shape optimization of idealized bypass graft models in unsteady flow

    NASA Astrophysics Data System (ADS)

    Sankaran, Sethuraman; Marsden, Alison L.

    2010-12-01

    It is well known that the fluid mechanics of bypass grafts impacts biomechanical responses and is linked to intimal thickening and plaque deposition on the vessel wall. In spite of this, quantitative information about the fluid mechanics is not currently incorporated into surgical planning and bypass graft design. In this work, we use a derivative-free optimization technique for performing systematic design of bypass grafts. The optimization method is coupled to a three-dimensional pulsatile Navier-Stokes solver. We systematically account for inevitable uncertainties that arise in cardiovascular simulations, owing to noise in medical image data, variable physiologic conditions, and surgical implementation. Uncertainties in the simulation input parameters as well as shape design variables are accounted for using the adaptive stochastic collocation technique. The derivative-free optimization framework is coupled with a stochastic response surface technique to make the problem computationally tractable. Two idealized numerical examples, an end-to-side anastomosis, and a bypass graft around a stenosis, demonstrate that accounting for uncertainty significantly changes the optimal graft design. Results show that small changes in the design variables from their optimal values should be accounted for in surgical planning. Changes in the downstream (distal) graft angle resulted in greater sensitivity of the wall-shear stress compared to changes in the upstream (proximal) angle. The impact of cost function choice on the optimal solution was explored. Additionally, this work represents the first use of the stochastic surrogate management framework method for robust shape optimization in a fully three-dimensional unsteady Navier-Stokes design problem.
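
The robust-design effect described here, that accounting for uncertainty shifts the optimum, can be mimicked with a one-dimensional toy (our stand-in for the stochastic surrogate management framework, not the authors' solver): optimize a mean-plus-spread statistic of a cost that depends on both a design variable and an uncertain input.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy cost surface f(x, w): x is a design variable (think: graft angle),
# w an uncertain input (think: physiologic variability). Entirely invented.
def cost(x, w):
    return (x - 1.0 + w) ** 2 + 0.1 * x ** 2

# Robust objective: mean + 1*std over sampled uncertainty, estimated by
# plain Monte Carlo (a crude substitute for stochastic collocation).
def robust_objective(x, n=2000):
    w = rng.normal(0.0, 0.2, size=n)
    c = cost(x, w)
    return c.mean() + c.std()

xs = np.linspace(-1.0, 3.0, 201)
x_robust = xs[np.argmin([robust_objective(x) for x in xs])]
```

The deterministic optimum (w fixed at 0) and the robust optimum differ, which is the qualitative point of the paper: designs tuned without uncertainty can be suboptimal, and sensitivity of the cost near the optimum matters for surgical planning.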

  11. Impact of discharge data uncertainty on nutrient load uncertainty

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Gustavsson, Hanna; Sonesten, Lars

    2016-04-01

Uncertainty in the rating-curve model of the stage-discharge relationship leads to uncertainty in discharge time series. These uncertainties in turn affect many other analyses based on discharge data, such as nutrient load estimations. It is important to understand how large the impact of discharge data uncertainty is on such analyses, since they are often used as the basis for important environmental management decisions. In the Baltic Sea basin, nutrient load estimates from river mouths are a central information basis for managing and reducing eutrophication in the Baltic Sea. In this study we investigated rating curve uncertainty and its propagation to discharge data uncertainty and thereafter to uncertainty in the load of phosphorus and nitrogen for twelve Swedish river mouths. We estimated rating curve uncertainty using the Voting Point method, which accounts for random and epistemic errors in the stage-discharge relation and allows drawing multiple rating-curve realisations consistent with the total uncertainty. We sampled 40,000 rating curves, and for each sampled curve we calculated a discharge time series from 15-minute water level data for the period 2005-2014. Each discharge time series was then aggregated to daily scale and used to calculate the load of phosphorus and nitrogen from linearly interpolated monthly water samples, following the currently used methodology for load estimation. Finally, the yearly load estimates were calculated and we thus obtained distributions with 40,000 load realisations per year - one for each rating curve. We analysed how the rating curve uncertainty propagated to the discharge time series at different temporal resolutions, and its impact on the yearly load estimates. Two shorter periods of daily water quality sampling around the spring flood peak allowed a comparison of load uncertainty magnitudes resulting from discharge data with those resulting from the monthly water quality sampling.
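The propagation chain described above can be sketched in a few lines (a hypothetical illustration, not the study's Voting Point implementation; the rating-curve form Q = a(h - h0)^b, parameter distributions, and synthetic stage/concentration series are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

n_curves = 1000                         # the study itself sampled 40,000 curves
a = rng.normal(5.0, 0.5, n_curves)      # assumed rating-curve parameter uncertainty
b = rng.normal(1.8, 0.1, n_curves)

# Synthetic daily stage (m) and interpolated nutrient concentration (mg/l).
stage = 0.5 + 0.3 * np.abs(np.sin(np.linspace(0, 20, 365)))
conc = 0.05 + 0.01 * rng.standard_normal(365)

loads = np.empty(n_curves)
for i in range(n_curves):
    q = a[i] * stage ** b[i]            # discharge (m^3/s) under this rating curve
    daily_load = q * conc * 86.4        # kg/day: m^3/s * mg/l * 86400 s / 1000 g/kg
    loads[i] = daily_load.sum()         # one annual load realisation per curve

print(np.percentile(loads, [2.5, 97.5]))  # load uncertainty interval across curves
```

Each sampled curve yields one load realisation, so the spread of `loads` directly expresses how rating-curve uncertainty propagates to the annual nutrient load.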

  12. Uncertainty and Surprise: An Introduction

    NASA Astrophysics Data System (ADS)

    McDaniel, Reuben R.; Driebe, Dean J.

    Much of the traditional scientific and applied scientific work in the social and natural sciences has been built on the supposition that the unknowability of situations is the result of a lack of information. This has led to an emphasis on uncertainty reduction through ever-increasing information seeking and processing, including better measurement and observational instrumentation. Pending uncertainty reduction through better information, efforts are devoted to uncertainty management and hierarchies of controls. A central goal has been the avoidance of surprise.

  13. Scientific uncertainty and its relevance to science education

    NASA Astrophysics Data System (ADS)

    Ruggeri, Nancy Lee

Uncertainty is inherent to scientific methods and practices, yet it is rarely explicitly discussed in science classrooms. Ironically, science is often equated with certainty in these contexts. Uncertainties that arise in science deserve special attention, as they are increasingly a part of public discussions and are susceptible to manipulation. Clarifying what is meant by scientific uncertainty would include identifying sources of uncertainty in scientific practice, and would help provide an instructional framework for understanding how scientists use methods, data, and models to justify claims about the natural world. This research introduces both a general typology of scientific uncertainty informed by a review of literature from a variety of perspectives, and two additional typologies that emerged from qualitative studies examining student essays about scientific uncertainty in two disciplinary contexts: biological evolution and global climate change. These typologies aim to provide leverage for curricular discussions about scientific knowledge and practices, and to help instructors interested in integrating scientific uncertainty into teaching these subjects. In particular, a focus on uncertainties in data and models can illustrate their integral relationship and can spark critical discussions about methods used to collect empirical data and the models used to explain them and make predictions. This research builds a case for integrating scientific uncertainty into science teaching and emphasizing its importance for understanding the practice of science within particular disciplinary contexts.

  14. Assessing Groundwater Model Uncertainty for the Central Nevada Test Area

    SciTech Connect

    Greg Pohll; Karl Pohlmann; Ahmed Hassan; Jenny Chapman; Todd Mihevc

    2002-06-14

The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis was performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m, for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities, such that large changes in the uncertain input parameters cause small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation.
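The Monte Carlo approach described above reduces to a simple recipe: draw each uncertain parameter from its prior, run the model, and collect the distribution of the output metric. A minimal sketch with a toy advective travel-distance model (all priors and the radius formula below are invented for illustration, not the CNTA model):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000
# Assumed prior distributions for uncertain inputs (illustrative only).
porosity = rng.uniform(0.05, 0.25, n)
gradient = rng.normal(0.002, 0.0005, n)                  # hydraulic gradient
conductivity = rng.lognormal(np.log(1e-6), 0.5, n)       # m/s
retardation = rng.uniform(1.0, 5.0, n)                   # sorption retardation

# Toy prediction: contaminant radius as retarded advective distance r = v*t/R.
velocity = conductivity * np.abs(gradient) / porosity    # seepage velocity (m/s)
t = 1000.0 * 365.25 * 86400.0                            # 1000 years in seconds
radius = velocity * t / retardation                      # metres

lo, hi = np.percentile(radius, [5, 95])
print(lo, hi)  # prediction uncertainty band on the contaminant radius
```

The width of the resulting band relative to the spread of the inputs is exactly the kind of comparison the study makes when it concludes that large input uncertainty can still yield a relatively small prediction uncertainty.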

  15. Higher-order uncertainty relations

    NASA Astrophysics Data System (ADS)

    Wünsche, A.

    2006-07-01

Using the non-negativity of Gram determinants of arbitrary order, we derive higher-order uncertainty relations for the symmetric uncertainty matrices of corresponding order n > 2 to n Hermitean operators (n = 2 is the usual case). The special cases of third-order and fourth-order uncertainty relations are considered in detail. The obtained third-order uncertainty relations are applied to the Lie groups SU(1,1) with three Hermitean basis operators (K1,K2,K0) and SU(2) with three Hermitean basis operators (J1,J2,J3) where, in particular, the group-coherent states of Perelomov type and of Barut-Girardello type for SU(1,1) and the spin or atomic coherent states for SU(2) are investigated. The uncertainty relations for the determinant of the third-order uncertainty matrix are satisfied with the equality sign for coherent states and this determinant becomes vanishing for the Perelomov type of coherent states for SU(1,1) and SU(2). As an example of the application of fourth-order uncertainty relations, we consider the canonical operators (Q1,P1,Q2,P2) of two boson modes and the corresponding uncertainty matrix formed by the operators of the corresponding mean deviations, taking into account the correlations between the two modes. In two mathematical appendices, we prove the non-negativity of the determinant of correlation matrices of arbitrary order and clarify the principal structure of higher-order uncertainty relations.
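The underlying inequality can be sketched in standard notation (a sketch of the general idea, not reproduced from the paper): for Hermitean operators with mean deviations, the Gram matrix of the deviation vectors acting on the state is positive semidefinite, so its determinant is non-negative.

```latex
% Deviation operators: \Delta A_j = A_j - \langle A_j \rangle for j = 1, ..., n.
% The Gram matrix G_{jk} = \langle \Delta A_j \, \Delta A_k \rangle of the
% vectors \Delta A_j |\psi\rangle is positive semidefinite, hence
\det \bigl( \langle \Delta A_j \, \Delta A_k \rangle \bigr)_{j,k=1}^{n} \;\ge\; 0 .
% For n = 2 this reduces to the familiar Robertson--Schr\"odinger relation
\langle (\Delta A_1)^2 \rangle \, \langle (\Delta A_2)^2 \rangle
  \;\ge\; \bigl| \langle \Delta A_1 \, \Delta A_2 \rangle \bigr|^2 .
```

The paper's third- and fourth-order relations are the n = 3 and n = 4 instances of this determinant inequality, applied to the symmetric uncertainty matrix built from these correlations.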

  16. SIP: Systematics-Insensitive Periodograms

    NASA Astrophysics Data System (ADS)

    Angus, Ruth

    2016-09-01

SIP (Systematics-Insensitive Periodograms) extends the generative model behind traditional sine-fitting periodograms for finding the frequency of a sinusoid: in addition to a sum of sine and cosine functions over a grid of frequencies, the model includes systematic trends based on a set of eigen light curves, producing periodograms with vastly reduced systematic features. Acoustic oscillations in giant stars and stellar rotation periods can be recovered from SIP periodograms without detrending. The code can also be applied to the detection of other periodic phenomena, including eclipsing binaries and short-period exoplanet candidates.
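The idea of fitting trend bases jointly with the sinusoid can be sketched as follows (a minimal illustration of the approach, not the SIP code itself; the synthetic light curve and the "eigen light curve" basis are assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 30, 500)                        # days
trend = 0.5 * (t / 30.0) ** 2                      # shared systematic trend
flux = np.sin(2 * np.pi * t / 3.7) + trend + 0.3 * rng.standard_normal(t.size)
basis = np.column_stack([np.ones_like(t), trend])  # stand-in eigen light curves

def sip_power(freq):
    # Fit trends alone, then trends + sine/cosine; power is the fractional
    # reduction in residual sum of squares from adding the sinusoid.
    full = np.column_stack([basis, np.sin(2 * np.pi * freq * t),
                            np.cos(2 * np.pi * freq * t)])
    r0 = flux - basis @ np.linalg.lstsq(basis, flux, rcond=None)[0]
    r1 = flux - full @ np.linalg.lstsq(full, flux, rcond=None)[0]
    return 1.0 - (r1 @ r1) / (r0 @ r0)

freqs = np.linspace(0.05, 1.0, 400)                # cycles/day
power = np.array([sip_power(f) for f in freqs])
best = freqs[np.argmax(power)]
print(best)  # peaks near the injected frequency 1/3.7 cycles/day
```

Because the trend basis is part of every fit, systematic power is absorbed by the basis coefficients rather than aliasing into the periodogram, which is the sense in which the periodogram is "systematics-insensitive".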

  17. A review of uncertainty visualization within the IPCC reports

    NASA Astrophysics Data System (ADS)

    Nocke, Thomas; Reusser, Dominik; Wrobel, Markus

    2015-04-01

Results derived from climate model simulations confront non-expert users with a variety of uncertainties. This gives rise to the challenge that the scientific information must be communicated so that it can be easily understood while still conveying the complexity of the science behind it. With respect to the assessment reports of the IPCC, the situation is even more complicated, because heterogeneous sources and multiple types of uncertainties need to be compiled together. Within this work, we systematically (1) analyzed the visual representation of uncertainties in the IPCC AR4 and AR5 reports, and (2) administered a questionnaire to evaluate how different user groups, such as decision-makers and teachers, understand these uncertainty visualizations. In the first step, we classified visual uncertainty metaphors for spatial, temporal and abstract representations. As a result, we clearly identified a high complexity of the IPCC visualizations compared to standard presentation graphics, sometimes even integrating two or more uncertainty classes / measures together with the "certain" (mean) information. Further, we identified complex written uncertainty explanations within image captions, even within the "summary reports for policy makers". In the second step, based on these observations, we designed a questionnaire to investigate how non-climate experts understand these visual representations of uncertainties, how visual uncertainty coding might hinder the perception of the "non-uncertain" data, and whether alternatives for certain IPCC visualizations exist. Within the talk/poster, we will present first results from this questionnaire. Summarizing, we identified a clear trend towards complex images within the latest IPCC reports, with a tendency to incorporate as much information as possible into the visual representations, resulting in proprietary, non-standard graphic representations that are not necessarily easy to comprehend at a glance. We conclude that

  18. Uncertainties on lung doses from inhaled plutonium.

    PubMed

    Puncher, Matthew; Birchall, Alan; Bull, Richard K

    2011-10-01

    In a recent epidemiological study, Bayesian uncertainties on lung doses have been calculated to determine lung cancer risk from occupational exposures to plutonium. These calculations used a revised version of the Human Respiratory Tract Model (HRTM) published by the ICRP. In addition to the Bayesian analyses, which give probability distributions of doses, point estimates of doses (single estimates without uncertainty) were also provided for that study using the existing HRTM as it is described in ICRP Publication 66; these are to be used in a preliminary analysis of risk. To infer the differences between the point estimates and Bayesian uncertainty analyses, this paper applies the methodology to former workers of the United Kingdom Atomic Energy Authority (UKAEA), who constituted a subset of the study cohort. The resulting probability distributions of lung doses are compared with the point estimates obtained for each worker. It is shown that mean posterior lung doses are around two- to fourfold higher than point estimates and that uncertainties on doses vary over a wide range, greater than two orders of magnitude for some lung tissues. In addition, we demonstrate that uncertainties on the parameter values, rather than the model structure, are largely responsible for these effects. Of these it appears to be the parameters describing absorption from the lungs to blood that have the greatest impact on estimates of lung doses from urine bioassay. Therefore, accurate determination of the chemical form of inhaled plutonium and the absorption parameter values for these materials is important for obtaining reliable estimates of lung doses and hence risk from occupational exposures to plutonium.

  19. Uncertainties in the deprojection of the observed bar properties

    SciTech Connect

    Zou, Yanfei; Shen, Juntai; Li, Zhao-Yu

    2014-08-10

In observations, it is important to deproject the two fundamental quantities characterizing a bar, i.e., its length (a) and ellipticity (e), to face-on values before any careful analyses. However, a systematic estimation of the uncertainties of the commonly used deprojection methods is still lacking. Simulated galaxies are well suited to this study. We project two simulated barred galaxies onto a two-dimensional (2D) plane with different bar orientations and disk inclination angles (i). Bar properties are measured and deprojected with the popular deprojection methods in the literature. Generally speaking, deprojection uncertainties increase with increasing i. All of the deprojection methods behave badly when i is larger than 60°, due to the vertical thickness of the bar. Thus, future statistical studies of barred galaxies should exclude galaxies more inclined than 60°. At moderate inclination angles (i ≤ 60°), 2D deprojection methods (analytical and image stretching) and Fourier-based methods (Fourier decomposition and bar-interbar contrast) perform reasonably well with uncertainties ∼10% in both the bar length and ellipticity, whereas the uncertainties of the one-dimensional (1D) analytical deprojection can be as high as 100% in certain extreme cases. We find that different bar measurement methods show systematic differences in the deprojection uncertainties. We further discuss the deprojection uncertainty factors with the emphasis on the most important one, i.e., the three-dimensional structure of the bar itself. We construct two triaxial toy bar models that can qualitatively reproduce the results of the 1D and 2D analytical deprojections; they confirm that the vertical thickness of the bar is the main source of uncertainties.
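The standard 1D analytical deprojection that the paper evaluates can be sketched as follows (a common textbook form, treating the bar as a flat line segment; the function name and example values are illustrative):

```python
import numpy as np

def deproject_bar_length(a_obs, phi_obs_deg, incl_deg):
    """Deproject an observed bar length to its face-on value.

    a_obs: observed (projected) bar semi-major axis length.
    phi_obs_deg: observed bar angle from the disk major axis (line of nodes).
    incl_deg: disk inclination (0 = face-on).
    """
    phi = np.radians(phi_obs_deg)
    i = np.radians(incl_deg)
    # The component along the disk minor axis is foreshortened by cos(i),
    # so deprojection stretches it by 1/cos(i).
    return a_obs * np.sqrt(np.cos(phi) ** 2 + (np.sin(phi) / np.cos(i)) ** 2)

# A bar lying along the disk minor axis, viewed at i = 60 deg, is
# foreshortened by cos(60 deg) = 0.5, so deprojection doubles its length.
print(deproject_bar_length(1.0, 90.0, 60.0))  # → 2.0 (up to float rounding)
```

Because this treats the bar as infinitesimally thin, it ignores exactly the vertical thickness that the paper identifies as the dominant error source at high inclination, which is why its uncertainties blow up in extreme cases.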

  20. Effects of Upstream Turbulence on Measurement Uncertainty of Flow Rate by Venturi

    NASA Astrophysics Data System (ADS)

    Lee, Jungho; Yoon, Seok Ho; Yu, Cheong-Hwan; Park, Sang-Jin; Chung, Chang-Hwan

    2010-06-01

The Venturi has been widely used for measuring flow rate in a variety of engineering applications, since its pressure loss is relatively small compared with other measuring methods. The current study focuses on making a detailed estimation of the measurement uncertainties as upstream turbulence affects the uncertainty levels of water flows in closed-loop testing. Upstream turbulence was controlled by selecting 9 different swirl generators. The measurement uncertainty of the flow rate was estimated by a quantitative uncertainty analysis based on the ANSI/ASME PTC 19.1-2005 standard. The best way to reduce error in measuring flow rate was investigated by evaluating its measurement uncertainty. The results of the flow rate uncertainty analysis show that the case with systematic error has a higher uncertainty than that without systematic error. In particular, the result with systematic error shows that the uncertainty of the flow rate was gradually increased by upstream turbulence. The uncertainty of the flow rate measurement is mainly affected by the differential pressure and the discharge coefficient. Flow disturbance can also be reduced by increasing the upstream straight length of the Venturi.
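A PTC 19.1-style combination of systematic and random uncertainty contributions can be sketched for a Venturi, where Q ∝ C·√(ΔP/ρ) gives sensitivity coefficients of 1 for the discharge coefficient and 1/2 for differential pressure and density (the numerical uncertainty values below are invented for illustration, not the study's data):

```python
import numpy as np

def venturi_flow_uncertainty(rel_b, rel_s, k=2.0):
    """Expanded relative uncertainty of Q = C*A*sqrt(2*dP/rho).

    rel_b, rel_s: dicts of relative systematic / random standard
    uncertainties for the inputs; k is the coverage factor.
    """
    sens = {"C": 1.0, "dP": 0.5, "rho": 0.5}  # sensitivity coefficients of ln Q
    b2 = sum((sens[v] * rel_b[v]) ** 2 for v in sens)  # systematic contribution
    s2 = sum((sens[v] * rel_s[v]) ** 2 for v in sens)  # random contribution
    return k * np.sqrt(b2 + s2)

u = venturi_flow_uncertainty(
    rel_b={"C": 0.005, "dP": 0.004, "rho": 0.001},  # assumed systematic terms
    rel_s={"C": 0.0, "dP": 0.006, "rho": 0.0005},   # assumed random terms
)
print(f"{100 * u:.2f}% of reading")
```

With the C and dP terms dominating both sums, the sketch reproduces the study's qualitative finding: the discharge coefficient and differential pressure drive the flow-rate uncertainty, and adding systematic error raises it.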

  1. Identifying sources of uncertainty using covariance analysis

    NASA Astrophysics Data System (ADS)

    Hyslop, N. P.; White, W. H.

    2010-12-01

Atmospheric aerosol monitoring often includes performing multiple analyses on a collected sample. Some common analyses resolve suites of elements or compounds (e.g., spectrometry, chromatography). Concentrations are determined through multi-step processes involving sample collection, physical or chemical analysis, and data reduction. Uncertainties in the individual steps propagate into uncertainty in the calculated concentration. The assumption in most treatments of measurement uncertainty is that errors in the various species concentrations measured in a sample are random and therefore independent of each other. This assumption is often not valid in speciated aerosol data because some errors can be common to multiple species. For example, an error in the sample volume will introduce a common error into all species concentrations determined in the sample, and these errors will correlate with each other. Measurement programs often use paired (collocated) measurements to characterize the random uncertainty in their measurements. Suites of paired measurements provide an opportunity to go beyond the characterization of measurement uncertainties in individual species to examine correlations amongst the measurement uncertainties in multiple species. This additional information can be exploited to distinguish sources of uncertainty that affect all species from those that only affect certain subsets or individual species. Data from the Interagency Monitoring of Protected Visual Environments (IMPROVE) program are used to illustrate these ideas. Nine analytes commonly detected in the IMPROVE network were selected for this analysis. The errors in these analytes can be reasonably modeled as multiplicative, and the natural log of the ratio of concentrations measured on the two samplers provides an approximation of the error. Figure 1 shows the covariation of these log ratios among the different analytes for one site. Covariance is strongest amongst the dust elements (Fe, Ca, and
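The log-ratio covariance idea can be sketched on synthetic data (an illustration of the method, not IMPROVE data; the analyte names, error magnitudes, and shared volume error are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500  # paired (collocated) samples

# A multiplicative sample-volume error common to all analytes, plus
# analyte-specific analytical errors; log(conc_A / conc_B) approximates
# the difference of the two samplers' multiplicative errors.
vol_err = 0.05 * rng.standard_normal(n)
log_ratio = {
    "Fe": vol_err + 0.02 * rng.standard_normal(n),  # dust elements share
    "Ca": vol_err + 0.02 * rng.standard_normal(n),  # the volume error
    "S":  0.08 * rng.standard_normal(n),            # dominated by its own error
}

names = list(log_ratio)
X = np.column_stack([log_ratio[k] for k in names])
corr = np.corrcoef(X, rowvar=False)
print(names)
print(np.round(corr, 2))  # Fe-Ca strongly correlated; S nearly uncorrelated
```

The strong Fe-Ca correlation flags a shared error source (here, sample volume), while the near-zero correlations with S indicate an error specific to that analyte, which is exactly the separation the covariance analysis exploits.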

  2. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    NASA Technical Reports Server (NTRS)

    Pinier, Jeremy T

    2012-01-01

When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), the measurements that are performed in a simulated environment like a wind tunnel test or a computational simulation will most likely incorrectly predict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions, is proposed in this paper. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.
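One plausible way to disperse data within asymmetric uncertainty bounds in a Monte Carlo analysis is a split-normal draw whose lower and upper scales match the bounds (a sketch of the general idea only; the paper's actual dispersion scheme is not reproduced here, and the weighting choice below is an assumption that keeps the density continuous at the center):

```python
import numpy as np

def split_normal(rng, center, sigma_lo, sigma_hi, size):
    """Draw from a split-normal: half-normal scales sigma_lo below the
    center and sigma_hi above, with side probabilities proportional to
    the side scales so the density is continuous at the center."""
    p_lo = sigma_lo / (sigma_lo + sigma_hi)
    lower = rng.random(size) < p_lo
    draws = np.abs(rng.standard_normal(size))
    return np.where(lower, center - sigma_lo * draws, center + sigma_hi * draws)

rng = np.random.default_rng(11)
# Asymmetric bounds such as a high-gradient region might produce:
# tight on one side, wide on the other (values invented for illustration).
samples = split_normal(rng, center=0.0, sigma_lo=0.01, sigma_hi=0.04, size=100_000)
print(np.median(samples) > 0)  # mass skews toward the wide upper side
```

Unlike dispersing with a single symmetric sigma, the draw respects the asymmetry of the bounds, so the Monte Carlo population reflects where the true value is more likely to lie.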

  3. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  4. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.

  5. MODIS Radiometric Calibration and Uncertainty Assessment

    NASA Technical Reports Server (NTRS)

    Xiong, Xiaoxiong; Chiang, Vincent; Sun, Junqiang; Wu, Aisheng

    2011-01-01

Since launch, Terra and Aqua MODIS have collected more than 11 and 9 years of datasets, respectively, for comprehensive studies of the Earth's land, ocean, and atmospheric properties. MODIS observations are made in 36 spectral bands: 20 reflective solar bands (RSB) and 16 thermal emissive bands (TEB). Compared to its heritage sensors, MODIS was developed with very stringent calibration and uncertainty requirements. As a result, MODIS was designed and built with a set of state-of-the-art on-board calibrators (OBC), which allow key sensor performance parameters and on-orbit calibration coefficients to be monitored and updated if necessary. In terms of its calibration traceability, the MODIS RSB calibration is reflectance based, using an on-board solar diffuser (SD), and the TEB calibration is radiance based, using an on-board blackbody (BB). In addition to on-orbit calibration coefficients derived from its OBC, calibration parameters determined from sensor pre-launch calibration and characterization are used in both the RSB and TEB calibration and retrieval algorithms. This paper provides a brief description of MODIS calibration methodologies and discusses details of its on-orbit calibration uncertainties. It assesses uncertainty contributions from individual components and differences between Terra and Aqua MODIS due to their design characteristics and on-orbit performance. Also discussed in this paper is the use of the MODIS L1B uncertainty index (UI) product.

  6. Uncertainties in Arctic Precipitation

    NASA Astrophysics Data System (ADS)

    Majhi, I.; Alexeev, V. A.; Cherry, J. E.; Cohen, J. L.; Groisman, P. Y.

    2012-12-01

Arctic precipitation measurements are riddled with biases; addressing this problem is imperative. Our study focuses on a comparison of various datasets for the region of Siberia, analyzing their biases and the caution needed when using them. Five sources of data were used, ranging from NOAA's products (the raw dataset and Bogdanova's correction) and Yang's correction technique to two reanalysis products (ERA-Interim and NCEP). The reanalysis datasets performed better for some months in comparison to Yang's product, which tends to overestimate precipitation, and the raw dataset, which tends to underestimate. The sources of bias vary from topography, to wind, to missing data. The final three products chosen show higher biases during the winter and spring seasons. Emphasis on equations which incorporate blizzards, blowing snow and higher wind speeds is necessary for regions influenced by any or all of these factors; Bogdanova's correction technique is the most robust of all the datasets analyzed and gives the most reasonable results. One of our future goals is to analyze the impact of precipitation uncertainties on water budget analysis for the Siberian rivers.

  7. Pandemic influenza: certain uncertainties

    PubMed Central

    Morens, David M.; Taubenberger, Jeffery K.

    2011-01-01

SUMMARY For at least five centuries, major epidemics and pandemics of influenza have occurred unexpectedly and at irregular intervals. Despite the modern notion that pandemic influenza is a distinct phenomenon obeying such constant (if incompletely understood) rules as dramatic genetic change, cyclicity, “wave” patterning, virus replacement, and predictable epidemic behavior, much evidence suggests the opposite. Although there is much that we know about pandemic influenza, there appears to be much more that we do not know. Pandemics arise as a result of various genetic mechanisms, have no predictable patterns of mortality among different age groups, and vary greatly in how and when they arise and recur. Some are followed by new pandemics, whereas others fade gradually or abruptly into long-term endemicity. Human influenza pandemics have been caused by viruses that evolved singly or in co-circulation with other pandemic virus descendants and often have involved significant transmission between, or establishment of, viral reservoirs within other animal hosts. In recent decades, pandemic influenza has continued to produce numerous unanticipated events that expose fundamental gaps in scientific knowledge. Influenza pandemics appear to be not a single phenomenon but a heterogeneous collection of viral evolutionary events whose similarities are overshadowed by important differences, the determinants of which remain poorly understood. These uncertainties make it difficult to predict influenza pandemics and, therefore, to adequately plan to prevent them. PMID:21706672

  8. On the Directional Dependence and Null Space Freedom in Uncertainty Bound Identification

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    1997-01-01

In previous work, the determination of uncertainty models via minimum norm model validation is based on a single set of input and output measurement data. Since uncertainty bounds at each frequency are directionally dependent for multivariable systems, this will lead to optimistic uncertainty levels. In addition, the design freedom in the uncertainty model has not been utilized to further reduce uncertainty levels. The above issues are addressed by formulating a min-max problem. An analytical solution to the min-max problem is given to within a generalized eigenvalue problem, thus avoiding a direct numerical approach. This result will lead to less conservative and more realistic uncertainty models for use in robust control.

  9. BICEP2 III: Instrumental systematics

    SciTech Connect

    Ade, P. A. R.

    2015-11-23

    In a companion paper, we have reported a >5σ detection of degree scale B-mode polarization at 150 GHz by the Bicep2 experiment. Here we provide a detailed study of potential instrumental systematic contamination to that measurement. We focus extensively on spurious polarization that can potentially arise from beam imperfections. We present a heuristic classification of beam imperfections according to their symmetries and uniformities, and discuss how resulting contamination adds or cancels in maps that combine observations made at multiple orientations of the telescope about its boresight axis. We introduce a technique, which we call "deprojection," for filtering the leading order beam-induced contamination from time-ordered data, and show that it reduces power in Bicep2's actual and null-test BB spectra consistent with predictions using high signal-to-noise beam shape measurements. We detail the simulation pipeline that we use to directly simulate instrumental systematics and the calibration data used as input to that pipeline. Finally, we present the constraints on BB contamination from individual sources of potential systematics. We find that systematics contribute BB power that is a factor of ~10× below Bicep2's three-year statistical uncertainty, and negligible compared to the observed BB signal. Lastly, the contribution to the best-fit tensor/scalar ratio is at a level equivalent to r = (3–6) × 10–3.
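The "deprojection" filtering described above amounts to a linear regression of known systematic templates against the time-ordered data, followed by subtraction of the fitted component. A toy illustration (synthetic data; the single relative-gain template and its amplitude are invented, and the real pipeline fits several beam-derived templates):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 2000

# Sky signal seen by a detector pair, and a leakage template with the
# shape that a relative-gain mismatch would imprint on the pair difference.
signal = np.sin(np.linspace(0, 40, n))
gain_template = signal
tod = 0.03 * gain_template + 0.1 * rng.standard_normal(n)  # pair-difference TOD

# Fit the template amplitude by least squares and subtract ("deproject").
coef, *_ = np.linalg.lstsq(gain_template[:, None], tod, rcond=None)
cleaned = tod - gain_template * coef[0]

print(coef[0])  # recovers the injected leakage amplitude, ~0.03
print(np.sum(cleaned**2) < np.sum(tod**2))  # → True: leakage power removed
```

Because the subtraction is a projection onto the template's orthogonal complement, the residual power can only decrease, which is why the filtered BB spectra come out consistent with the beam-measurement predictions.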

  10. BICEP2 III: Instrumental systematics

    DOE PAGES

    Ade, P. A. R.

    2015-11-23

In a companion paper, we have reported a >5σ detection of degree scale B-mode polarization at 150 GHz by the Bicep2 experiment. Here we provide a detailed study of potential instrumental systematic contamination to that measurement. We focus extensively on spurious polarization that can potentially arise from beam imperfections. We present a heuristic classification of beam imperfections according to their symmetries and uniformities, and discuss how resulting contamination adds or cancels in maps that combine observations made at multiple orientations of the telescope about its boresight axis. We introduce a technique, which we call "deprojection," for filtering the leading order beam-induced contamination from time-ordered data, and show that it reduces power in Bicep2's actual and null-test BB spectra consistent with predictions using high signal-to-noise beam shape measurements. We detail the simulation pipeline that we use to directly simulate instrumental systematics and the calibration data used as input to that pipeline. Finally, we present the constraints on BB contamination from individual sources of potential systematics. We find that systematics contribute BB power that is a factor of ~10× below Bicep2's three-year statistical uncertainty, and negligible compared to the observed BB signal. Lastly, the contribution to the best-fit tensor/scalar ratio is at a level equivalent to r = (3–6) × 10–3.

  11. Bicep2. III. INSTRUMENTAL SYSTEMATICS

    SciTech Connect

    Ade, P. A. R.; Aikin, R. W.; Bock, J. J.; Brevik, J. A.; Filippini, J. P.; Golwala, S. R.; Hildebrandt, S. R.; Barkats, D.; Benton, S. J.; Bischoff, C. A.; Buder, I.; Karkare, K. S.; Bullock, E.; Dowell, C. D.; Duband, L.; Fliescher, S.; Halpern, M.; Hasselfield, M.; Hilton, G. C.; Irwin, K. D.; Collaboration: Bicep2 Collaboration; and others

    2015-12-01

    In a companion paper, we have reported a >5σ detection of degree scale B-mode polarization at 150 GHz by the Bicep2 experiment. Here we provide a detailed study of potential instrumental systematic contamination to that measurement. We focus extensively on spurious polarization that can potentially arise from beam imperfections. We present a heuristic classification of beam imperfections according to their symmetries and uniformities, and discuss how resulting contamination adds or cancels in maps that combine observations made at multiple orientations of the telescope about its boresight axis. We introduce a technique, which we call “deprojection,” for filtering the leading order beam-induced contamination from time-ordered data, and show that it reduces power in Bicep2's actual and null-test BB spectra consistent with predictions using high signal-to-noise beam shape measurements. We detail the simulation pipeline that we use to directly simulate instrumental systematics and the calibration data used as input to that pipeline. Finally, we present the constraints on BB contamination from individual sources of potential systematics. We find that systematics contribute BB power that is a factor of ∼10× below Bicep2's three-year statistical uncertainty, and negligible compared to the observed BB signal. The contribution to the best-fit tensor/scalar ratio is at a level equivalent to r = (3–6) × 10⁻³.

  12. Uncertainty in Simulating Wheat Yields Under Climate Change

    SciTech Connect

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O'Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments1,2. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature3,4, while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized5. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas and improved quantification of uncertainty through multi-model ensembles are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.

  13. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making.

  14. Uncertainty in Simulating Wheat Yields Under Climate Change

    NASA Technical Reports Server (NTRS)

    Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, A. J.; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, R.; Heng, L.; Hooker, J.; Hunt, L. A.; Ingwersen, J.

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain1. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate2. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models1,3 are difficult4. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
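
    The key finding above, that the spread among crop models exceeds the spread among downscaled climate models, can be illustrated with a simple variance decomposition over a model-by-GCM ensemble. The sketch below uses hypothetical numbers and a deliberately simplified decomposition, not the study's actual data or method:

```python
import numpy as np

def variance_components(impacts):
    """impacts[i, j]: simulated yield impact (%) of crop model i driven by GCM j.
    Returns the variance among crop models (averaged over GCMs) and the
    variance among GCMs (averaged over crop models)."""
    crop_var = np.mean(np.var(impacts, axis=0))  # spread across crop models
    gcm_var = np.mean(np.var(impacts, axis=1))   # spread across GCMs
    return crop_var, gcm_var

# Hypothetical 3-crop-model x 3-GCM ensemble of yield impacts (%)
impacts = np.array([
    [-12.0, -11.5, -12.5],   # crop model A under GCMs 1-3
    [ -6.0,  -5.5,  -6.5],   # crop model B
    [ -9.0,  -8.5,  -9.5],   # crop model C
])
crop_var, gcm_var = variance_components(impacts)  # crop-model spread dominates
```

    In this toy ensemble the crop-model variance (6.0) dwarfs the GCM variance (~0.17), mirroring the qualitative conclusion of the abstract.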

  15. Quantification of Emission Factor Uncertainty

    EPA Science Inventory

    Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...

  16. Uncertainties in nuclear fission data

    NASA Astrophysics Data System (ADS)

    Talou, Patrick; Kawano, Toshihiko; Chadwick, Mark B.; Neudecker, Denise; Rising, Michael E.

    2015-03-01

    We review the current status of our knowledge of nuclear fission data, and quantify uncertainties related to each fission observable whenever possible. We also discuss the roles that theory and experiment play in reducing those uncertainties, contributing to the improvement of our fundamental understanding of the nuclear fission process as well as of evaluated nuclear data libraries used in nuclear applications.

  17. Mama Software Features: Uncertainty Testing

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-30

    This document reviews how the uncertainty in the calculations is being determined with test image data. The results of this testing give an ‘initial uncertainty’ number that can be used to estimate the ‘back end’ uncertainty in digital image quantification. Statisticians are refining these numbers as part of a UQ effort.

  18. Research strategies for addressing uncertainties

    USGS Publications Warehouse

    Busch, David E.; Brekke, Levi D.; Averyt, Kristen; Jardine, Angela; Welling, Leigh; Garfin, Gregg; Jardine, Angela; Merideth, Robert; Black, Mary; LeRoy, Sarah

    2013-01-01

    Research Strategies for Addressing Uncertainties builds on descriptions of research needs presented elsewhere in the book; describes current research efforts and the challenges and opportunities to reduce the uncertainties of climate change; explores ways to improve the understanding of changes in climate and hydrology; and emphasizes the use of research to inform decision making.

  19. Setup Uncertainties of Anatomical Sub-Regions in Head-and-Neck Cancer Patients After Offline CBCT Guidance

    SciTech Connect

    Kranen, Simon van; Beek, Suzanne van; Rasch, Coen; Herk, Marcel van; Sonke, Jan-Jakob

    2009-04-01

    Purpose: To quantify local geometrical uncertainties in anatomical sub-regions during radiotherapy for head-and-neck cancer patients. Methods and Materials: Local setup accuracy was analyzed for 38 patients, who had received intensity-modulated radiotherapy and were regularly scanned during treatment with cone beam computed tomography (CBCT) for offline patient setup correction. In addition to the clinically used large region of interest (ROI), we defined eight ROIs in the planning CT that contained rigid bony structures: the mandible, larynx, jugular notch, occiput bone, vertebrae C1-C3, C3-C5, and C5-C7, and the vertebrae caudal of C7. By local rigid registration to successive CBCT scans, the local setup accuracy of each ROI was determined and compared with the overall setup error assessed with the large ROI. Deformations were distinguished from rigid body movements by expressing movement relative to a reference ROI (vertebrae C1-C3). Results: The offline patient setup correction protocol using the large ROI resulted in residual systematic errors (1 SD) within 1.2 mm and random errors within 1.5 mm for each direction. Local setup errors were larger, ranging from 1.1 to 3.4 mm (systematic) and 1.3 to 2.5 mm (random). Systematic deformations ranged from 0.4 mm near the reference C1-C3 to 3.8 mm for the larynx. Random deformations ranged from 0.5 to 3.6 mm. Conclusion: Head-and-neck cancer patients show considerable local setup variations, exceeding residual global patient setup uncertainty in an offline correction protocol. Current planning target volume margins may be inadequate to account for these uncertainties. We propose registration of multiple ROIs to drive correction protocols and adaptive radiotherapy to reduce the impact of local setup variations.
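
    The systematic and random error components reported above are conventionally computed as the spread of per-patient mean setup errors and the RMS of per-patient spreads, respectively. A hedged sketch of that standard decomposition with hypothetical shift data (not values from the study):

```python
import numpy as np

def setup_error_components(shifts_per_patient):
    """shifts_per_patient: one 1-D array per patient with that patient's
    measured setup errors (mm) along one axis. Returns the group systematic
    error (SD of per-patient means) and the group random error (RMS of
    per-patient SDs), both as 1 SD."""
    means = np.array([s.mean() for s in shifts_per_patient])
    sds = np.array([s.std(ddof=1) for s in shifts_per_patient])
    systematic = means.std(ddof=1)            # patient-to-patient spread
    random_err = np.sqrt(np.mean(sds ** 2))   # within-patient spread
    return systematic, random_err

# Hypothetical per-fraction shifts (mm) for three patients
patients = [np.array([1.0, 1.4, 0.6]),
            np.array([-0.5, 0.1, -0.9]),
            np.array([2.0, 2.2, 1.8])]
sys_err, rand_err = setup_error_components(patients)
```

    The same decomposition can be applied per region of interest to separate local deformations from global setup error.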

  20. An integrated modeling approach to support management decisions of coupled groundwater-agricultural systems under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens

    2015-04-01

    The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge from diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges, and these complexities are further compounded by multiple actors, frequently with conflicting interests, and multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrated physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework. Subsequently, the proposed new approach is applied to a water management problem in a water-scarce coastal arid region in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer's sustainability, endangering associated socio-economic conditions as well as the traditional social structure. Results from the developed method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach has enabled systematic quantification of both the probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool has shown that the decision makers' risk-averse and risk-taking attitudes may yield different rankings of decision alternatives.
The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.

  1. Uncertainty in Integrated Assessment Scenarios

    SciTech Connect

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. 
Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean

  2. Equivalence theorem of uncertainty relations

    NASA Astrophysics Data System (ADS)

    Li, Jun-Li; Qiao, Cong-Feng

    2017-01-01

    We present an equivalence theorem to unify the two classes of uncertainty relations, i.e. the variance-based ones and the entropic forms, showing that the entropy of an operator in a quantum system can be built from the variances of a set of commutative operators. This means that an uncertainty relation in the language of entropy may be mapped onto a variance-based one, and vice versa. Employing the equivalence theorem, alternative formulations of entropic uncertainty relations are obtained for the qubit system that are stronger than the existing ones in the literature, and variance-based uncertainty relations for spin systems are reached from the corresponding entropic uncertainty relations.
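
    The two classes of relations that the theorem unifies are, in their standard forms, the variance-based (Robertson) relation and the entropic (Maassen–Uffink) relation:

```latex
% Variance-based (Robertson) uncertainty relation:
\Delta A \,\Delta B \;\ge\; \tfrac{1}{2}\,\bigl|\langle [A,B] \rangle\bigr|

% Entropic (Maassen--Uffink) uncertainty relation, where
% c = \max_{i,j} |\langle a_i | b_j \rangle| is the maximal overlap
% between the eigenbases of A and B:
H(A) + H(B) \;\ge\; -2 \log c
```

    The equivalence theorem maps bounds of the second form onto bounds of the first form and vice versa.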

  3. Reformulating the Quantum Uncertainty Relation.

    PubMed

    Li, Jun-Li; Qiao, Cong-Feng

    2015-08-03

    The uncertainty principle is one of the cornerstones of quantum theory. In the literature, there are two types of uncertainty relations: the operator form, concerning the variances of physical observables, and the entropic form, involving entropic quantities. Both forms are inequalities involving pairs of observables, and are nontrivial to extend to multiple observables. In this work we introduce a new form of uncertainty relation which may give complete trade-off relations for the variances of observables in pure and mixed quantum systems. Unlike the prevailing uncertainty relations, which are either quantum-state dependent or not directly measurable, our bounds for the variances of observables are quantum-state independent and immune from the "triviality" problem of having zero expectation values. Furthermore, the new uncertainty relation may provide a geometric explanation for why there are limitations on the simultaneous determination of different observables in N-dimensional Hilbert space.

  4. Flood resilience and uncertainty in flood risk assessment

    NASA Astrophysics Data System (ADS)

    Beven, K.; Leedal, D.; Neal, J.; Bates, P.; Hunter, N.; Lamb, R.; Keef, C.

    2012-04-01

    Flood risk assessments do not normally take account of the uncertainty in assessing flood risk; there is no requirement in the EU Floods Directive to do so. But given the generally short series (and potential non-stationarity) of flood discharges, the extrapolation to smaller exceedance probabilities may be highly uncertain. This means that flood risk mapping may also be highly uncertain, with additional uncertainties introduced by the representation of flood plain and channel geometry, conveyance and infrastructure. This suggests that decisions about flood plain management should be based on the exceedance probability of risk rather than the deterministic hazard maps that are common in most EU countries. Some examples are given from two case studies in the UK where a framework for good practice in assessing uncertainty in flood risk mapping has been produced as part of the Flood Risk Management Research Consortium and Catchment Change Network projects. This framework provides a structure for the communication and audit of assumptions about uncertainties.

  5. Theoretical Analysis of Positional Uncertainty in Direct Georeferencing

    NASA Astrophysics Data System (ADS)

    Coskun Kiraci, Ali; Toz, Gonul

    2016-10-01

    A GNSS/INS system, composed of a Global Navigation Satellite System and an Inertial Navigation System together, can provide orientation parameters directly from the observations collected during the flight. Thus orientation parameters can be obtained through GNSS/INS integration without any need for aerotriangulation after the flight. In general, positional uncertainty can be estimated with known coordinates of Ground Control Points (GCPs), which require field work such as marker construction and GNSS measurement, adding cost to the project. Here the question arises: what should the theoretical uncertainty of point coordinates be, given the uncertainties of the orientation parameters? In this study the contribution of each orientation parameter to positional uncertainty is examined, and the theoretical positional uncertainty is computed without GCP measurement for direct georeferencing, using a graphical user interface developed in MATLAB.

  6. Communicating Storm Surge Forecast Uncertainty

    NASA Astrophysics Data System (ADS)

    Troutman, J. A.; Rhome, J.

    2015-12-01

    When it comes to tropical cyclones, storm surge is often the greatest threat to life and property along the coastal United States. The coastal population density has dramatically increased over the past 20 years, putting more people at risk. Informing emergency managers, decision-makers and the public about the potential for wind-driven storm surge, however, has been extremely difficult. Recently, the Storm Surge Unit at the National Hurricane Center in Miami, Florida has developed a prototype experimental storm surge watch/warning graphic to help communicate this threat more effectively by identifying areas most at risk for life-threatening storm surge. This prototype is the initial step in the transition toward a NWS storm surge watch/warning system and highlights the inundation levels that have a 10% chance of being exceeded. The guidance for this product is the Probabilistic Hurricane Storm Surge (P-Surge) model, which predicts the probability of various storm surge heights by statistically evaluating numerous SLOSH model simulations. Questions remain, however, whether exceedance values in addition to the 10% level may be of equal importance to forecasters. P-Surge data from Hurricane Arthur (2014) are used to ascertain the practicality of incorporating other exceedance data into storm surge forecasts. Extracting forecast uncertainty information by analyzing P-Surge exceedances overlaid with track and wind intensity forecasts proves to be beneficial for forecasters and decision support.
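
    The exceedance quantities discussed above reduce to simple ensemble statistics: the probability that surge exceeds a threshold is the fraction of ensemble members above it, and the "10% exceedance" height is the 90th percentile of the ensemble. A generic sketch with hypothetical function names and sample heights (not P-Surge code or data):

```python
import numpy as np

def exceedance_prob(surge_samples, threshold):
    """Fraction of ensemble surge simulations exceeding a height threshold."""
    return float(np.mean(np.asarray(surge_samples) > threshold))

def height_at_exceedance(surge_samples, prob=0.10):
    """Surge height with a `prob` chance of being exceeded,
    i.e. the (1 - prob) quantile of the ensemble."""
    return float(np.quantile(surge_samples, 1.0 - prob))

# Hypothetical ensemble of simulated peak surge heights (ft) at one grid point
samples = [0.5, 1.0, 1.2, 1.5, 2.0, 2.2, 2.5, 3.0, 3.5, 4.0]
p_exceed_3ft = exceedance_prob(samples, 3.0)      # chance surge exceeds 3 ft
h_10pct = height_at_exceedance(samples, 0.10)     # height with 10% exceedance
```

    Evaluating `height_at_exceedance` at several `prob` values (e.g. 0.5, 0.25, 0.1) would yield the family of exceedance maps the abstract considers.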

  7. Decisions on new product development under uncertainties

    NASA Astrophysics Data System (ADS)

    Huang, Yeu-Shiang; Liu, Li-Chen; Ho, Jyh-Wen

    2015-04-01

    In an intensively competitive market, developing a new product has become a valuable strategy for companies to establish their market positions and enhance their competitive advantages. Therefore, it is essential to effectively manage the process of new product development (NPD). However, since various problems may arise in NPD projects, managers should set up milestones and construct evaluative mechanisms to assess their feasibility. This paper employed the approach of Bayesian decision analysis to deal with two crucial uncertainties in NPD: the future market share and the responses of competitors. The proposed decision process provides a systematic analytical procedure for determining whether an NPD project should be continued, taking into account whether effective use is being made of organisational resources. Accordingly, the proposed decision model can assist managers in effectively addressing NPD issues in a competitive market.

  8. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
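
    When the random components from sample collection, preservation/storage, and laboratory analysis can be treated as independent, a common convention is to combine them in quadrature (root-sum-of-squares). A generic sketch with hypothetical percentages, not values from the study:

```python
def combined_random_uncertainty(*components_pct):
    """Root-sum-of-squares combination of independent random
    uncertainty components, each expressed in percent."""
    return sum(c ** 2 for c in components_pct) ** 0.5

# Hypothetical percent uncertainties for one measured E. coli concentration
collection, storage, lab = 20.0, 10.0, 15.0
total = combined_random_uncertainty(collection, storage, lab)
```

    Systematic components (biases) do not combine this way; they shift the result in one direction and are usually tracked separately, as the abstract's distinction between random and systematic uncertainty suggests.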

  9. Integrated uncertainty assessment of flow predictions in a Swiss catchment

    NASA Astrophysics Data System (ADS)

    Honti, M.; Stamm, C.; Reichert, P.

    2012-04-01

    Despite the vivid scientific debate on the suitability of RCM predictions for hydrological forecasting, impact studies relying on climatic input data and hydrological models are still the exclusive methods to provide some insight into the expected evolution of streams in the near future. While climatic uncertainty is usually considered dominant in such studies, more and more sophisticated uncertainty assessment methods reveal that the uncertainty of our hydrological models has been systematically underestimated by inappropriate assessment methods, and that our predictive power for present conditions can be as weak as it was considered to be for the future. The integrated treatment of various uncertainty sources allows us to quantify the overall predictive uncertainty for such studies and to decide whether the anticipated impacts are relevant compared to the existing uncertainty. The Mönchaltorfer Aa catchment (46 km²) in Switzerland was modelled as a case study. A conceptual rainfall-runoff model was calibrated on measured discharge data with Bayesian parameter inference, assuming a statistical error process that can account for various uncertainty sources. Climatic input data were produced by statistical downscaling from the outputs of 10 ENSEMBLES GCM-RCM model chains for the A1B emission scenario with a time horizon of 2050. Hourly rainfall data were produced with the Neyman-Scott rectangular pulses model (Rodriguez-Iturbe et al. 1987), while other weather parameters were generated on a daily scale with the UKCP09 weather generator (Murphy et al. 2009). Expected land-use changes were assessed by creating divergent regional storylines from countrywide socio-economic scenarios. Despite the good performance of the hydrological model (Nash–Sutcliffe efficiency = 0.8), its total predictive uncertainty was significant even for present conditions. Due to the significant contribution of input uncertainty, individual flood peaks could be predicted only with poor confidence. 
However

  10. Violation of Heisenberg's error-disturbance uncertainty relation in neutron-spin measurements

    NASA Astrophysics Data System (ADS)

    Sulyok, Georg; Sponar, Stephan; Erhart, Jacqueline; Badurek, Gerald; Ozawa, Masanao; Hasegawa, Yuji

    2013-08-01

    In its original formulation, Heisenberg's uncertainty principle dealt with the relationship between the error of a quantum measurement and the thereby induced disturbance on the measured object. Meanwhile, Heisenberg's heuristic arguments have turned out to be correct only for special cases. An alternative universally valid relation was derived by Ozawa in 2003. Here, we demonstrate that Ozawa's predictions hold for projective neutron-spin measurements. The experimental inaccessibility of error and disturbance claimed elsewhere has been overcome using a tomographic method. By a systematic variation of experimental parameters in the entire configuration space, the physical behavior of error and disturbance for projective spin-1/2 measurements is illustrated comprehensively. The violation of Heisenberg's original relation, as well as the validity of Ozawa's relation, become manifest. In addition, our results show that the widespread assumption of a reciprocal relation between error and disturbance is not valid in general.

  11. A Guideline for Applying Systematic Reviews to Child Language Intervention

    ERIC Educational Resources Information Center

    Hargrove, Patricia; Lund, Bonnie; Griffer, Mona

    2005-01-01

    This article focuses on applying systematic reviews to the Early Intervention (EI) literature. Systematic reviews are defined and differentiated from traditional, or narrative, reviews and from meta-analyses. In addition, the steps involved in critiquing systematic reviews and an illustration of a systematic review from the EI literature are…

  12. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    NASA Astrophysics Data System (ADS)

    Wagner, Ryan; Moon, Robert; Pratt, Jon; Shaw, Gordon; Raman, Arvind

    2011-11-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale resolution of both inorganic and biological surfaces and nanomaterials. We present a framework to ascribe uncertainty to local nanomechanical properties of any nanoparticle or surface measured with the AFM by taking into account the main uncertainty sources inherent in such measurements. We demonstrate the framework by quantifying uncertainty in AFM-based measurements of the transverse elastic modulus of cellulose nanocrystals (CNCs), an abundant, plant-derived nanomaterial whose mechanical properties are comparable to Kevlar fibers. For a single, isolated CNC the transverse elastic modulus was found to have a mean of 8.1 GPa and a 95% confidence interval of 2.7-20 GPa. A key result is that multiple replicates of force-distance curves do not sample the important sources of uncertainty, which are systematic in nature. The dominant source of uncertainty is the nondimensional photodiode sensitivity calibration rather than the cantilever stiffness or Z-piezo calibrations. The results underscore the great need for, and open a path towards, quantifying and minimizing uncertainty in AFM-based material property measurements of nanoparticles, nanostructured surfaces, thin films, polymers and biomaterials. This work is a partial contribution of the USDA Forest Service and NIST, agencies of the US government, and is not subject to copyright.
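
    Because the dominant uncertainty sources named above (photodiode sensitivity, cantilever stiffness, Z-piezo calibration) enter the modulus estimate multiplicatively, an asymmetric confidence interval like the reported 2.7-20 GPa arises naturally. A hedged sketch of Monte Carlo propagation of such multiplicative factors follows; the factor names and relative uncertainties are illustrative assumptions, not the paper's values:

```python
import numpy as np

def propagate_modulus(E_nominal, rel_sd, n=100_000, seed=0):
    """Monte Carlo propagation of multiplicative calibration uncertainties.
    rel_sd: dict mapping each calibration factor to its relative standard
    deviation. Returns the median and a 95% interval for the modulus."""
    rng = np.random.default_rng(seed)
    factors = np.ones(n)
    for sd in rel_sd.values():
        # model each calibration factor as lognormal about 1
        factors *= rng.lognormal(mean=0.0, sigma=sd, size=n)
    samples = E_nominal * factors
    lo, med, hi = np.percentile(samples, [2.5, 50.0, 97.5])
    return med, (lo, hi)

# Hypothetical relative uncertainties for the main calibration factors
rel_sd = {"photodiode_sensitivity": 0.20,
          "cantilever_stiffness": 0.10,
          "z_piezo": 0.05}
med, (lo, hi) = propagate_modulus(8.1, rel_sd)
```

    Note the interval is wider above the median than below it, the signature of multiplicative error propagation.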

  13. Quantifying and Qualifying USGS ShakeMap Uncertainty

    USGS Publications Warehouse

    Wald, David J.; Lin, Kuo-Wan; Quitoriano, Vincent

    2008-01-01

    We describe algorithms for quantifying and qualifying uncertainties associated with USGS ShakeMap ground motions. The uncertainty values computed consist of latitude/longitude grid-based multiplicative factors that scale the standard deviation associated with the ground motion prediction equation (GMPE) used within the ShakeMap algorithm for estimating ground motions. The resulting grid-based 'uncertainty map' is essential for evaluation of losses derived using ShakeMaps as the hazard input. For ShakeMap, ground motion uncertainty at any point is dominated by two main factors: (i) the influence of any proximal ground motion observations, and (ii) the uncertainty of estimating ground motions from the GMPE, most notably, elevated uncertainty due to initial, unconstrained source rupture geometry. The uncertainty is highest for larger magnitude earthquakes when source finiteness is not yet constrained and, hence, the distance to rupture is also uncertain. In addition to a spatially dependent, quantitative assessment, many users may prefer a simple, qualitative grading for the entire ShakeMap. We developed a grading scale that allows one to quickly gauge the appropriate level of confidence when using rapidly produced ShakeMaps as part of the post-earthquake decision-making process or for qualitative assessments of archived or historical earthquake ShakeMaps. We describe an uncertainty letter grading ('A' through 'F', for high to poor quality, respectively) based on the uncertainty map. A middle-range ('C') grade corresponds to a ShakeMap for a moderate-magnitude earthquake suitably represented with a point-source location. Lower grades 'D' and 'F' are assigned for larger events (M>6) where finite-source dimensions are not yet constrained. The addition of ground motion observations (or observed macroseismic intensities) reduces uncertainties over data-constrained portions of the map. Higher grades ('A' and 'B') correspond to ShakeMaps with constrained fault dimensions…
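
    The letter-grading idea can be sketched as a simple threshold map from a map-averaged uncertainty factor to a grade. The cutoff values below are illustrative placeholders, not the published USGS thresholds:

```python
def shakemap_grade(mean_uncertainty_factor: float) -> str:
    """Map a map-averaged GMPE-scaling uncertainty factor to a letter
    grade, 'A' (well constrained) through 'F' (poorly constrained).
    Thresholds are illustrative, not the published USGS values."""
    thresholds = [(1.0, "A"), (1.25, "B"), (1.5, "C"), (2.0, "D")]
    for limit, grade in thresholds:
        if mean_uncertainty_factor <= limit:
            return grade
    return "F"

print(shakemap_grade(1.1))  # → B
```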

  14. Dimensionality reduction for uncertainty quantification of nuclear engineering models.

    SciTech Connect

    Roderick, O.; Wang, Z.; Anitescu, M.

    2011-01-01

    The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
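
    A minimal sketch of the PRD idea, fitting a polynomial surrogate to both model outputs and their derivatives in a single least-squares system (the toy model, sample points, and degree are hypothetical stand-ins for an expensive simulation with AD-supplied gradients):

```python
import numpy as np

# Model to be approximated (stand-in for an expensive simulation):
f  = lambda x: np.sin(x) + 0.5 * x**2
df = lambda x: np.cos(x) + x          # derivative, e.g. from automatic differentiation

# Few sample points, as in the expensive-simulation setting.
x = np.linspace(-1.0, 1.0, 4)
deg = 4  # polynomial degree of the surrogate

# PRD: stack value equations and derivative equations into one
# least-squares system for the polynomial coefficients c_0..c_deg.
V  = np.vander(x, deg + 1, increasing=True)   # rows: sum_j c_j x^j
dV = np.zeros_like(V)
for j in range(1, deg + 1):
    dV[:, j] = j * x**(j - 1)                 # rows: sum_j j c_j x^{j-1}
A = np.vstack([V, dV])
b = np.concatenate([f(x), df(x)])
c, *_ = np.linalg.lstsq(A, b, rcond=None)

# The derivative information lets 4 sample points constrain 5 coefficients.
x_test = 0.3
approx = sum(cj * x_test**j for j, cj in enumerate(c))
print(abs(approx - f(x_test)))
```

    The extra derivative rows are what distinguish PRD from a plain polynomial regression on the same four samples.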

  15. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. ?? 2007 National Ground Water Association.
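
    The LHS propagation step can be sketched as follows. The logistic-model coefficients, their uncertainty ranges, and the uniform input distributions are illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 1000, 3

# Latin hypercube: one stratified uniform sample per input dimension.
u = np.empty((n, d))
for j in range(d):
    u[:, j] = (rng.permutation(n) + rng.random(n)) / n

# Hypothetical logistic model P = 1/(1+exp(-(b0 + b1*x))), with uniform
# input uncertainty on the coefficients (model error) and on the
# explanatory variable x (data error); ranges are illustrative.
lows  = np.array([-2.5, 0.6, 2.0])   # b0, b1, x lower bounds
highs = np.array([-1.5, 1.0, 4.0])   # upper bounds
b0, b1, x = (lows + u * (highs - lows)).T

p = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
lo, hi = np.percentile(p, [2.5, 97.5])
print(f"vulnerability prediction interval: {lo:.2f} to {hi:.2f}")
```

    The spread of `p` is the prediction uncertainty; repeating this at every GIS grid cell would yield the spatially varying uncertainty map the abstract describes.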

  16. Uncertainties of soil moisture in historical simulations and future projections

    NASA Astrophysics Data System (ADS)

    Cheng, Shanjun; Huang, Jianping; Ji, Fei; Lin, Lei

    2017-02-01

    Uncertainties of soil moisture in historical simulations (1920-2005) and future projections (2006-2080) were investigated by using the outputs from the Coupled Model Intercomparison Project Phase 5 and Community Earth System Model. The results showed that soil moisture climatology varies greatly among models despite the good agreement between the ensemble mean of simulated soil moisture and the Global Land Data Assimilation System data. The uncertainties of initial conditions and model structure showed similar spatial patterns and magnitudes, with high uncertainties in dry regions and low uncertainties in wet regions. In addition, the long-term variability of model structure uncertainty rapidly decreased before 1980 and increased thereafter, but the uncertainty in initial conditions showed an upward trend over the entire time span. The model structure and initial conditions can cause uncertainties at all time scales. Despite these large uncertainties, almost all of the simulations showed significant decreasing linear trends in soil moisture for the 21st century, especially in the Mediterranean region, northeast and southwest South America, southern Africa, and southwestern USA.

  17. Amplification uncertainty relation for probabilistic amplifiers

    NASA Astrophysics Data System (ADS)

    Namiki, Ryo

    2015-09-01

    Traditionally, the quantum amplification limit refers to the property of inevitable noise addition to canonical variables when the field amplitude of an unknown state is linearly transformed through a quantum channel. Recent theoretical studies have determined amplification limits for cases of probabilistic quantum channels or general quantum operations by specifying a set of input states or a state ensemble. However, it remains open how much excess noise on canonical variables is unavoidable and whether there exists a fundamental trade-off relation between the canonical pair in a general amplification process. In this paper we present an uncertainty-product form of amplification limits for general quantum operations by assuming an input ensemble of Gaussian-distributed coherent states. It can be derived as a straightforward consequence of canonical uncertainty relations and retrieves basic properties of the traditional amplification limit. In addition, our amplification limit turns out to give a physical limitation on probabilistic reduction of an Einstein-Podolsky-Rosen uncertainty. In this regard, we find a condition under which probabilistic amplifiers can be regarded as local filtering operations to distill entanglement. This condition establishes a clear benchmark to verify an advantage of non-Gaussian operations beyond Gaussian operations with a feasible input set of coherent states and standard homodyne measurements.

  18. Uncertainties of Mayak urine data

    SciTech Connect

    Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir

    2008-01-01

    For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of the uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24-h urine excretion on successive days (e.g., 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate measurements. This approach is straightforward and accurate when the number of replicate measurements is large; however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3 to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore an alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measurements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35, including a measurement component estimated to be 0.2.
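
    Assuming the standard rule of adding variances in log space (an assumption about the combination rule, not a formula quoted from the paper), a counting-statistics uncertainty can be combined with the reported lognormal normalization uncertainty like this:

```python
import math

def total_log_uncertainty(counting_rel_unc: float,
                          sigma_norm: float = 0.33) -> float:
    """Combine a counting-statistics relative uncertainty with the
    lognormal normalization uncertainty (ln GSD of roughly 0.31-0.35
    per the abstract; 0.33 used here) by adding variances in log space.
    The combination rule is a standard assumption, not the paper's."""
    sigma_count = math.log(1.0 + counting_rel_unc)
    return math.sqrt(sigma_count**2 + sigma_norm**2)

# A 20% counting uncertainty inflates to about 0.38 in log space overall.
print(round(total_log_uncertainty(0.20), 2))  # → 0.38
```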

  19. DNA systematics. Volume II

    SciTech Connect

    Dutta, S.K.

    1986-01-01

    This book discusses the following topics: PLANTS: PLANT DNA: Contents and Systematics. Repeated DNA Sequences and Polyploidy in Cereal Crops. Homology of Nonrepeated DNA Sequences in Phylogeny of Fungal Species. Chloroplast DNA and Phylogenetic Relationships. rDNA: Evolution Over a Billion Years. 23S rRNA-derived Small Ribosomal RNAs: Their Structure and Evolution with Reference to Plant Phylogeny. Molecular Analysis of Plant DNA Genomes: Conserved and Diverged DNA Sequences. A Critical Review of Some Terminologies Used for Additional DNA in Plant Chromosomes and Index.

  20. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    SciTech Connect

    Haskin, F.E.; Harper, F.T.; Goossens, L.H.J.; Kraan, B.C.P.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  1. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Harrison, J.D.; Harper, F.T.; Hora, S.C.

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Credible Computations: Standard and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

    1995-01-01

    The discipline of computational fluid dynamics (CFD) is at a crossroad. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: A computer code in terms of its logic, numerics, and fluid dynamics and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled through verification and validation of these requirements. The verification process assesses whether the problem is solved correctly and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and the results of this analysis highlight which computed quantities and integrated quantities in computations need to be determined accurately and which quantities do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties…
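
    The sensitivity-analysis step defined above (sensitivities of output parameters to input parameters) can be sketched with central finite differences on a toy "computed result"; the model and numbers are hypothetical:

```python
def sensitivities(f, x, h=1e-6):
    """Central-difference sensitivities df/dx_i of a scalar code
    output with respect to each input parameter."""
    out = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        out.append((f(xp) - f(xm)) / (2 * h))
    return out

# Toy "computed result": a drag-like quantity q = a * v**2 (illustrative).
f = lambda p: p[0] * p[1] ** 2
print(sensitivities(f, [0.5, 10.0]))  # ≈ [100.0, 10.0]
```

    In a real CFD setting `f` would wrap a full solver run, and the sensitivities identify which inputs dominate the output uncertainty.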

  4. Using data assimilation for systematic model improvement

    NASA Astrophysics Data System (ADS)

    Lang, Matthew S.; van Leeuwen, Peter Jan; Browne, Phil

    2016-04-01

    In Numerical Weather Prediction, parameterisations are used to simulate missing physics in the model. These can be due to a lack of scientific understanding or a lack of computing power available to address all the known physical processes. Parameterisations are sources of large uncertainty in a model, as parameter values used in these parameterisations cannot be measured directly and hence are often not well known, and the parameterisations themselves are approximations of the processes present in the true atmosphere. Whilst there are many efficient and effective methods for combined state/parameter estimation in data assimilation, such as state augmentation, these are not effective at estimating the structure of parameterisations. A new method of parameterisation estimation is proposed that uses sequential data assimilation methods to estimate errors in the numerical models at each space-time point for each model equation. These errors are then fitted to predetermined functional forms of missing physics or parameterisations that are based upon prior information. The method picks out the functional form, or the combination of functional forms, that best fits the error structure. The prior information typically takes the form of expert knowledge. We applied the method to a one-dimensional advection model with additive model error, and it is shown that the method can accurately estimate parameterisations, with consistent error estimates. It is also demonstrated that state augmentation is not successful. The results indicate that this new method is a powerful tool in systematic model improvement.

  5. On solar geoengineering and climate uncertainty

    NASA Astrophysics Data System (ADS)

    MacMartin, Douglas G.; Kravitz, Ben; Rasch, Philip J.

    2015-09-01

    Uncertain climate system response has been raised as a concern regarding solar geoengineering. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the intermodel spread across 12 climate models participating in the Geoengineering Model Intercomparison project. The model spread in simulations of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. That is, the model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. Furthermore, differences between models in their efficacy (the relative global mean temperature effect of solar versus CO2 radiative forcing) explain most of the regional differences between models in their response to an increased CO2 concentration that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks regarding uncertainty.

  6. On solar geoengineering and climate uncertainty

    SciTech Connect

    MacMartin, Douglas; Kravitz, Benjamin S.; Rasch, Philip J.

    2015-09-03

    Uncertainty in the climate system response has been raised as a concern regarding solar geoengineering. Here we show that model projections of regional climate change outcomes may have greater agreement under solar geoengineering than with CO2 alone. We explore the effects of geoengineering on one source of climate system uncertainty by evaluating the inter-model spread across 12 climate models participating in the Geoengineering Model Intercomparison project (GeoMIP). The model spread in regional temperature and precipitation changes is reduced with CO2 and a solar reduction, in comparison to the case with increased CO2 alone. That is, the intermodel spread in predictions of climate change and the model spread in the response to solar geoengineering are not additive but rather partially cancel. Furthermore, differences in efficacy explain most of the differences between models in their temperature response to an increase in CO2 that is offset by a solar reduction. These conclusions are important for clarifying geoengineering risks.

  7. Uncertainty relations for characteristic functions

    NASA Astrophysics Data System (ADS)

    Rudnicki, Łukasz; Tasca, D. S.; Walborn, S. P.

    2016-02-01

    We present the uncertainty relation for the characteristic functions (ChUR) of the quantum mechanical position and momentum probability distributions. This inequality is more general than the Heisenberg uncertainty relation and is saturated in two extreme cases for wave functions described by periodic Dirac combs. We further discuss a broad spectrum of applications of the ChUR; in particular, we constrain quantum optical measurements involving general detection apertures and provide the uncertainty relation that is relevant for loop quantum cosmology. A method to measure the characteristic function directly using an auxiliary qubit is also briefly discussed.

  8. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 1: Main report

    SciTech Connect

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M.; Boardman, J.; Jones, J.A.; Harper, F.T.; Young, M.L.; Hora, S.C.

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA deposited material and external dose models.

  9. Interpretation of the peak areas in gamma-ray spectra that have a large relative uncertainty.

    PubMed

    Korun, M; Maver Modec, P; Vodenik, B

    2012-06-01

    Empirical evidence is provided that the areas of peaks having a relative uncertainty in excess of 30% are overestimated. This systematic influence is of a statistical nature and originates in the way the peak-analyzing routine recognizes small peaks. It is not easy to detect this influence since it is smaller than the peak-area uncertainty. However, the systematic influence can be revealed in repeated measurements under the same experimental conditions, e.g., in background measurements. To evaluate the systematic influence, background measurements were analyzed with the peak-analyzing procedure described by Korun et al. (2008). The magnitude of the influence depends on the relative uncertainty of the peak area and may amount, in the conditions used in the peak analysis, to a factor of 5 at relative uncertainties exceeding 60%. From the measurements, the probability for type-II errors, as a function of the relative uncertainty of the peak area, was extracted. This probability is near zero below an uncertainty of 30% and rises to 90% at uncertainties exceeding 50%.

  10. Uncertainty in measurement: a review of Monte Carlo simulation using Microsoft Excel for the calculation of uncertainties through functional relationships, including uncertainties in empirically derived constants.

    PubMed

    Farrance, Ian; Frenkel, Robert

    2014-02-01

    The Guide to the Expression of Uncertainty in Measurement (usually referred to as the GUM) provides the basic framework for evaluating uncertainty in measurement. The GUM however does not always provide clearly identifiable procedures suitable for medical laboratory applications, particularly when internal quality control (IQC) is used to derive most of the uncertainty estimates. The GUM modelling approach requires advanced mathematical skills for many of its procedures, but Monte Carlo simulation (MCS) can be used as an alternative for many medical laboratory applications. In particular, calculations for determining how uncertainties in the input quantities to a functional relationship propagate through to the output can be accomplished using a readily available spreadsheet such as Microsoft Excel. The MCS procedure uses algorithmically generated pseudo-random numbers which are then forced to follow a prescribed probability distribution. When IQC data provide the uncertainty estimates, the normal (Gaussian) distribution is generally considered appropriate, but MCS is by no means restricted to this particular case. With input variations simulated by random numbers, the functional relationship then provides the corresponding variations in the output in a manner which also provides its probability distribution. The MCS procedure thus provides output uncertainty estimates without the need for the differential equations associated with GUM modelling. The aim of this article is to demonstrate the ease with which Microsoft Excel (or a similar spreadsheet) can be used to provide an uncertainty estimate for measurands derived through a functional relationship. In addition, we also consider the relatively common situation where an empirically derived formula includes one or more 'constants', each of which has an empirically derived numerical value.
Such empirically derived 'constants' must also have associated uncertainties which propagate through the functional relationship
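
    A spreadsheet-free sketch of the same MCS procedure, propagating input variations (including an empirically derived 'constant' that carries its own uncertainty) through a functional relationship; the relationship, means, and standard deviations below are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Measurand through a hypothetical functional relationship y = a*x1/x2,
# where a is an empirically derived 'constant' with its own uncertainty
# and x1, x2 are input quantities with IQC-derived standard deviations.
a  = rng.normal(1.23, 0.02, N)   # 'constant' with uncertainty
x1 = rng.normal(50.0, 2.0, N)    # input quantity 1
x2 = rng.normal(10.0, 0.5, N)    # input quantity 2

y = a * x1 / x2
print(f"y = {y.mean():.2f}, u(y) = {y.std(ddof=1):.2f}, "
      f"95% interval: {np.percentile(y, 2.5):.2f} to {np.percentile(y, 97.5):.2f}")
```

    The empirical distribution of `y` directly yields the standard uncertainty and coverage interval, with no partial derivatives required.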

  11. TU-F-CAMPUS-J-04: Setup Uncertainties in the Mediastinum Area for IMRT Treatment of Lymphoma Patients

    SciTech Connect

    Aristophanous, M; Court, L

    2015-06-15

    Purpose: Despite daily image guidance, setup uncertainties can be high when treating large areas of the body. The aim of this study was to measure local uncertainties inside the PTV for patients receiving IMRT to the mediastinum region. Methods: Eleven lymphoma patients who received radiotherapy (breath-hold) to the mediastinum were included in this study. The treated region could range all the way from the neck to the diaphragm. Each patient had a CT scan with a CT-on-rails system prior to every treatment. The entire PTV region was matched to the planning CT using automatic rigid registration. The PTV was then split into 5 regions: neck, supraclavicular, superior mediastinum, upper heart, lower heart. Additional auto-registrations for each of the 5 local PTV regions were performed. The residual local setup errors were calculated as the difference between the final global PTV position and the individual final local PTV positions for the AP, SI and RL directions. For each patient 4 CT scans were analyzed (1 per week of treatment). Results: The residual mean group error (M) and standard deviation of the inter-patient (or systematic) error (Σ) were lowest in the RL direction of the superior mediastinum (0.0mm and 0.5mm) and highest in the RL direction of the lower heart (3.5mm and 2.9mm). The standard deviation of the inter-fraction (or random) error (σ) was lowest in the RL direction of the superior mediastinum (0.5mm) and highest in the SI direction of the lower heart (3.9mm). The directionality of local uncertainties is important; a superior residual error in the lower heart, for example, keeps it in the global PTV. Conclusion: There is a complex relationship between breath-holding and positioning uncertainties that needs further investigation. Residual setup uncertainties can be significant even under daily CT image guidance when treating large regions of the body.
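
    The reported statistics M, Σ, and σ follow the standard decomposition of setup errors into a group-mean, an inter-patient (systematic), and an inter-fraction (random) component, which can be sketched on synthetic residuals (11 patients, 4 weekly scans; all numbers are illustrative, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic residual setup errors (mm) for one direction:
# each patient has a persistent offset (systematic) plus
# scan-to-scan scatter (random).
patient_offsets = rng.normal(0.0, 2.0, 11)                    # 11 patients
errors = patient_offsets[:, None] + rng.normal(0.0, 1.5, (11, 4))  # 4 scans each

patient_means = errors.mean(axis=1)
M = patient_means.mean()                      # group mean error
Sigma = patient_means.std(ddof=1)             # inter-patient (systematic) SD
sigma = np.sqrt((errors.std(axis=1, ddof=1) ** 2).mean())  # random SD (RMS)

print(f"M = {M:.1f} mm, Sigma = {Sigma:.1f} mm, sigma = {sigma:.1f} mm")
```

    Applying this per region and per direction reproduces the kind of table summarized in the Results.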

  12. Estimations of uncertainties of frequencies

    NASA Astrophysics Data System (ADS)

    Eyer, Laurent; Nicoletti, Jean-Marc; Morgenthaler, Stephan

    2016-10-01

    Diverse variable phenomena in the Universe are periodic. Astonishingly, many of the periodic signals present in stars have timescales coinciding with human ones (from minutes to years). The periods of signals often have to be deduced from time series which are irregularly sampled and sparse; furthermore, correlations between the brightness measurements and their estimated uncertainties are common. The uncertainty on the frequency estimation is reviewed. We explore the astronomical and statistical literature, in both cases of regular and irregular samplings. The frequency uncertainty depends on the signal-to-noise ratio, the frequency, and the observational timespan. The shape of the light curve should also intervene, since sharp features such as exoplanet transits, stellar eclipses, and the rising branches of pulsating stars give stringent constraints. We propose several procedures (parametric and nonparametric) to estimate the uncertainty on the frequency, which are subsequently tested against simulated data to assess their performance.
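
    One nonparametric route mentioned above, estimating the frequency uncertainty empirically from repeated simulated observations, can be sketched as follows (the sampling pattern, noise level, and the simple periodogram-style statistic are illustrative assumptions, not the paper's procedures):

```python
import numpy as np

rng = np.random.default_rng(5)

# Irregularly sampled noisy sinusoid; illustrative parameters.
f_true, amp, noise = 0.271, 1.0, 0.3
t = np.sort(rng.uniform(0.0, 50.0, 120))     # irregular time sampling
grid = np.linspace(0.2, 0.35, 3001)          # trial frequencies

# Precompute the sinusoid basis at every trial frequency.
phase = 2 * np.pi * np.outer(grid, t)
C, S = np.cos(phase), np.sin(phase)

def best_freq(y):
    # frequency maximizing a simple periodogram-style power statistic
    return grid[np.argmax((C @ y) ** 2 + (S @ y) ** 2)]

# Repeat the simulated observation; the scatter of the estimates is the
# empirical frequency uncertainty.
est = [best_freq(amp * np.sin(2 * np.pi * f_true * t)
                 + rng.normal(0.0, noise, t.size)) for _ in range(200)]
sigma_f = float(np.std(est))
print(f"empirical frequency uncertainty: {sigma_f:.5f} cycles per unit time")
```

    The scatter shrinks with longer timespan and higher signal-to-noise ratio, matching the dependencies the abstract lists.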

  13. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections.

  14. Uncertainty assessment tool for climate change impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Jacob, Daniela; Rechid, Diana; Lückenkötter, Johannes; Juckes, Martin

    2015-04-01

    A major difficulty in the study of climate change impact indicators is dealing with the numerous sources of uncertainty in climate and non-climate data. Their assessment, however, is needed to communicate to users the degree of certainty of climate change impact indicators. This communication of uncertainty is an important component of the FP7 project "Climate Information Portal for Copernicus" (CLIPC). CLIPC is developing a portal to provide a central point of access for authoritative scientific information on climate change. In this project the Climate Service Center 2.0 is in charge of the development of a tool to assess the uncertainty of climate change impact indicators. The calculation of climate change impact indicators will include climate data from satellite and in-situ observations, climate models and re-analyses, and non-climate data. There is a lack of a systematic classification of uncertainties arising from the whole range of climate change impact indicators. We develop a framework that intends to clarify the potential sources of uncertainty of a given indicator and provides - where possible - solutions for how to quantify the uncertainties. To structure the sources of uncertainty of climate change impact indicators, we first classify uncertainties along a 'cascade of uncertainty' (Reyer 2013). Our cascade consists of three levels which correspond to the CLIPC meta-classification of impact indicators: Tier-1 indicators are intended to give information on the climate system. Tier-2 indicators attempt to quantify the impacts of climate change on biophysical systems (e.g., flood risks). Tier-3 indicators primarily aim at providing information on the socio-economic systems affected by climate change. At each level, the potential sources of uncertainty of the input data sets and their processing will be discussed. Reference: Reyer, C. 
(2013): The cascade of uncertainty in modeling forest ecosystem responses to environmental change and the challenge of sustainable

  15. Dynamical Realism and Uncertainty Propagation

    NASA Astrophysics Data System (ADS)

    Park, Inkwan

    In recent years, Space Situational Awareness (SSA) has become increasingly important as the number of tracked Resident Space Objects (RSOs) continues to grow. One of the most significant technical discussions in SSA is how to propagate state uncertainty in a manner consistent with the highly nonlinear dynamical environment. To keep pace with this situation, various methods have been proposed to propagate uncertainty accurately by capturing the nonlinearity of the dynamical system. All of these methods focus on describing the dynamical system as precisely as possible from a mathematical perspective. This study proposes a new perspective based on understanding the dynamics of the evolution of uncertainty itself. We expect that profound insight into the dynamical system could open the possibility of developing a new method for accurate uncertainty propagation. These considerations lead naturally to the goals of the study. First, we investigate the most dominant factors in the evolution of uncertainty in order to realize the dynamical system more rigorously. Second, building on that investigation, we aim to develop a new method that propagates orbit uncertainty efficiently while maintaining accuracy. We eliminate the short-period variations from the dynamical system, yielding a simplified dynamical system (SDS), to investigate the most dominant factors. To achieve this, the Lie transformation method is introduced, since this transformation can define the solutions for each variation separately. From the first investigation, we conclude that the secular variations, including the long-period variations, are dominant for the propagation of uncertainty, i.e., short-period variations are negligible. We then develop the new method by combining the SDS with a higher-order nonlinear expansion method, namely state transition tensors (STTs). The new method retains the advantages of the SDS and the STTs and propagates
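    At first order, the state transition tensors reduce to the familiar state transition matrix, under which a Gaussian covariance propagates as P(t) = Φ P₀ Φᵀ. A minimal sketch for a one-dimensional harmonic oscillator (a simple stand-in for the real orbital dynamics; all numbers are illustrative):

```python
import math

def matmul(A, B):
    """Multiply two small matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(col) for col in zip(*A)]

def stm_oscillator(omega, t):
    """Analytic state transition matrix for x'' = -omega^2 x over time t."""
    c, s = math.cos(omega * t), math.sin(omega * t)
    return [[c, s / omega], [-omega * s, c]]

omega = 2.0
P0 = [[0.01, 0.0], [0.0, 0.16]]                     # initial covariance in (x, v)
Phi = stm_oscillator(omega, math.pi / (2 * omega))  # propagate a quarter period
P1 = matmul(matmul(Phi, P0), transpose(Phi))        # first-order covariance propagation
print(P1)
```

    A quarter period swaps the roles of position and velocity, so the propagated variances become σᵥ²/ω² and ω²σₓ²; the higher-order STT terms correct this linear picture when the dynamics are strongly nonlinear.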

  16. Thermodynamic and relativistic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Artamonov, A. A.; Plotnikov, E. M.

    2017-01-01

    Thermodynamic uncertainty relation (UR) was verified experimentally. The experiments have shown the validity of the quantum analogue of the zeroth law of stochastic thermodynamics in the form of the saturated Schrödinger UR. We have also proposed a new type of UR for the relativistic mechanics. These relations allow us to consider macroscopic phenomena within the limits of the ratio of the uncertainty relations for different physical quantities.

  17. Communication in Creative Collaborations: The Challenges of Uncertainty and Desire Related to Task, Identity, and Relational Goals

    ERIC Educational Resources Information Center

    Jordan, Michelle E.; Babrow, Austin S.

    2013-01-01

    This study offers a systematic analysis of uncertainty in communication education by examining communication goals and challenges in the context of collaborative creative problem-solving in engineering assignments. Engineering design projects are seen as having the potential to help K-12 students learn to deal with uncertainty as well as a means…

  18. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  19. Pharmacological Fingerprints of Contextual Uncertainty

    PubMed Central

    Ruge, Diane; Stephan, Klaas E.

    2016-01-01

    Successful interaction with the environment requires flexible updating of our beliefs about the world. By estimating the likelihood of future events, it is possible to prepare appropriate actions in advance and execute fast, accurate motor responses. According to theoretical proposals, agents track the variability arising from changing environments by computing various forms of uncertainty. Several neuromodulators have been linked to uncertainty signalling, but comprehensive empirical characterisation of their relative contributions to perceptual belief updating, and to the selection of motor responses, is lacking. Here we assess the roles of noradrenaline, acetylcholine, and dopamine within a single, unified computational framework of uncertainty. Using pharmacological interventions in a sample of 128 healthy human volunteers and a hierarchical Bayesian learning model, we characterise the influences of noradrenergic, cholinergic, and dopaminergic receptor antagonism on individual computations of uncertainty during a probabilistic serial reaction time task. We propose that noradrenaline influences learning of uncertain events arising from unexpected changes in the environment. In contrast, acetylcholine balances attribution of uncertainty to chance fluctuations within an environmental context, defined by a stable set of probabilistic associations, or to gross environmental violations following a contextual switch. Dopamine supports the use of uncertainty representations to engender fast, adaptive responses. PMID:27846219

  20. Uncertainty of empirical correlation equations

    NASA Astrophysics Data System (ADS)

    Feistel, R.; Lovell-Smith, J. W.; Saunders, P.; Seitz, S.

    2016-08-01

    The International Association for the Properties of Water and Steam (IAPWS) has published a set of empirical reference equations of state, forming the basis of the 2010 Thermodynamic Equation of Seawater (TEOS-10), from which all thermodynamic properties of seawater, ice, and humid air can be derived in a thermodynamically consistent manner. For each of the equations of state, the parameters have been found by simultaneously fitting equations for a range of different derived quantities using large sets of measurements of these quantities. In some cases, uncertainties in these fitted equations have been assigned based on the uncertainties of the measurement results. However, because uncertainties in the parameter values have not been determined, it is not possible to estimate the uncertainty in many of the useful quantities that can be calculated using the parameters. In this paper we demonstrate how the method of generalised least squares (GLS), in which the covariance of the input data is propagated into the values calculated by the fitted equation, and in particular into the covariance matrix of the fitted parameters, can be applied to one of the TEOS-10 equations of state, namely IAPWS-95 for fluid pure water. Using the calculated parameter covariance matrix, we provide some preliminary estimates of the uncertainties in derived quantities, namely the second and third virial coefficients for water. We recommend further investigation of the GLS method for use as a standard method for calculating and propagating the uncertainties of values computed from empirical equations.
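    As an illustration of the approach (not the IAPWS-95 fit itself), here is a weighted least-squares straight-line fit in which the inverse of the normal matrix gives the parameter covariance, which is then propagated into the uncertainty of a derived quantity:

```python
import math

def wls_line_fit(x, y, sigma):
    """Fit y = a + b*x with per-point uncertainties sigma.

    Returns (a, b) and the 2x2 parameter covariance matrix, obtained as the
    inverse of the normal matrix X^T W X (W = diagonal inverse variances).
    """
    w = [1.0 / s ** 2 for s in sigma]
    S = sum(w)
    Sx = sum(wi * xi for wi, xi in zip(w, x))
    Sy = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    det = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / det
    b = (S * Sxy - Sx * Sy) / det
    cov = [[Sxx / det, -Sx / det], [-Sx / det, S / det]]
    return a, b, cov

x = [0.0, 1.0, 2.0]
y = [1.0, 3.0, 5.0]
sigma = [1.0, 1.0, 1.0]
a, b, cov = wls_line_fit(x, y, sigma)

# propagate the parameter covariance into a derived quantity y(3) = a + 3b:
# var = J C J^T with Jacobian J = [1, 3]
var_y3 = cov[0][0] + 2 * 3 * cov[0][1] + 9 * cov[1][1]
print(a, b, math.sqrt(var_y3))
```

    The key point made in the paper is the off-diagonal covariance term: ignoring it (as is common when only per-parameter uncertainties are published) would give the wrong uncertainty for any quantity derived from more than one parameter.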

  1. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  2. Uncertainty Quantification of Calculated Temperatures for the AGR-1 Experiment

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson; Grant L. Hawkes

    2012-04-01

    This report documents an effort to quantify the uncertainty of the calculated temperature data for the first Advanced Gas Reactor (AGR-1) fuel irradiation experiment conducted in the INL's Advanced Test Reactor (ATR) in support of the Next Generation Nuclear Plant (NGNP) R&D program. Recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, the results of the numerical simulations can be used in combination with the statistical analysis methods to improve qualification of measured data. Additionally, the temperature simulation data for AGR tests can be used for validation of the fuel transport and fuel performance simulation models. The crucial roles of the calculated fuel temperatures in ensuring achievement of the AGR experimental program objectives require accurate determination of the model temperature uncertainties. The report is organized into three chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program and provides overviews of AGR-1 measured data, AGR-1 test configuration and test procedure, and thermal simulation. Chapter 2 describes the uncertainty quantification procedure for temperature simulation data of the AGR-1 experiment, namely, (i) identify and quantify uncertainty sources; (ii) perform sensitivity analysis for several thermal test conditions; (iii) use uncertainty propagation to quantify overall response temperature uncertainty. A set of issues associated with modeling uncertainties resulting from the expert assessments is identified. This also includes the experimental design to estimate the main effects and interactions of the important thermal model parameters. Chapter 3 presents the overall uncertainty results for the six AGR-1 capsules. This includes uncertainties for the daily volume-average and peak fuel temperatures, daily average temperatures at TC locations, and time-average volume-average and time-average peak fuel temperatures.
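    The three-step procedure described here — identify uncertainty sources, compute sensitivities, propagate — can be sketched with a toy thermal model (the model and all figures below are hypothetical illustrations, not the AGR-1 physics model):

```python
import math

def fuel_temp(params):
    """Toy steady-state thermal model: coolant temperature plus a
    conductive rise proportional to power over gap conductance."""
    return params["coolant_T"] + params["power"] / params["gap_conductance"]

nominal = {"coolant_T": 300.0, "power": 10.0, "gap_conductance": 2.0}
sigmas = {"coolant_T": 2.0, "power": 0.4, "gap_conductance": 0.1}

# (ii) sensitivity analysis: central finite differences about the nominal point
sensitivities = {}
for name, value in nominal.items():
    h = 1e-6 * max(abs(value), 1.0)
    hi = dict(nominal, **{name: value + h})
    lo = dict(nominal, **{name: value - h})
    sensitivities[name] = (fuel_temp(hi) - fuel_temp(lo)) / (2 * h)

# (iii) first-order uncertainty propagation (independent inputs assumed)
var_T = sum((sensitivities[n] * sigmas[n]) ** 2 for n in nominal)
print(fuel_temp(nominal), math.sqrt(var_T))
```

    In the real analysis the model is a full thermal simulation and the sensitivities come from designed computer experiments, but the propagation step has the same first-order structure.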

  3. Uncertainty Quantification of Calculated Temperatures for the AGR-1 Experiment

    SciTech Connect

    Binh T. Pham; Jeffrey J. Einerson; Grant L. Hawkes

    2013-03-01

    This report documents an effort to quantify the uncertainty of the calculated temperature data for the first Advanced Gas Reactor (AGR-1) fuel irradiation experiment conducted in the INL’s Advanced Test Reactor (ATR) in support of the Next Generation Nuclear Plant (NGNP) R&D program. Recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, the results of the numerical simulations can be used in combination with the statistical analysis methods to improve qualification of measured data. Additionally, the temperature simulation data for AGR tests can be used for validation of the fuel transport and fuel performance simulation models. The crucial roles of the calculated fuel temperatures in ensuring achievement of the AGR experimental program objectives require accurate determination of the model temperature uncertainties. The report is organized into three chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program and provides overviews of AGR-1 measured data, AGR-1 test configuration and test procedure, and thermal simulation. Chapter 2 describes the uncertainty quantification procedure for temperature simulation data of the AGR-1 experiment, namely, (i) identify and quantify uncertainty sources; (ii) perform sensitivity analysis for several thermal test conditions; (iii) use uncertainty propagation to quantify overall response temperature uncertainty. A set of issues associated with modeling uncertainties resulting from the expert assessments is identified. This also includes the experimental design to estimate the main effects and interactions of the important thermal model parameters. Chapter 3 presents the overall uncertainty results for the six AGR-1 capsules. This includes uncertainties for the daily volume-average and peak fuel temperatures, daily average temperatures at TC locations, and time-average volume-average and time-average peak fuel temperatures.

  4. Visual Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis (Milestone Report)

    SciTech Connect

    Gray, A.; Lewandowski, A.; Wendelin, T.

    2010-10-01

    In 1997, an uncertainty analysis was conducted of the Video Scanning Hartmann Optical Tester (VSHOT). In 2010, we have completed a new analysis, based primarily on the geometric optics of the system, and it shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough mirror panel test. These help to guide the operator in proper setup, and help end-users to understand the data they are provided. We include both the systematic (bias) and random (precision) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors we considered in this study are: target tilt; target face to laser output distance; instrument vertical offset; laser output angle; distance between the tool and the test piece; camera calibration; and laser scanner. These contributing factors were applied to the calculated slope error, focal length, and test article tilt that are generated by the VSHOT data processing. Results show the estimated 2-sigma uncertainty in slope error for a parabolic trough line scan test to be +/-0.2 milliradians; uncertainty in the focal length is +/- 0.1 mm, and the uncertainty in test article tilt is +/- 0.04 milliradians.
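    Combining independent systematic (bias) and random (precision) contributions into an expanded uncertainty is conventionally done by root-sum-square; a sketch with hypothetical slope-error contributions (the actual VSHOT budget values are in the report):

```python
import math

# hypothetical slope-error contributions, milliradians (1-sigma)
systematic = {"target tilt": 0.06, "laser output angle": 0.05, "vertical offset": 0.03}
random_err = {"camera calibration": 0.04, "laser scanner": 0.02}

# root-sum-square combination of independent contributions
u_combined = math.sqrt(sum(v ** 2 for v in systematic.values())
                       + sum(v ** 2 for v in random_err.values()))
U95 = 2.0 * u_combined  # expanded uncertainty, coverage factor k = 2
print(round(U95, 3))
```

    Tabulating each contributor alongside its squared share of `u_combined` is also a quick way to see which setup parameters dominate the budget and deserve the operator's attention.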

  5. Modelling ecosystem service flows under uncertainty with stochastic SPAN

    USGS Publications Warehouse

    Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.

    2012-01-01

    Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.

  6. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    SciTech Connect

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice of one of these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
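    The propagation scheme — draw from each model's empirical residual distribution and push the draws through the model chain — can be illustrated with hypothetical residual samples (the real analysis uses measured residuals for each modelling step):

```python
import random
import statistics

random.seed(3)

# hypothetical fractional residuals for three modelling steps
poa_residuals = [-0.03, -0.01, 0.0, 0.01, 0.02, 0.04]   # POA irradiance model
eff_residuals = [-0.02, 0.0, 0.01, 0.02]                # effective irradiance model
pwr_residuals = [-0.01, 0.0, 0.01]                      # DC power model

nominal_energy = 5.0  # kWh, nominal daily DC energy
draws = []
for _ in range(20000):
    e = nominal_energy
    for residuals in (poa_residuals, eff_residuals, pwr_residuals):
        e *= 1.0 + random.choice(residuals)  # resample one residual per step
    draws.append(e)

mean_e = statistics.mean(draws)
cv = statistics.stdev(draws) / mean_e  # relative uncertainty in daily energy
print(mean_e, cv)
```

    A nonzero mean in one step's residuals shows up directly as a bias in the output distribution, mirroring the paper's observation about the POA irradiance model.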

  7. Systematic Alternatives to Proposal Preparation.

    ERIC Educational Resources Information Center

    Knirk, Frederick G.; And Others

    Educators who have to develop proposals must be concerned with making effective decisions. This paper discusses a number of educational systems management tools which can be used to reduce the time and effort in developing a proposal. In addition, ways are introduced to systematically increase the quality of the proposal through the development of…

  8. Assessing Uncertainty in Subsurface Transport Predictions Using the ASCEM Toolset

    NASA Astrophysics Data System (ADS)

    Freedman, V.; Chen, X.; Keating, E. H.; Higdon, D. M.; Rockhold, M. L.; Schuchardt, K. L.; Finsterle, S.; Gorton, I.; Freshley, M.

    2011-12-01

    Transport simulation of nonreactive solutes can be used to identify potential pathways of contaminants in the vadose zone and the effectiveness of site remediation technologies. At the BC Cribs site at Hanford in southeastern Washington State, innovative remedial technologies are being explored to address recalcitrant contamination in the deep (~100 m) vadose zone. To identify the effectiveness of the technologies, the impacts of a "no-action" alternative must also be explored. Because only sparse information is available for the geologic conceptual model and the physical and chemical properties of the sediments, there is considerable uncertainty in subsurface transport predictions. In this contribution, the uncertainty of the technetium-99 mass flux to the water table due to parameter uncertainty and variations in the conceptual model is investigated using a newly developed toolset for performing an uncertainty quantification (UQ) analysis. This toolset is part of ASCEM (Advanced Simulation Capability for Environmental Management), a state-of-the-art scientific tool and approach for understanding and predicting contaminant fate and transport in natural and engineered systems. Using the Akuna user environment currently under development, the uncertainty in technetium-99 transport through a two-dimensional, heterogeneous vadose-zone system is quantified with Monte Carlo simulation. Results show that the uncertainty in simulated mass fluxes arising from hydraulic properties can be significant within a single conceptual model, and that significant additional uncertainty can be introduced by conceptual model variation.

  9. Cost uncertainty for different levels of technology maturity

    SciTech Connect

    DeMuth, S.F.; Franklin, A.L.

    1996-08-07

    It is difficult at best to apply a single methodology for estimating cost uncertainties related to technologies of differing maturity. While highly mature technologies may have significant performance and manufacturing cost data available, less well developed technologies may be defined in only conceptual terms. Regardless of the degree of technical maturity, often a cost estimate relating to application of the technology may be required to justify continued funding for development. Yet, a cost estimate without its associated uncertainty lacks the information required to assess the economic risk. For this reason, it is important for the developer to provide some type of uncertainty along with a cost estimate. This study demonstrates how different methodologies for estimating uncertainties can be applied to cost estimates for technologies of different maturities. For a less well developed technology an uncertainty analysis of the cost estimate can be based on a sensitivity analysis; whereas, an uncertainty analysis of the cost estimate for a well developed technology can be based on an error propagation technique from classical statistics. It was decided to demonstrate these uncertainty estimation techniques with (1) an investigation of the additional cost of remediation due to beyond baseline, nearly complete, waste heel retrieval from underground storage tanks (USTs) at Hanford; and (2) the cost related to the use of crystalline silico-titanate (CST) rather than the baseline CS100 ion exchange resin for cesium separation from UST waste at Hanford.
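    The two approaches can be contrasted in a few lines: classical first-order error propagation when component cost uncertainties are known, versus a one-at-a-time sensitivity sweep when only rough bounds can be assumed (all cost figures below are hypothetical):

```python
import math

# mature technology: component costs with known 1-sigma uncertainties ($k)
components = {"labor": (120.0, 10.0), "materials": (80.0, 5.0), "disposal": (40.0, 8.0)}
total = sum(mean for mean, _ in components.values())
total_sd = math.sqrt(sum(sd ** 2 for _, sd in components.values()))  # RSS propagation

# immature technology: no sigmas available, so sweep each component +/-30 %
swing = 0.30
lows = [total - swing * mean for mean, _ in components.values()]
highs = [total + swing * mean for mean, _ in components.values()]
cost_range = (min(lows), max(highs))

print(total, total_sd, cost_range)
```

    The RSS result is a defensible standard deviation; the sensitivity sweep yields only a plausible range, which is often all a conceptual-stage estimate can honestly support.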

  10. The Role of Uncertainty, Awareness, and Trust in Visual Analytics.

    PubMed

    Sacha, Dominik; Senaratne, Hansi; Kwon, Bum Chul; Ellis, Geoffrey; Keim, Daniel A

    2016-01-01

    Visual analytics supports humans in generating knowledge from large and often complex datasets. Evidence is collected, collated and cross-linked with our existing knowledge. In the process, a myriad of analytical and visualisation techniques are employed to generate a visual representation of the data. These often introduce their own uncertainties, in addition to the ones inherent in the data, and these propagated and compounded uncertainties can result in impaired decision making. The user's confidence or trust in the results depends on the extent of the user's awareness of the underlying uncertainties generated on the system side. This paper unpacks the uncertainties that propagate through visual analytics systems, illustrates how humans' perceptual and cognitive biases influence the user's awareness of such uncertainties, and how this affects the user's trust building. The knowledge generation model for visual analytics is used to provide a terminology and framework to discuss the consequences of these aspects in knowledge construction and, through examples, machine uncertainty is compared to human trust measures with provenance. Furthermore, guidelines for the design of uncertainty-aware systems are presented that can aid the user in better decision making.

  11. Quantifying uncertainty in stable isotope mixing models

    SciTech Connect

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O) but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors where solutions included sources that were not contributing to the sample. In Phase III, some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, which substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the

  12. Quantifying uncertainty in stable isotope mixing models

    DOE PAGES

    Davis, Paul; Syme, James; Heikoop, Jeffrey; ...

    2015-05-19

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: SIAR [Parnell et al., 2010], a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O) but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors where solutions included sources that were not contributing to the sample. In Phase III, some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, which substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the uncertainty in calculated

  13. Quantifying uncertainty in stable isotope mixing models

    NASA Astrophysics Data System (ADS)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O) but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors where solutions included sources that were not contributing to the sample. In Phase III, some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, which substantially reduced mixing fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition as well as demonstrating the value of additional information in reducing the uncertainty in calculated
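    A pure Monte Carlo (PMC) approach of the kind compared in these studies can be sketched in a few lines: draw random mixing fractions on the simplex and keep those that reproduce the sample's isotope values within a tolerance (the sources, sample, and tolerance below are hypothetical, with three sources rather than six):

```python
import random
import statistics

random.seed(11)

# hypothetical (d15N, d18O) source signatures, per mil
sources = [(0.0, 0.0), (10.0, 2.0), (4.0, 12.0)]
sample = (5.0, 4.0)
tol = 0.3

def random_fractions(k):
    """Uniform draw from the k-simplex (normalised exponentials)."""
    e = [random.expovariate(1.0) for _ in range(k)]
    s = sum(e)
    return [x / s for x in e]

accepted = []
for _ in range(200000):
    f = random_fractions(len(sources))
    d15 = sum(fi * src[0] for fi, src in zip(f, sources))
    d18 = sum(fi * src[1] for fi, src in zip(f, sources))
    if abs(d15 - sample[0]) < tol and abs(d18 - sample[1]) < tol:
        accepted.append(f)

# posterior-like summaries of the mixing fractions
means = [statistics.mean(f[i] for f in accepted) for i in range(len(sources))]
print(len(accepted), means)
```

    With more sources than tracers the accepted set becomes a broad region rather than a tight cluster, which is the uncertainty inflation (and the PMC failure mode) that motivates approaches like SIRS.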

  14. Reducing Uncertainties in Neutron-Induced Fission Cross Sections Using a Time Projection Chamber

    NASA Astrophysics Data System (ADS)

    Manning, Brett; Niffte Collaboration

    2015-10-01

    Neutron-induced fission cross sections for actinides have long been of great interest for nuclear energy and stockpile stewardship. Traditionally, measurements were performed using fission chambers which provided limited information about the detected fission events. For the case of 239Pu(n,f), sensitivity studies have shown a need for more precise measurements. Recently the Neutron Induced Fission Fragment Tracking Experiment (NIFFTE) has developed the fission Time Projection Chamber (fissionTPC) to measure fission cross sections to better than 1% uncertainty by providing 3D tracking of fission fragments. The fissionTPC collected data to calculate the 239Pu(n,f) cross section at the Weapons Neutron Research facility at the Los Alamos Neutron Science Center during the 2014 run cycle. Preliminary analysis has been focused on studying particle identification and target and beam non-uniformities to reduce the uncertainty on the cross section. Additionally, the collaboration is investigating other systematic errors that could not be well studied with a traditional fission chamber. LA-UR-15-24906.

  15. Risk-cost-benefit analysis for transportation corridors with interval uncertainties of heterogeneous data.

    PubMed

    Xu, Junrui; Lambert, James H

    2015-04-01

    Access management, which systematically limits opportunities for egress and ingress of vehicles to highway lanes, is critical to protect trillions of dollars of current investment in transportation. This article addresses allocating resources for access management with incomplete and partially relevant data on crash rates, travel speeds, and other factors. While access management can be effective to avoid crashes, reduce travel times, and increase route capacities, the literature suggests a need for performance metrics to guide investments in resource allocation across large corridor networks and several time horizons. In this article, we describe a quantitative decision model to support an access management program via risk-cost-benefit analysis under data uncertainties from diverse sources of data and expertise. The approach quantifies potential benefits, including safety improvement and travel time savings, and costs of access management through functional relationships of input parameters including crash rates, corridor access point densities, and traffic volumes. Parameter uncertainties, which vary across locales and experts, are addressed via numerical interval analyses. This approach is demonstrated at several geographic scales across 7,000 kilometers of highways in a geographic region and several subregions. The demonstration prioritizes route segments that would benefit from risk management, including (i) additional data or elicitation, (ii) right-of-way purchases, (iii) restriction or closing of access points, (iv) new alignments, (v) developer proffers, and (vi) other interventions. The approach should be of wide interest to analysts, planners, policymakers, and stakeholders who rely on heterogeneous data and expertise for risk management.
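    The numerical interval analyses mentioned above can be illustrated with minimal interval arithmetic; the helper names, parameter ranges, and the benefit-cost structure below are invented for the sketch and are not the article's data:

    ```python
    # Minimal interval arithmetic: each uncertain parameter is a (low, high)
    # pair, and operations propagate the bounds.
    def i_add(a, b):
        return (a[0] + b[0], a[1] + b[1])

    def i_mul(a, b):
        p = [x * y for x in a for y in b]
        return (min(p), max(p))

    def i_div(a, b):
        assert b[0] > 0, "divisor interval must exclude zero"
        p = [x / y for x in a for y in b]
        return (min(p), max(p))

    crash_reduction = (5, 9)      # crashes avoided per year (illustrative)
    cost_per_crash = (0.2, 0.5)   # $M per crash
    time_savings = (0.5, 1.5)     # $M per year
    cost = (1.0, 2.0)             # $M program cost

    benefit = i_add(i_mul(crash_reduction, cost_per_crash), time_savings)
    ratio = i_div(benefit, cost)
    print(f"benefit-cost ratio in [{ratio[0]:.2f}, {ratio[1]:.2f}]")
    # -> benefit-cost ratio in [0.75, 6.00]
    ```

    Segments whose entire ratio interval exceeds 1 are robustly worthwhile; a wide interval straddling 1 is exactly where the article's option (i), additional data or elicitation, pays off.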

  16. Propagation of uncertainties in soil and pesticide properties to pesticide leaching.

    PubMed

    van den Berg, F; Tiktak, A; Heuvelink, G B M; Burgers, S L G E; Brus, D J; de Vries, F; Stolte, J; Kroes, J G

    2012-01-01

    In the new Dutch decision tree for the evaluation of pesticide leaching to groundwater, spatially distributed soil data are used by the GeoPEARL model to calculate the 90th percentile of the spatial cumulative distribution function of the leaching concentration in the area of potential usage (SP90). Until now it was not known to what extent uncertainties in soil and pesticide properties propagate to spatially aggregated parameters like the SP90. A study was performed to quantify the uncertainties in soil and pesticide properties and to analyze their contribution to the uncertainty in SP90. First, uncertainties in the soil and pesticide properties were quantified. Next, a regular grid sample of points covering the whole of the agricultural area in the Netherlands was randomly selected. At the grid nodes, realizations from the probability distributions of the uncertain inputs were generated and used as input to a Monte Carlo uncertainty propagation analysis. The analysis showed that the uncertainty concerning the SP90 is 10 times smaller than the uncertainty about the leaching concentration at individual point locations. The parameters that contribute most to the uncertainty about the SP90 are, however, the same as the parameters that contribute most to uncertainty about the leaching concentration at individual point locations (e.g., the transformation half-life in soil and the coefficient of sorption on organic matter). Taking uncertainties in soil and pesticide properties into account further leads to a systematic increase of the predicted SP90. The important implication for pesticide regulation is that the leaching concentration is systematically underestimated when these uncertainties are ignored.
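    A toy Monte Carlo propagation, assuming a made-up point-scale leaching response and independent parameter draws across nodes, shows why a spatially aggregated percentile such as SP90 is far less uncertain than the concentration at a single location:

    ```python
    import random
    import statistics

    random.seed(1)
    N_POINTS, N_MC = 500, 200

    def leaching(halflife):
        # toy point-scale response: longer half-life, more leaching (not GeoPEARL)
        return max(0.0, 0.05 * halflife - 1.0)

    sp90_draws, point_draws = [], []
    for _ in range(N_MC):
        # one realization: draw the uncertain half-life independently at each node
        conc = [leaching(random.gauss(40.0, 10.0)) for _ in range(N_POINTS)]
        sp90_draws.append(sorted(conc)[int(0.9 * N_POINTS)])  # spatial 90th pct
        point_draws.append(conc[0])                           # one fixed location

    print("sd at one location:", round(statistics.stdev(point_draws), 3))
    print("sd of SP90:        ", round(statistics.stdev(sp90_draws), 3))
    ```

    Averaging over many locations damps the point-scale noise, so the spread of SP90 across realizations is roughly an order of magnitude smaller than the spread at any single node, consistent with the factor of 10 reported in the abstract.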

  17. Intolerance of uncertainty in emotional disorders: What uncertainties remain?

    PubMed

    Shihata, Sarah; McEvoy, Peter M; Mullan, Barbara Ann; Carleton, R Nicholas

    2016-06-01

    The current paper presents a future research agenda for intolerance of uncertainty (IU), which is a transdiagnostic risk and maintaining factor for emotional disorders. In light of the accumulating interest and promising research on IU, it is timely to emphasize the theoretical and therapeutic significance of IU, as well as to highlight what remains unknown about IU across areas such as development, assessment, behavior, threat and risk, and relationships to cognitive vulnerability factors and emotional disorders. The present paper was designed to provide a synthesis of what is known and unknown about IU, and, in doing so, proposes broad and novel directions for future research to address the remaining uncertainties in the literature.

  18. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2015-11-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from [Berta et al., Nat. Phys. 6, 659 (2010)] is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the "uncertainty witness" lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from [Coles et al., Phys. Rev. Lett. 108, 210405 (2012)] makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM Quantum Experience and find reasonable agreement between our predictions and experimental outcomes.
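    For reference, the EUR-QSI discussed above can be written, in the notation of Berta et al., as:

    ```latex
    % X and Z: outcomes of two incompatible measurements on system A;
    % B: a quantum memory possibly entangled with A;
    % c = max_{x,z} |\langle \phi_x | \psi_z \rangle|^2 quantifies the
    % incompatibility (overlap) of the two measurement bases.
    H(X|B) + H(Z|B) \;\ge\; \log_2 \frac{1}{c} + H(A|B)
    ```

    The tightening described in the abstract adds a further state-dependent term to the right-hand side, quantifying how well the measurement can be reversed.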

  19. Uncertainty Quantification for Airfoil Icing

    NASA Astrophysics Data System (ADS)

    DeGennaro, Anthony Matteo

    Ensuring the safety of airplane flight in icing conditions is an important and active arena of research in the aerospace community. Notwithstanding the research, development, and legislation aimed at certifying airplanes for safe operation, an analysis of the effects of icing uncertainties on certification quantities of interest is generally lacking. The central objective of this thesis is to examine and analyze problems in airfoil ice accretion from the standpoint of uncertainty quantification. We focus on three distinct areas: user-informed, data-driven, and computational uncertainty quantification. In the user-informed approach to uncertainty quantification, we discuss important canonical icing classifications and show how these categories can be modeled using a few shape parameters. We then investigate the statistical effects of these parameters. In the data-driven approach, we build statistical models of airfoil ice shapes from databases of actual ice shapes, and quantify the effects of these parameters. Finally, in the computational approach, we investigate the effects of uncertainty in the physics of the ice accretion process, by perturbing the input to an in-house numerical ice accretion code that we develop in this thesis.

  20. Entropic uncertainty and measurement reversibility

    NASA Astrophysics Data System (ADS)

    Berta, Mario; Wehner, Stephanie; Wilde, Mark M.

    2016-07-01

    The entropic uncertainty relation with quantum side information (EUR-QSI) from (Berta et al 2010 Nat. Phys. 6 659) is a unifying principle relating two distinctive features of quantum mechanics: quantum uncertainty due to measurement incompatibility, and entanglement. In these relations, quantum uncertainty takes the form of preparation uncertainty where one of two incompatible measurements is applied. In particular, the ‘uncertainty witness’ lower bound in the EUR-QSI is not a function of a post-measurement state. An insightful proof of the EUR-QSI from (Coles et al 2012 Phys. Rev. Lett. 108 210405) makes use of a fundamental mathematical consequence of the postulates of quantum mechanics known as the non-increase of quantum relative entropy under quantum channels. Here, we exploit this perspective to establish a tightening of the EUR-QSI which adds a new state-dependent term in the lower bound, related to how well one can reverse the action of a quantum measurement. As such, this new term is a direct function of the post-measurement state and can be thought of as quantifying how much disturbance a given measurement causes. Our result thus quantitatively unifies this feature of quantum mechanics with the others mentioned above. We have experimentally tested our theoretical predictions on the IBM quantum experience and find reasonable agreement between our predictions and experimental outcomes.

  1. Communicating spatial uncertainty to non-experts using R

    NASA Astrophysics Data System (ADS)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information for various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to communicate the uncertainty information effectively to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically, and interactively. To provide the most universal visualisation tools for non-experts, we surveyed a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R
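    The ensemble summaries such static visualisations display (mean, standard deviation, prediction intervals) can be computed from Monte Carlo output as in this sketch; the three-cell "map" and all values are invented for illustration:

    ```python
    import random
    import statistics

    random.seed(6)
    # Monte Carlo ensemble of model outputs at each cell of a toy 3-cell "map",
    # summarised the way adjacent-map visualisations need: ensemble mean,
    # standard deviation, and an empirical 90% prediction interval per cell.
    ensemble = [[random.gauss(cell_mean, 1.0) for _ in range(1000)]
                for cell_mean in (4.0, 5.0, 6.0)]

    summaries = []
    for cell in ensemble:
        cell.sort()
        lo, hi = cell[50], cell[949]        # empirical 5th and 95th percentiles
        summaries.append((statistics.fmean(cell), statistics.stdev(cell), lo, hi))

    for mean, sd, lo, hi in summaries:
        print(f"mean {mean:.2f}  sd {sd:.2f}  90% PI [{lo:.2f}, {hi:.2f}]")
    ```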

  2. Capturing the complexity of uncertainty language to maximise its use.

    NASA Astrophysics Data System (ADS)

    Juanchich, Marie; Sirota, Miroslav

    2016-04-01

    Uncertainty is often communicated verbally, using uncertainty phrases such as 'there is a small risk of earthquake', 'flooding is possible' or 'it is very likely the sea level will rise'. Prior research has examined only a limited number of properties of uncertainty phrases: mainly the probability conveyed (e.g., 'a small chance' conveys a small probability whereas 'it is likely' conveys a high probability). We propose a new analytical framework that captures more of the complexity of uncertainty phrases by studying their semantic, pragmatic and syntactic properties. Further, we argue that the complexity of uncertainty phrases is functional and can be leveraged to best describe uncertain outcomes and achieve the goals of speakers. We will present findings from a corpus study and an experiment in which we assessed the following properties of uncertainty phrases: probability conveyed, subjectivity, valence, nature of the subject, grammatical category of the uncertainty quantifier, and whether the quantifier elicits a positive or a negative framing. Natural language processing techniques applied to corpus data showed that people use a very large variety of uncertainty phrases representing different configurations of these properties (e.g., phrases that convey different levels of subjectivity, phrases with different grammatical constructions). In addition, the corpus analysis uncovered that the uncertainty phrases commonly studied in psychology are not the most commonly used in real life. In the experiment we manipulated the amount of evidence indicating that a fact was true and whether the participant was required to prove the fact was true or that it was false. Participants produced a phrase to communicate the likelihood that the fact was true (e.g., 'it is not sure…', 'I am convinced that…'). The analyses of the uncertainty phrases produced showed that participants leveraged the properties of uncertainty phrases to reflect the strength of evidence but

  3. Space Radiation Cancer Risks and Uncertainties for Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Badhwar, G. D.; Saganti, P. B.; Dicello, J. F.

    2001-01-01

    Projecting cancer risks from exposure to space radiation is highly uncertain because of the absence of data for humans and because of the limited radiobiology data available for estimating late effects from the high-energy and charge (HZE) ions present in the galactic cosmic rays (GCR). Cancer risk projections involve many biological and physical factors, each of which has a differential range of uncertainty due to the lack of data and knowledge. We discuss an uncertainty assessment within the linear-additivity model using the approach of Monte Carlo sampling from subjective error distributions that represent the lack of knowledge in each factor to quantify the overall uncertainty in risk projections. Calculations are performed using the space radiation environment and transport codes for several Mars mission scenarios. This approach leads to estimates of the uncertainties in cancer risk projections of 400-600% for a Mars mission. The uncertainties in the quality factors are dominant. Using safety standards developed for low-Earth orbit, long-term space missions (>90 days) outside the Earth's magnetic field are currently unacceptable if the confidence levels in risk projections are considered. Because GCR exposures involve multiple particle or delta-ray tracks per cellular array, our results suggest that the shape of the dose response at low dose rates may be an additional uncertainty for estimating space radiation risks.
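    The Monte Carlo approach described, sampling each factor from a subjective error distribution and propagating it through a multiplicative risk model, can be sketched as follows; the distributions and spreads below are illustrative assumptions, not the study's:

    ```python
    import random

    random.seed(2)
    # Toy linear-additivity risk model: risk ~ dose * quality factor * risk
    # coefficient, each factor drawn from a subjective error distribution.
    draws = []
    for _ in range(50_000):
        dose = random.gauss(1.0, 0.1)         # physics factor: fairly well known
        q    = random.lognormvariate(0, 0.8)  # quality factor: dominant spread
        coef = random.lognormvariate(0, 0.3)  # epidemiology coefficient
        draws.append(dose * q * coef)

    draws.sort()
    median = draws[len(draws) // 2]
    p95 = draws[int(0.95 * len(draws))]
    print(f"95th-percentile/median risk ratio ~ {p95 / median:.1f}")
    ```

    Because the factors multiply, the widest subjective distribution (here the quality factor, as in the abstract) dominates the overall fold-uncertainty of the projected risk.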

  4. Uncertainty Analysis of non-point source pollution control facilities design techniques in Korea

    NASA Astrophysics Data System (ADS)

    Lee, J.; Okjeong, L.; Gyeong, C. B.; Park, M. W.; Kim, S.

    2015-12-01

    The design of non-point source control facilities in Korea is governed largely by the stormwater capture ratio, the stormwater load capture ratio, and the pollutant reduction efficiency of the facility. The stormwater capture ratio is given by a design formula as a function of the water quality treatment capacity: the greater the capacity, the more stormwater is intercepted by the facility. The stormwater load capture ratio is defined as the ratio of the load entering the facility to the total pollutant load generated in the target catchment, and is given by a design formula expressed as a function of the stormwater capture ratio. Estimating the stormwater capture ratio and load capture ratio requires extensive quantitative analysis of the hydrologic processes involved in pollutant emission, but these formulas have been applied without any verification. Since systematic monitoring programs were insufficient, verification of these formulas was fundamentally impossible. Recently, however, the Korean Ministry of Environment has conducted a long-term systematic monitoring project, making verification of the formulas possible. In this presentation, the stormwater capture ratio and load capture ratio are re-estimated using actual TP data obtained from the long-term monitoring program at the Noksan industrial complex located in Busan, Korea. The re-estimation quantifies the uncertainty embedded in the design process that has been applied until now. In addition, the uncertainties in the stormwater capture ratio estimate and in the stormwater load capture ratio estimate are expressed separately to quantify their relative impacts on the overall design process for non-point pollutant control facilities. Finally, the SWMM-Matlab interlocking module for model parameter estimation will be introduced. Acknowledgement This subject is supported by Korea Ministry of Environment as "The Eco Innovation Project : Non

  5. Pragmatic aspects of uncertainty propagation: A conceptual review

    NASA Astrophysics Data System (ADS)

    Thacker, W. Carlisle; Iskandarani, Mohamed; Gonçalves, Rafael C.; Srinivasan, Ashwanth; Knio, Omar M.

    2015-11-01

    When quantifying the uncertainty of the response of a computationally costly oceanographic or meteorological model stemming from the uncertainty of its inputs, practicality demands getting the most information using the fewest simulations. It is widely recognized that, by interpolating the results of a small number of simulations, results of additional simulations can be inexpensively approximated to provide a useful estimate of the variability of the response. Even so, as computing the simulations to be interpolated remains the biggest expense, the choice of these simulations deserves attention. When making this choice, two requirements should be considered: (i) the nature of the interpolation and (ii) the available information about input uncertainty. Examples comparing polynomial interpolation and Gaussian process interpolation are presented for three different views of input uncertainty.
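    A minimal sketch of the surrogate idea: polynomial (Lagrange) interpolation of five "expensive" runs is used to propagate input uncertainty cheaply. The model function and input distribution are invented for illustration; Gaussian process interpolation would replace `lagrange` in the same role:

    ```python
    import math
    import random

    random.seed(3)

    def simulation(x):
        # stand-in for an expensive model run
        return math.sin(3 * x)

    xs = [0.0, 0.25, 0.5, 0.75, 1.0]       # the few affordable runs
    ys = [simulation(x) for x in xs]

    def lagrange(x):
        # global polynomial through the five runs (the cheap surrogate)
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            w = 1.0
            for j, xj in enumerate(xs):
                if j != i:
                    w *= (x - xj) / (xi - xj)
            total += yi * w
        return total

    # Propagate input uncertainty through the surrogate, not the model itself.
    samples = [lagrange(min(1.0, max(0.0, random.gauss(0.5, 0.15))))
               for _ in range(20_000)]
    mean = sum(samples) / len(samples)
    print(f"surrogate response mean ~ {mean:.3f}")
    ```

    The choice of the five run locations matters because the surrogate is only trustworthy where the input distribution puts its mass, which is the point the abstract makes.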

  6. Uncertainty-based internal quality control. Harmonization considerations.

    PubMed

    Bonet-Domingo, E; Escuder-Gilabert, L; Medina-Hernandez, M J; Sagrado, S

    2006-12-01

    Three main quality aspects for analytical laboratories are internal method validation, internal quality control (IQC), and sample result uncertainty. Unfortunately, in the past they have been used in a nonharmonized way. The most universal IQC tool is the mean chart, but some criteria used to fix their control limits do not fit the real nature of analytical results. A new approach for fixing these limits is proposed (the u-approach). The key is the combined uncertainty, u, obtained from the method validation information, also used for estimating the sample result uncertainty. A comparative study on "in-control" simulated, bibliographic, and real laboratory data suggests that the u-approach is more reliable than other well-established criteria. In addition, the u-approach mean chart emerges as an IQC tool, consistent with chemical assays, which harmonizes the validation-control-uncertainty process.
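    A sketch of a u-approach mean chart, with control limits set from the combined uncertainty u of the validated method rather than from run-to-run dispersion; the target, u, and the ±2u coverage choice below are all illustrative assumptions:

    ```python
    # "u-approach" mean chart sketch: control limits come from the combined
    # standard uncertainty u obtained in method validation.
    TARGET = 100.0   # assigned value of the control material (illustrative)
    U = 1.5          # combined standard uncertainty from validation

    def in_control(measurement, k=2.0):
        # flag whether a control measurement lies within TARGET +/- k*u
        return abs(measurement - TARGET) <= k * U

    runs = [99.2, 101.1, 100.4, 103.4, 98.9]
    flags = [in_control(m) for m in runs]
    print(flags)     # 103.4 falls outside 100.0 +/- 3.0
    ```

    Because the same u also feeds the reported sample result uncertainty, validation, IQC, and uncertainty reporting stay mutually consistent, which is the harmonization the abstract argues for.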

  7. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  8. Sub-Heisenberg phase uncertainties

    NASA Astrophysics Data System (ADS)

    Pezzé, Luca

    2013-12-01

    Phase shift estimation with uncertainty below the Heisenberg limit, Δϕ_HL ∝ 1/N̄_T, where N̄_T is the total average number of particles employed, is a mirage of linear quantum interferometry. Recently, Rivas and Luis [New J. Phys. 14, 093052 (2012)] proposed a scheme to achieve a phase uncertainty Δϕ ∝ 1/N̄_T^k, with k an arbitrary exponent. This sparked an intense debate in the literature which, ultimately, does not exclude the possibility of overcoming Δϕ_HL at specific phase values. Our numerical analysis of the Rivas and Luis proposal shows that sub-Heisenberg uncertainties are obtained only when the estimator is strongly biased. No violation of the Heisenberg limit is found after bias correction or when using a bias-free Bayesian analysis.
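    The role of bias flagged in the abstract can be illustrated generically (this is not the Rivas and Luis scheme): a strongly biased estimator's spread can look far smaller than its true error, so quoting the spread alone fakes a precision gain:

    ```python
    import math
    import random
    import statistics

    random.seed(4)
    TRUE_PHASE = 0.30

    def biased_estimate(noisy_phase):
        # deliberately "anchored" estimator: shrinking every reading toward
        # zero makes the reported spread tiny while the bias stays large
        return 0.1 * noisy_phase

    readings = [random.gauss(TRUE_PHASE, 0.2) for _ in range(10_000)]
    estimates = [biased_estimate(r) for r in readings]

    spread = statistics.stdev(estimates)    # what a biased analysis reports
    rmse = math.sqrt(sum((e - TRUE_PHASE) ** 2 for e in estimates)
                     / len(estimates))      # the true root-mean-square error
    print(f"reported spread {spread:.3f}  vs  true rms error {rmse:.3f}")
    ```

    After bias correction the spread is restored to its honest scale, which mirrors the abstract's conclusion that no Heisenberg-limit violation survives.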

  9. Climate negotiations under scientific uncertainty

    PubMed Central

    Barrett, Scott; Dannenberg, Astrid

    2012-01-01

    How does uncertainty about “dangerous” climate change affect the prospects for international cooperation? Climate negotiations usually are depicted as a prisoners’ dilemma game; collectively, countries are better off reducing their emissions, but self-interest impels them to keep on emitting. We provide experimental evidence, grounded in an analytical framework, showing that the fear of crossing a dangerous threshold can turn climate negotiations into a coordination game, making collective action to avoid a dangerous threshold virtually assured. These results are robust to uncertainty about the impact of crossing a threshold, but uncertainty about the location of the threshold turns the game back into a prisoners’ dilemma, causing cooperation to collapse. Our research explains the paradox of why countries would agree to a collective goal, aimed at reducing the risk of catastrophe, but act as if they were blind to this risk. PMID:23045685

  10. Significant predictors of patients' uncertainty in primary brain tumors.

    PubMed

    Lin, Lin; Chien, Lung-Chang; Acquaye, Alvina A; Vera-Bolanos, Elizabeth; Gilbert, Mark R; Armstrong, Terri S

    2015-05-01

    Patients with primary brain tumors (PBT) face uncertainty related to prognosis, symptoms and treatment response and toxicity. Uncertainty is correlated to negative mood states and symptom severity and interference. This study identified predictors of uncertainty during different treatment stages (newly-diagnosed, on treatment, followed-up without active treatment). One hundred eighty six patients with PBT were accrued at various points in the illness trajectory. Data collection tools included: a clinical checklist/a demographic data sheet/the Mishel Uncertainty in Illness Scale-Brain Tumor Form. The structured additive regression model was used to identify significant demographic and clinical predictors of illness-related uncertainty. Participants were primarily white (80 %) males (53 %). They ranged in age from 19-80 (mean = 44.2 ± 12.6). Thirty-two of the 186 patients were newly-diagnosed, 64 were on treatment at the time of clinical visit with MRI evaluation, 21 were without MRI, and 69 were not on active treatment. Three subscales (ambiguity/inconsistency; unpredictability-disease prognoses; unpredictability-symptoms and other triggers) were different amongst the treatment groups (P < .01). However, patients' uncertainty during active treatment was as high as in the newly-diagnosed period. Other than treatment stage, change of employment status due to the illness was the most significant predictor of illness-related uncertainty. The illness trajectory of PBT remains ambiguous, complex, and unpredictable, leading to a high incidence of uncertainty. There was variation in the subscales of uncertainty depending on treatment status. Although patients who are newly diagnosed reported the highest scores on most of the subscales, patients on treatment felt more uncertain about unpredictability of symptoms than other groups. Due to the complexity and impact of the disease, associated symptoms, and interference with functional status, comprehensive assessment of patients

  11. Mutualism Disruption Threatens Global Plant Biodiversity: A Systematic Review

    PubMed Central

    Aslan, Clare E.; Zavaleta, Erika S.; Tershy, Bernie; Croll, Donald

    2013-01-01

    Background As global environmental change accelerates, biodiversity losses can disrupt interspecific interactions. Extinctions of mutualist partners can create “widow” species, which may face reduced ecological fitness. Hypothetically, such mutualism disruptions could have cascading effects on biodiversity by causing additional species coextinctions. However, the scope of this problem – the magnitude of biodiversity that may lose mutualist partners and the consequences of these losses – remains unknown. Methodology/Principal Findings We conducted a systematic review and synthesis of data from a broad range of sources to estimate the threat posed by vertebrate extinctions to the global biodiversity of vertebrate-dispersed and -pollinated plants. Though enormous research gaps persist, our analysis identified Africa, Asia, the Caribbean, and global oceanic islands as geographic regions at particular risk of disruption of these mutualisms; within these regions, percentages of plant species likely affected range from 2.1–4.5%. Widowed plants are likely to experience reproductive declines of 40–58%, potentially threatening their persistence in the context of other global change stresses. Conclusions Our systematic approach demonstrates that thousands of species may be impacted by disruption in one class of mutualisms, but extinctions will likely disrupt other mutualisms, as well. Although uncertainty is high, there is evidence that mutualism disruption directly threatens significant biodiversity in some geographic regions. Conservation measures with explicit focus on mutualistic functions could be necessary to bolster populations of widowed species and maintain ecosystem functions. PMID:23840571

  12. Error correction in adders using systematic subcodes.

    NASA Technical Reports Server (NTRS)

    Rao, T. R. N.

    1972-01-01

    A generalized theory is presented for the construction of a systematic subcode for a given AN code in such a way that the error control properties of the AN code are preserved in the new code. The 'systematic weight' and 'systematic distance' functions in this new code depend not only on its number representation system but also on its addition structure. Finally, to illustrate the theory, a simple error-correcting adder organization using a systematic subcode of a 29N code is sketched in some detail.
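    A sketch of AN-code error control with A = 29, as in the adder organization the abstract mentions; the word length and syndrome-table size below are illustrative choices, and the systematic-subcode construction itself is not reproduced here:

    ```python
    A = 29  # AN-code constant; 29N codes can correct single arithmetic errors

    def encode(n):
        return A * n

    def syndrome(word):
        return word % A          # zero for every valid codeword

    # A single arithmetic error changes a word by +/- 2**i.  For a limited
    # word length each such error has a distinct nonzero syndrome mod 29,
    # so the syndrome both detects and locates it.
    ERRORS = {}
    for i in range(10):
        for sign in (1, -1):
            ERRORS[(sign * 2**i) % A] = sign * 2**i

    def correct(word):
        s = syndrome(word)
        return word if s == 0 else word - ERRORS[s]

    word = encode(5) + encode(7)      # addition preserves the AN code
    corrupted = word + 2**6           # inject a single arithmetic error
    print(correct(corrupted) == encode(12))
    ```

    The key property used here, that sums of codewords are codewords, is what makes AN codes natural for checking adders; the paper's systematic subcode adds a separable information/check structure on top of it.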

  13. Analyzing the uncertainty of suspended sediment load prediction using sequential data assimilation

    NASA Astrophysics Data System (ADS)

    Leisenring, Marc; Moradkhani, Hamid

    2012-10-01

    A first step in understanding the impacts of sediment and controlling the sources of sediment is to quantify the mass loading. Since mass loading is the product of flow and concentration, the quantification of loads first requires the quantification of runoff volume. Using the National Weather Service's SNOW-17 and the Sacramento Soil Moisture Accounting (SAC-SMA) models, this study employed particle filter based Bayesian data assimilation methods to predict seasonal snow water equivalent (SWE) and runoff within a small watershed in the Lake Tahoe Basin located in California, USA. A procedure was developed to scale the variance multipliers (a.k.a. hyperparameters) for model parameters and predictions based on the accuracy of the mean predictions relative to the ensemble spread. In addition, an online bias correction algorithm based on the lagged average bias was implemented to detect and correct for systematic bias in model forecasts prior to updating with the particle filter. Both of these methods significantly improved the performance of the particle filter without requiring excessively wide prediction bounds. The flow ensemble was linked to a non-linear regression model that was used to predict suspended sediment concentrations (SSCs) based on runoff rate and time of year. Runoff volumes and SSC were then combined to produce an ensemble of suspended sediment load estimates. Annual suspended sediment loads for the 5 years of simulation were finally computed along with 95% prediction intervals that account for uncertainty in both the SSC regression model and flow rate estimates. Understanding the uncertainty associated with annual suspended sediment load predictions is critical for making sound watershed management decisions aimed at maintaining the exceptional clarity of Lake Tahoe. The computational methods developed and applied in this research could assist with similar studies where it is important to quantify the predictive uncertainty of pollutant load
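    A minimal bootstrap particle filter in the spirit of the assimilation described above; the state model, noise levels, and observation sequence are toy assumptions, not SNOW-17/SAC-SMA:

    ```python
    import math
    import random

    random.seed(5)
    N = 500
    # Bootstrap particle filter for a toy runoff state: propagate particles
    # with process noise, weight them by an observation likelihood, resample.
    particles = [random.gauss(10.0, 2.0) for _ in range(N)]

    def step(state):
        # toy recession dynamics plus process noise
        return 0.9 * state + random.gauss(0.0, 0.5)

    def likelihood(obs, state, sd=1.0):
        return math.exp(-0.5 * ((obs - state) / sd) ** 2)

    for obs in [9.5, 8.8, 8.1, 7.3]:
        particles = [step(p) for p in particles]
        weights = [likelihood(obs, p) for p in particles]
        total = sum(weights)
        probs = [w / total for w in weights]
        particles = random.choices(particles, weights=probs, k=N)  # resample

    mean = sum(particles) / N
    print(f"posterior mean runoff ~ {mean:.2f}")
    ```

    The study's refinements, scaling the variance multipliers from the spread/accuracy mismatch and subtracting a lagged-average bias before the update, would slot in just before the weighting step.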

  14. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent the available information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well known model that can handle…

  15. Calibration and systematic error analysis for the COBE DMR 4 year sky maps

    SciTech Connect

    Kogut, A.; Banday, A.J.; Bennett, C.L.; Gorski, K.M.; Hinshaw,G.; Jackson, P.D.; Keegstra, P.; Lineweaver, C.; Smoot, G.F.; Tenorio,L.; Wright, E.L.

    1996-01-04

    The Differential Microwave Radiometers (DMR) instrument aboard the Cosmic Background Explorer (COBE) has mapped the full microwave sky to mean sensitivity 26 μK per 7° field of view. The absolute calibration is determined to 0.7 percent with drifts smaller than 0.2 percent per year. We have analyzed both the raw differential data and the pixelized sky maps for evidence of contaminating sources such as solar system foregrounds, instrumental susceptibilities, and artifacts from data recovery and processing. Most systematic effects couple only weakly to the sky maps. The largest uncertainties in the maps result from the instrument susceptibility to Earth's magnetic field, microwave emission from Earth, and upper limits to potential effects at the spacecraft spin period. Systematic effects in the maps are small compared to either the noise or the celestial signal: the 95 percent confidence upper limit for the pixel-pixel rms from all identified systematics is less than 6 μK in the worst channel. A power spectrum analysis of the (A-B)/2 difference maps shows no evidence for additional undetected systematic effects.

  16. Calibration and Systematic Error Analysis for the COBE DMR 4 Year Sky Maps

    NASA Astrophysics Data System (ADS)

    Kogut, A.; Banday, A. J.; Bennett, C. L.; Gorski, K. M.; Hinshaw, G.; Jackson, P. D.; Keegstra, P.; Lineweaver, C.; Smoot, G. F.; Tenorio, L.; Wright, E. L.

    1996-10-01

    The Differential Microwave Radiometers (DMR) instrument aboard the Cosmic Background Explorer (COBE) has mapped the full microwave sky to mean sensitivity 26 μK per 7° field of view. The absolute calibration is determined to 0.7% with drifts smaller than 0.2% per year. We have analyzed both the raw differential data and the pixelized sky maps for evidence of contaminating sources such as solar system foregrounds, instrumental susceptibilities, and artifacts from data recovery and processing. Most systematic effects couple only weakly to the sky maps. The largest uncertainties in the maps result from the instrument susceptibility to Earth's magnetic field, microwave emission from Earth, and upper limits to potential effects at the spacecraft spin period. Systematic effects in the maps are small compared to either the noise or the celestial signal: the 95% confidence upper limit for the pixel-pixel rms from all identified systematics is less than 6 μK in the worst channel. A power spectrum analysis of the (A - B)/2 difference maps shows no evidence for additional undetected systematic effects.

  17. Statistics, Uncertainty, and Transmitted Variation

    SciTech Connect

    Wendelberger, Joanne Roth

    2014-11-05

    The field of Statistics provides methods for modeling and understanding data and making decisions in the presence of uncertainty. When examining response functions, variation present in the input variables will be transmitted via the response function to the output variables. This phenomenon can potentially have significant impacts on the uncertainty associated with results from subsequent analysis. This presentation will examine the concept of transmitted variation, its impact on designed experiments, and a method for identifying and estimating sources of transmitted variation in certain settings.
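Transmitted variation can be demonstrated numerically: input variation passed through a response function produces output variation that, for small input spread, matches the first-order (delta-method) prediction |f′(x)|·σₓ. The response function and numbers below are illustrative, not from the presentation.

```python
import numpy as np

rng = np.random.default_rng(1)

def transmitted_std(f, x_mean, x_std, n=200_000):
    """Monte Carlo estimate of the output standard deviation produced when
    variation in the input variable is transmitted through the response
    function f."""
    return float(np.std(f(rng.normal(x_mean, x_std, n))))

# f(x) = x**2 near x = 3: first-order propagation predicts
# sigma_y ~= |f'(3)| * sigma_x = 6 * 0.1 = 0.6
sigma_y = transmitted_std(lambda x: x**2, 3.0, 0.1)
```

Comparing the Monte Carlo estimate against the delta-method value is a quick way to check whether a response function is locally linear enough for first-order uncertainty budgets.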

  18. Awe, uncertainty, and agency detection.

    PubMed

    Valdesolo, Piercarlo; Graham, Jesse

    2014-01-01

    Across five studies, we found that awe increases both supernatural belief (Studies 1, 2, and 5) and intentional-pattern perception (Studies 3 and 4), two phenomena that have been linked to agency detection, or the tendency to interpret events as the consequence of intentional and purpose-driven agents. Effects were both directly and conceptually replicated, and mediational analyses revealed that these effects were driven by the influence of awe on tolerance for uncertainty. Experiences of awe decreased tolerance for uncertainty, which, in turn, increased the tendency to believe in nonhuman agents and to perceive human agency in random events.

  19. Intermittent optical frequency measurements to reduce the dead time uncertainty of frequency link

    NASA Astrophysics Data System (ADS)

    Hachisu, Hidekazu; Ido, Tetsuya

    2015-11-01

    The absolute frequency of the 87Sr lattice clock transition was evaluated with an uncertainty of 1.1 × 10-15 using a frequency link to International Atomic Time (TAI). The frequency uncertainty of a hydrogen maser used as a transfer oscillator was reduced by homogeneously distributed intermittent measurements over a five-day grid of TAI. Three sets of four- or five-day measurements, together with a systematic uncertainty of the clock of 8.6 × 10-17, yield an absolute frequency of the 87Sr 1S0-3P0 clock transition of 429 228 004 229 872.85 (47) Hz.

  20. Confronting uncertainty in flood damage predictions

    NASA Astrophysics Data System (ADS)

    Schröter, Kai; Kreibich, Heidi; Vogel, Kristin; Merz, Bruno

    2015-04-01

    Reliable flood damage models are a prerequisite for the practical usefulness of model results. Traditional univariate damage models, such as depth-damage curves, often fail to reproduce the variability of observed flood damage. Innovative multivariate probabilistic modelling approaches are promising for capturing and quantifying the uncertainty involved and thus improving the basis for decision making. In this study we compare the predictive capability of two probabilistic modelling approaches, namely Bagging Decision Trees and Bayesian Networks. For model evaluation we use empirical damage data available from computer-aided telephone interviews compiled after the floods of 2002, 2005 and 2006 in the Elbe and Danube catchments in Germany. We carry out a split-sample test by sub-setting the damage records: one sub-set is used to derive the models and the remaining records are used to evaluate the predictive performance of the model. Further, we stratify the sample according to catchments, which allows studying model performance in a spatial transfer context. Flood damage estimation is carried out on the scale of individual buildings in terms of relative damage. The predictive performance of the models is assessed in terms of systematic deviations (mean bias), precision (mean absolute error) and reliability, which is represented by the proportion of observations that fall within the 5- to 95-quantile predictive interval. The reliability of the probabilistic predictions within validation runs decreases only slightly and achieves a very good coverage of observations within the predictive interval. Probabilistic models provide quantitative information about prediction uncertainty, which is crucial to assess the reliability of model predictions and improves the usefulness of model results.
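The reliability measure used in the study — the share of observations falling inside the 5- to 95-quantile predictive interval — is easy to compute from an ensemble of probabilistic damage predictions. A minimal sketch with synthetic, invented data:

```python
import numpy as np

rng = np.random.default_rng(2)

def interval_coverage(obs, pred_ensembles, lo_q=5.0, hi_q=95.0):
    """Reliability: fraction of observations falling inside the
    [lo_q, hi_q] percentile interval of their predictive ensemble."""
    obs = np.asarray(obs, dtype=float)
    lower = np.percentile(pred_ensembles, lo_q, axis=1)
    upper = np.percentile(pred_ensembles, hi_q, axis=1)
    return float(np.mean((obs >= lower) & (obs <= upper)))

# Synthetic check: well-calibrated ensembles should cover ~90% of cases.
mu = rng.normal(0.0, 2.0, 1000)                        # predictive centers
obs = mu + rng.normal(0.0, 1.0, 1000)                  # "observed" damage
ens = mu[:, None] + rng.normal(0.0, 1.0, (1000, 500))  # predictive ensembles
cov = interval_coverage(obs, ens)
```

A coverage well below the nominal 90% would indicate overconfident predictive intervals; coverage near 100% would indicate intervals that are too wide to be informative.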

  1. Respondent uncertainty in contingent valuation of preventing beach erosion: an analysis with a polychotomous choice question.

    PubMed

    Logar, Ivana; van den Bergh, Jeroen C J M

    2012-12-30

    Respondent uncertainty is often considered as one of the main limitations of stated preference methods, which are nowadays being widely used for valuing environmental goods and services. This article examines the effect of respondent uncertainty on welfare estimates by applying the contingent valuation method. This is done in the context of beach protection against erosion. Respondent certainty levels are elicited using a five-category polychotomous choice question. Two different uncertainty calibration techniques are tested, namely one that treats uncertain responses as missing and another in which uncertain 'yes' responses are recoded as 'no' responses. We found no evidence that the former technique offers any gains over the conventional model assuming certainty. The latter calibration technique systematically reduces welfare estimates. The reduction is statistically significant only when the most certain 'yes' responses are recoded as 'no' responses. The article further identifies determinants of respondent uncertainty. Finally, it explores how real market experience affects respondent uncertainty.
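The second calibration technique — recoding uncertain 'yes' answers as 'no' — can be sketched as a small helper. The five-point certainty scale and the threshold logic follow the abstract; the example data are invented.

```python
def recode_uncertain_yes(responses, certainty, threshold):
    """Calibration: 'yes' answers whose certainty level (1-5 polychotomous
    scale) falls below `threshold` are recoded as 'no'; 'no' answers are
    left unchanged."""
    return ["yes" if r == "yes" and c >= threshold else "no"
            for r, c in zip(responses, certainty)]

answers = ["yes", "yes", "no", "yes"]
levels = [5, 2, 4, 3]
# At threshold 4, only the most certain 'yes' responses survive:
calibrated = recode_uncertain_yes(answers, levels, threshold=4)
```

Since the share of 'yes' responses feeds directly into the willingness-to-pay estimation, this recoding mechanically lowers welfare estimates, which is consistent with the systematic reduction the study reports.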

  2. Probabilistic fatigue life prediction using ultrasonic inspection data considering equivalent initial flaw size uncertainty

    NASA Astrophysics Data System (ADS)

    Guan, X.; Zhang, J.; Kadau, K.; Zhou, S. K.

    2013-01-01

    This study presents a systematic method for probabilistic fatigue life prediction using ultrasonic inspection data. A probabilistic model to correlate the size reported by ultrasonic inspection with the actual size is proposed based on historical data of rotor flaw sizing. Both the reported size and the actual size are quantified in terms of the equivalent reflector diameter. The equivalent initial flaw size (EIFS) is then calculated based on the actual size for fatigue propagation analysis. All major uncertainties, such as EIFS uncertainty, fatigue crack growth model parameter uncertainty, and experimental data measurement uncertainty, are explicitly included in the fatigue life prediction. Bayesian parameter estimation is used to estimate fatigue crack growth model parameters and measurement uncertainties using a limited number of fatigue testing data points. The overall procedure is demonstrated using a Cr-Mo-V rotor segment with ultrasonic inspection data. Interpretations of the probabilistic prediction results are given.
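A probabilistic fatigue-life calculation of this kind can be sketched with the Paris crack-growth law and Monte Carlo sampling of the EIFS and growth-rate parameter. The distributions and material values below are illustrative assumptions, not the paper's Cr-Mo-V data or its Bayesian estimates.

```python
import numpy as np

rng = np.random.default_rng(3)

def paris_life(a0, ac, C, m, dsigma):
    """Cycles to grow a crack from a0 to ac under the Paris law
    da/dN = C * (dsigma * sqrt(pi * a))**m (closed form, valid for m != 2)."""
    p = 1.0 - m / 2.0
    return (ac**p - a0**p) / (C * (dsigma * np.sqrt(np.pi))**m * p)

# Monte Carlo over EIFS and growth-rate uncertainty (illustrative values):
n = 20_000
a0 = rng.lognormal(np.log(2e-4), 0.3, n)   # equivalent initial flaw size [m]
C = rng.lognormal(np.log(1e-11), 0.2, n)   # Paris coefficient (uncertain)
lives = paris_life(a0, ac=5e-3, C=C, m=3.0, dsigma=100.0)
lo, med, hi = np.percentile(lives, [5, 50, 95])
```

The resulting life distribution, rather than a single deterministic cycle count, is what supports probabilistic statements such as a 5th-percentile (conservative) life estimate.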

  3. Uncertainty in environmental risk assessment: implications for risk-based management of river basins.

    PubMed

    Ragas, Ad M J; Huijbregts, Mark A J; Henning-de Jong, Irmgard; Leuven, Rob S E W

    2009-01-01

    Environmental risk assessment is typically uncertain due to different perceptions of the risk problem and limited knowledge about the physical, chemical, and biological processes underlying the risk. The present paper provides a systematic overview of the implications of different types of uncertainty for risk management, with a focus on risk-based management of river basins. Three different types of uncertainty are distinguished: 1) problem definition uncertainty, 2) true uncertainty, and 3) variability. Methods to quantify and describe these types of uncertainty are discussed and illustrated in 4 case studies. The case studies demonstrate that explicit regulation of uncertainty can improve risk management (e.g., by identification of the most effective risk reduction measures, optimization of the use of resources, and improvement of the decision-making process). It is concluded that the involvement of nongovernmental actors as prescribed by the European Union Water Framework Directive (WFD) provides challenging opportunities to address problem definition uncertainty and those forms of true uncertainty that are difficult to quantify. However, the WFD guidelines for derivation and application of environmental quality standards could be improved by the introduction of a probabilistic approach to deal with true uncertainty and a better scientific basis for regulation of variability.

  4. Instrumentation-related uncertainty of reflectance and transmittance measurements with a two-channel spectrophotometer

    NASA Astrophysics Data System (ADS)

    Peest, Christian; Schinke, Carsten; Brendel, Rolf; Schmidt, Jan; Bothe, Karsten

    2017-01-01

    Spectrophotometers are operated in numerous fields of science and industry for a variety of applications. In order to provide confidence for the measured data, analyzing the associated uncertainty is valuable. However, the uncertainty of the measurement results is often unknown or reduced to sample-related contributions. In this paper, we describe our approach for the systematic determination of the measurement uncertainty of the commercially available two-channel spectrophotometer Agilent Cary 5000 in accordance with the Guide to the Expression of Uncertainty in Measurement (GUM). We focus on the instrumentation-related uncertainty contributions rather than the specific application and thus outline a general procedure which can be adapted for other instruments. Moreover, we discover a systematic signal deviation due to the inertia of the measurement amplifier, and we develop and apply a correction procedure, thereby increasing the usable dynamic range of the instrument by more than one order of magnitude. We present methods for the quantification of the uncertainty contributions and combine them into an uncertainty budget for the device.
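The final step — combining instrumentation-related contributions into an uncertainty budget in the GUM style — amounts to a root-sum-square of sensitivity-weighted standard uncertainties for uncorrelated inputs. The contribution names and values below are placeholders, not the Cary 5000 budget.

```python
import math

def combine_budget(budget, k=2.0):
    """GUM-style combined standard uncertainty for uncorrelated inputs:
    u_c = sqrt(sum_i (c_i * u_i)**2), with expanded uncertainty U = k * u_c
    (k = 2 for ~95% coverage)."""
    u_c = math.sqrt(sum((c * u) ** 2 for c, u in budget))
    return u_c, k * u_c

budget = [           # (sensitivity coefficient, standard uncertainty)
    (1.0, 0.10),     # e.g. detector noise
    (1.0, 0.05),     # e.g. wavelength setting
    (0.5, 0.08),     # e.g. sample positioning
]
u_c, U = combine_budget(budget)
```

Tabulating the individual (c_i · u_i)² terms alongside u_c is what turns this calculation into a budget: it shows at a glance which contribution dominates and is therefore worth reducing first.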

  5. TRITIUM UNCERTAINTY ANALYSIS FOR SURFACE WATER SAMPLES AT THE SAVANNAH RIVER SITE

    SciTech Connect

    Atkinson, R.

    2012-07-31

    Radiochemical analyses of surface water samples, in the framework of Environmental Monitoring, have associated uncertainties for the radioisotopic results reported. These uncertainty analyses pertain to the tritium results from surface water samples collected at five locations on the Savannah River near the U.S. Department of Energy's Savannah River Site (SRS). Uncertainties can result from the field-sampling routine, can be incurred during transport due to the physical properties of the sample, from equipment limitations, and from the measurement instrumentation used. The uncertainty reported by the SRS in their Annual Site Environmental Report currently considers only the counting uncertainty in the measurements, which is the standard reporting protocol for radioanalytical chemistry results. The focus of this work is to provide an overview of all uncertainty components associated with SRS tritium measurements, estimate the total uncertainty according to ISO 17025, and to propose additional experiments to verify some of the estimated uncertainties. The main uncertainty components discovered and investigated in this paper are tritium absorption or desorption in the sample container, HTO/H2O isotopic effect during distillation, pipette volume, and tritium standard uncertainty. The goal is to quantify these uncertainties and to establish a combined uncertainty in order to increase the scientific depth of the SRS Annual Site Environmental Report.

  6. Carbon cycle uncertainty in the Alaskan Arctic

    NASA Astrophysics Data System (ADS)

    Fisher, J. B.; Sikka, M.; Oechel, W. C.; Huntzinger, D. N.; Melton, J. R.; Koven, C. D.; Ahlström, A.; Arain, A. M.; Baker, I.; Chen, J. M.; Ciais, P.; Davidson, C.; Dietze, M.; El-Masri, B.; Hayes, D.; Huntingford, C.; Jain, A.; Levy, P. E.; Lomas, M. R.; Poulter, B.; Price, D.; Sahoo, A. K.; Schaefer, K.; Tian, H.; Tomelleri, E.; Verbeeck, H.; Viovy, N.; Wania, R.; Zeng, N.; Miller, C. E.

    2014-02-01

    Climate change is leading to a disproportionately large warming in the high northern latitudes, but the magnitude and sign of the future carbon balance of the Arctic are highly uncertain. Using 40 terrestrial biosphere models for Alaska, we provide a baseline of terrestrial carbon cycle structural and parametric uncertainty, defined as the multi-model standard deviation (σ) against the mean (x̄) for each quantity. Mean annual uncertainty (σ/x̄) was largest for net ecosystem exchange (NEE) (-0.01 ± 0.19 kg C m-2 yr-1), then net primary production (NPP) (0.14 ± 0.33 kg C m-2 yr-1), autotrophic respiration (Ra) (0.09 ± 0.20 kg C m-2 yr-1), gross primary production (GPP) (0.22 ± 0.50 kg C m-2 yr-1), ecosystem respiration (Re) (0.23 ± 0.38 kg C m-2 yr-1), CH4 flux (2.52 ± 4.02 g CH4 m-2 yr-1), heterotrophic respiration (Rh) (0.14 ± 0.20 kg C m-2 yr-1), and soil carbon (14.0 ± 9.2 kg C m-2). The spatial patterns in regional carbon stocks and fluxes varied widely with some models showing NEE for Alaska as a strong carbon sink, others as a strong carbon source, while still others as carbon neutral. Additionally, a feedback (i.e., sensitivity) analysis was conducted of 20th century NEE to CO2 fertilization (β) and climate (γ), which showed that uncertainty in γ was 2× larger than that of β, with neither indicating that the Alaskan Arctic is shifting towards a certain net carbon sink or source. Finally, AmeriFlux data are used at two sites in the Alaskan Arctic to evaluate the regional patterns; observed seasonal NEE was captured within multi-model uncertainty. This assessment of carbon cycle uncertainties may be used as a baseline for the improvement of experimental and modeling activities, as well as a reference for future trajectories in carbon cycling with climate change in the Alaskan Arctic.
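The uncertainty metric used here — the multi-model standard deviation reported against the multi-model mean — is straightforward to reproduce. The per-model NEE values below are invented for illustration, not the 40-model results.

```python
import numpy as np

def ensemble_stats(model_values):
    """Multi-model mean (x-bar) and structural/parametric uncertainty,
    defined as the inter-model standard deviation (sigma)."""
    v = np.asarray(model_values, dtype=float)
    return float(v.mean()), float(v.std())

# Illustrative per-model NEE estimates (kg C m-2 yr-1). Sign disagreement
# among models is exactly what leaves the regional sink/source ambiguous:
xbar, sigma = ensemble_stats([-0.20, 0.10, -0.05, 0.15, -0.01])
```

When σ exceeds |x̄|, as in the NEE figure quoted above, the ensemble cannot even resolve the sign of the flux, which is the abstract's central point.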

  7. Uncertainty Can Increase Explanatory Credibility

    DTIC Science & Technology

    2013-08-01

    metacognitive cue to infer their conversational partner’s depth of processing. Keywords: explanations, confidence, uncertainty, collaborative reasoning...scope, i.e., those that account for only observed phenomena (Khemlani, Sussman, & Oppenheimer, 2011). These preferences show that properties intrinsic...Fischhoff, & Phillips, 1982; Lindley, 1982; McClelland & Bolger, 1994). Much of the research on subjective confidence addresses how individuals

  8. Spatial uncertainty and ecological models

    SciTech Connect

    Jager, Yetta; King, Anthony Wayne

    2004-07-01

    Applied ecological models that are used to understand and manage natural systems often rely on spatial data as input. Spatial uncertainty in these data can propagate into model predictions. Uncertainty analysis, sensitivity analysis, error analysis, error budget analysis, spatial decision analysis, and hypothesis testing using neutral models are all techniques designed to explore the relationship between variation in model inputs and variation in model predictions. Although similar methods can be used to answer them, these approaches address different questions. These approaches differ in (a) whether the focus is forward or backward (forward to evaluate the magnitude of variation in model predictions propagated or backward to rank input parameters by their influence); (b) whether the question involves model robustness to large variations in spatial pattern or to small deviations from a reference map; and (c) whether processes that generate input uncertainty (for example, cartographic error) are of interest. In this commentary, we propose a taxonomy of approaches, all of which clarify the relationship between spatial uncertainty and the predictions of ecological models. We describe existing techniques and indicate a few areas where research is needed.

  9. The face of uncertainty eats.

    PubMed

    Corwin, Rebecca L W

    2011-09-01

    The idea that foods rich in fat and sugar may be addictive has generated much interest, as well as controversy, among both scientific and lay communities. Recent research indicates that fatty and sugary food in and of itself is not addictive. Rather, the food and the context in which it is consumed interact to produce an addiction-like state. One of the contexts that appears to be important is the intermittent opportunity to consume foods rich in fat and sugar in environments where food is plentiful. Animal research indicates that, under these conditions, intake of the fatty sugary food escalates across time and binge-type behavior develops. However, the mechanisms that account for the powerful effect of intermittency on ingestive behavior have only begun to be elucidated. In this review, it is proposed that intermittency stimulates appetitive behavior that is associated with uncertainty regarding what, when, and how much of the highly palatable food to consume. Uncertainty may stimulate consumption of optional fatty and sugary treats due to differential firing of midbrain dopamine neurons, activation of the stress axis, and involvement of orexin signaling. In short, uncertainty may produce an aversive state that bingeing on palatable food can alleviate, however temporarily. "Food addiction" may not be "addiction" to food at all; it may be a response to uncertainty within environments of food abundance.

  10. PIV uncertainty quantification by image matching

    NASA Astrophysics Data System (ADS)

    Sciacchitano, Andrea; Wieneke, Bernhard; Scarano, Fulvio

    2013-04-01

    A novel method is presented to quantify the uncertainty of PIV data. The approach is a posteriori, i.e. the unknown actual error of the measured velocity field is estimated using the velocity field itself as input along with the original images. The principle of the method relies on the concept of super-resolution: the image pair is matched according to the cross-correlation analysis and the residual distance between matched particle image pairs (particle disparity vector) due to incomplete match between the two exposures is measured. The ensemble of disparity vectors within the interrogation window is analyzed statistically. The dispersion of the disparity vector returns the estimate of the random error, whereas the mean value of the disparity indicates the occurrence of a systematic error. The validity of the working principle is first demonstrated via Monte Carlo simulations. Two different interrogation algorithms are considered, namely the cross-correlation with discrete window offset and the multi-pass with window deformation. In the simulated recordings, the effects of particle image displacement, its gradient, out-of-plane motion, seeding density and particle image diameter are considered. In all cases good agreement is retrieved, indicating that the error estimator is able to follow the trend of the actual error with satisfactory precision. Experiments where time-resolved PIV data are available are used to prove the concept under realistic measurement conditions. In this case the ‘exact’ velocity field is unknown; however a high accuracy estimate is obtained with an advanced interrogation algorithm that exploits the redundant information of highly temporally oversampled data (pyramid correlation, Sciacchitano et al (2012 Exp. Fluids 53 1087-105)). The image-matching estimator returns the instantaneous distribution of the estimated velocity measurement error. The spatial distribution compares very well with that of the actual error with maxima in the
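The working principle — the mean disparity indicating a systematic error and its dispersion giving the random error — can be sketched for one interrogation window. The way the two components are combined into a single uncertainty of the mean displacement follows the spirit of the method; the exact estimator and the numbers below are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def disparity_statistics(disparities):
    """Per-window statistics of one component of the particle disparity
    vectors: the mean flags a systematic error, the sample dispersion gives
    the random error, and both are combined into an uncertainty of the mean
    displacement."""
    d = np.asarray(disparities, dtype=float)
    mu = float(d.mean())                      # systematic component
    sigma = float(d.std(ddof=1))              # random component
    u = float(np.hypot(mu, sigma / np.sqrt(d.size)))
    return mu, sigma, u

# Residual disparities (pixels) of matched particle pairs in one
# interrogation window (invented values):
mu, sigma, u = disparity_statistics([0.10, 0.20, 0.10, 0.20])
```

Because the disparity is measured from the images themselves, the estimate adapts to local imaging conditions, which is why it can track the spatial distribution of the actual error.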

  11. Structural Damage Assessment under Uncertainty

    NASA Astrophysics Data System (ADS)

    Lopez Martinez, Israel

    Structural damage assessment has applications in the majority of engineering structures and mechanical systems ranging from aerospace vehicles to manufacturing equipment. The primary goals of any structural damage assessment and health monitoring systems are to ascertain the condition of a structure and to provide an evaluation of changes as a function of time as well as providing an early warning of an unsafe condition. There are many structural health monitoring and assessment techniques developed for research using numerical simulations and scaled structural experiments. However, the transition from research to real-world structures has been rather slow. One major reason for this slow progress is the existence of uncertainty in every step of the damage assessment process. This dissertation research involved the experimental and numerical investigation of uncertainty in vibration-based structural health monitoring and development of robust detection and localization methods. The basic premise of vibration-based structural health monitoring is that changes in structural characteristics, such as stiffness, mass and damping, will affect the global vibration response of the structure. The diagnostic performance of vibration-based monitoring systems is affected by uncertainty sources such as measurement errors, environmental disturbances and parametric modeling uncertainties. To address diagnostic errors due to irreducible uncertainty, a pattern recognition framework for damage detection has been developed to be used for continuous monitoring of structures. The robust damage detection approach developed is based on the ensemble of dimensional reduction algorithms for improved damage-sensitive feature extraction. For damage localization, the determination of an experimental structural model was performed based on output-only modal analysis. 
An experimental model correlation technique is developed in which the discrepancies between the undamaged and damaged modal data are

  12. A novel dose uncertainty model and its application for dose verification.

    PubMed

    Jin, Hosang; Chung, Heetaek; Liu, Chihray; Palta, Jatinder; Suh, Tae-Suk; Kim, Siyong

    2005-06-01

    measurements were within the tolerance bound as expected by a statistical prediction of the model. Using the dose uncertainty distributions, an uncertainty length (uncertainty area and uncertainty volume for two-dimensional and three-dimensional, respectively) histogram (a plot of the dose uncertainty of 1σ received by a length of field) was made. The histogram provides additional information on the superiority of a treatment plan in terms of uncertainty. In summary, the uncertainty model serves both as a dose comparison tool and as an evaluation tool for a treatment planning system.

  13. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). 
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  14. Systematic comparison of δ13C measurements of testosterone and derivative steroids in a freeze-dried urine candidate reference material for sports drug testing by gas chromatography/combustion/isotope ratio mass spectrometry and uncertainty evaluation using four different metrological approaches.

    PubMed

    Munton, Ellaine; Murby, John; Hibbert, D Brynn; Santamaria-Fernandez, Rebeca

    2011-06-15

    An alternative calibration procedure for use when performing carbon isotope ratio measurements by gas chromatography/combustion/isotope ratio mass spectrometry (GC/C/IRMS) has been developed. This calibration procedure does not rely on the corrections in-built in the instrument software, as the carbon isotope ratios of a sample are calculated from the measured raw peak areas. The method was developed for the certification of a urine reference material for sports drug testing, as the estimation of measurement uncertainty is greatly simplified. To ensure that the method is free from bias arising from the choice of calibration material and instrument, the carbon isotope ratios of steroids in urine extracts were measured using two different instruments in different laboratories, and three different reference materials (CU/USADA steroid standards from Brenna Laboratory, Cornell University; NIST RM8539 mineral oil; methane calibrated against NIST RM8560 natural gas). The measurements were performed at LGC and the Australian National Measurement Institute (NMI). It was found that there was no significant difference in measurement results when different instruments and reference materials were used to measure the carbon isotope ratio of the major testosterone metabolites androsterone and etiocholanolone, or the endogenous reference compounds pregnanediol, 11-ketoetiocholanolone and 11β-hydroxyandrosterone. Expanded measurement uncertainties at the 95% coverage probability ranged from 0.21‰ to 1.4‰, depending on analyte, instrument and reference material. The measurement results of this comparison were used to estimate a measurement uncertainty of δ13C for the certification of the urine reference material being performed on a single instrument using a single reference material at NMI.

  15. Sensitivity of direct global warming potentials to key uncertainties

    SciTech Connect

    Wuebbles, D.J.; Patten, K.O.; Grant, K.E. ); Jain, A.K. )

    1992-07-01

    A series of sensitivity studies examines the effect of several uncertainties in Global Warming Potentials (GWPs). For example, the original evaluation of GWPs for the Intergovernmental Panel on Climate Change (IPCC, 1990) did not attempt to account for the possible sinks of carbon dioxide (CO2) that could balance the carbon cycle and produce atmospheric concentrations of CO2 that match observations. In this study, a balanced carbon cycle model is applied in calculation of the radiative forcing from CO2. Use of the balanced model produces up to 20 percent enhancement of the GWPs for most trace gases compared with the IPCC (1990) values for time horizons up to 100 years, but a decreasing enhancement with longer time horizons. Uncertainty limits of the fertilization feedback parameter contribute a 10 percent range in GWP values. Another systematic uncertainty in GWPs is the assumption of an equilibrium atmosphere (one in which the concentration of trace gases remains constant) versus a disequilibrium atmosphere. The latter gives GWPs that are 15 to 30 percent greater than the former, depending upon the carbon dioxide emission scenario chosen. Seven scenarios are employed: constant emission past 1990 and the six IPCC (1992) emission scenarios. For the analysis of uncertainties in atmospheric lifetime (τ), the GWP changes in direct proportion to τ for short-lived gases, but to a lesser extent for gases with τ greater than the time horizon for the GWP calculation.
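The lifetime sensitivity noted in the last sentence follows directly from the absolute GWP of a gas with exponential atmospheric decay: AGWP(H) = a·τ·(1 − e^(−H/τ)), which is approximately a·τ (proportional to lifetime) when τ is much shorter than the horizon H, but saturates toward a·H for long-lived gases. A sketch with a unit radiative efficiency:

```python
import math

def agwp(a, tau, horizon):
    """Absolute GWP for a gas with radiative efficiency `a` and exponential
    decay of lifetime `tau`: the integrated forcing
    a * int_0^H exp(-t/tau) dt = a * tau * (1 - exp(-H/tau))."""
    return a * tau * (1.0 - math.exp(-horizon / tau))

H = 100.0
short = agwp(1.0, 2.0, H)     # short-lived: ~ a * tau
long_ = agwp(1.0, 500.0, H)   # long-lived: saturates toward a * H
ratio = agwp(1.0, 4.0, H) / agwp(1.0, 2.0, H)  # doubling tau ~ doubles AGWP
```

Dividing by the AGWP of the CO2 reference would give the GWP itself; the τ-proportionality for short-lived gases survives that normalization, which is why lifetime uncertainty maps one-to-one into GWP uncertainty for such gases.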

  16. Uncertainty quantification in virtual surgery predictions for single ventricle palliation

    NASA Astrophysics Data System (ADS)

    Schiavazzi, Daniele; Marsden, Alison

    2014-11-01

    Hemodynamic results from numerical simulations of physiology in patients are invariably presented as deterministic quantities without assessment of associated confidence. Recent advances in cardiovascular simulation and Uncertainty Analysis can be leveraged to challenge this paradigm and to quantify the variability of output quantities of interest, of paramount importance to complement clinical decision making. Physiological variability and errors are responsible for the uncertainty typically associated with measurements in the clinic; starting from a characterization of these quantities in probability, we present applications in the context of estimating the distributions of lumped parameters in 0D models of single-ventricle circulation. We also present results in virtual Fontan palliation surgery, where the variability of both local and systemic hemodynamic indicators is inferred from the uncertainty in pre-operative clinical measurements. Efficient numerical algorithms are required to mitigate the computational cost of propagating the uncertainty through multiscale coupled 0D-3D models of pulsatile flow at the cavopulmonary connection. This work constitutes a first step towards systematic application of robust numerical simulations to virtual surgery predictions.

  17. Differentiating intolerance of uncertainty from three related but distinct constructs.

    PubMed

    Rosen, Natalie O; Ivanova, Elena; Knäuper, Bärbel

    2014-01-01

    Individual differences in uncertainty have been associated with heightened anxiety, stress and approach-oriented coping. Intolerance of uncertainty (IU) is a trait characteristic that arises from negative beliefs about uncertainty and its consequences. Researchers have established the central role of IU in the development of problematic worry and maladaptive coping, highlighting the importance of this construct to anxiety disorders. However, there is a need to improve our understanding of the phenomenology of IU. The goal of this paper was to present hypotheses regarding the similarities and differences between IU and three related constructs--intolerance of ambiguity, uncertainty orientation, and need for cognitive closure--and to call for future empirical studies to substantiate these hypotheses. To assist with achieving this goal, we conducted a systematic review of the literature, which also served to identify current gaps in knowledge. This paper differentiates these constructs by outlining each definition and general approaches to assessment, reviewing the existing empirical relations, and proposing theoretical similarities and distinctions. Findings may assist researchers in selecting the appropriate construct to address their research questions. Future research directions for the application of these constructs, particularly within the field of clinical and health psychology, are discussed.

  18. Stereo-particle image velocimetry uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John J.; Vlachos, Pavlos P.

    2017-01-01

    Particle image velocimetry (PIV) measurements are subject to multiple elemental error sources, and thus estimating overall measurement uncertainty is challenging. Recent advances have led to a posteriori uncertainty estimation methods for planar two-component PIV. However, no complete methodology exists for uncertainty quantification in stereo PIV. In the current work, a comprehensive framework is presented to quantify the uncertainty stemming from stereo registration error and combine it with the underlying planar velocity uncertainties. The disparity in particle locations of the dewarped images is used to estimate the positional uncertainty of the world coordinate system, which is then propagated to the uncertainty in the calibration mapping function coefficients. Next, the calibration uncertainty is combined with the planar uncertainty fields of the individual cameras through an uncertainty propagation equation, and uncertainty estimates are obtained for all three velocity components. The methodology was tested with synthetic stereo PIV data for different light sheet thicknesses, with and without registration error, and also validated with an experimental vortex ring case from the 2014 PIV Challenge. A thorough sensitivity analysis was performed to assess the relative impact of the various parameters on the overall uncertainty. The results suggest that in the absence of any disparity, the stereo PIV uncertainty prediction method is more sensitive to the planar uncertainty estimates than to the angle uncertainty, although the latter is not negligible for non-zero disparity. Overall, the presented uncertainty quantification framework showed excellent agreement between the error and uncertainty RMS values for both the synthetic and the experimental data and demonstrated reliable uncertainty prediction coverage. This stereo PIV uncertainty quantification framework provides the first comprehensive treatment of the subject and potentially lays foundations applicable to volumetric PIV.
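
The idea of combining planar and angular uncertainties through a propagation equation can be illustrated with a hedged sketch (not the paper's exact equations): a common stereo reconstruction of the out-of-plane component is w = (u1 - u2) / (tan t1 + tan t2), and first-order propagation of the per-camera displacement and angle uncertainties gives the uncertainty in w.

```python
import math

# Hedged sketch (not the paper's exact formulation): first-order propagation
# of per-camera planar displacement uncertainties (su1, su2) and stereo-angle
# uncertainties (st1, st2) into the out-of-plane component, for the common
# reconstruction w = (u1 - u2) / (tan t1 + tan t2).
def w_uncertainty(u1, u2, t1, t2, su1, su2, st1, st2):
    denom = math.tan(t1) + math.tan(t2)
    w = (u1 - u2) / denom
    dw_du = 1.0 / denom                        # |dw/du1| = |dw/du2|
    dw_dt1 = -w / (denom * math.cos(t1) ** 2)  # dw/dt1
    dw_dt2 = -w / (denom * math.cos(t2) ** 2)  # dw/dt2
    var = ((dw_du * su1) ** 2 + (dw_du * su2) ** 2
           + (dw_dt1 * st1) ** 2 + (dw_dt2 * st2) ** 2)
    return w, math.sqrt(var)

# Symmetric 45-degree arrangement with planar uncertainties only:
w, sw = w_uncertainty(1.0, -1.0, math.pi / 4, math.pi / 4, 0.05, 0.05, 0.0, 0.0)
```

With zero angle uncertainty the result reduces to the planar uncertainties added in quadrature and scaled by the stereo geometry, consistent with the abstract's observation that the planar terms dominate in the absence of disparity.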

  19. Sense of control under uncertainty depends on people's childhood environment: a life history theory approach.

    PubMed

    Mittal, Chiraag; Griskevicius, Vladas

    2014-10-01

    Past research found that environmental uncertainty leads people to behave differently depending on their childhood environment. For example, economic uncertainty leads people from poor childhoods to become more impulsive while leading people from wealthy childhoods to become less impulsive. Drawing on life history theory, we examine the psychological mechanism driving such diverging responses to uncertainty. Five experiments show that uncertainty alters people's sense of control over the environment. Exposure to uncertainty led people from poorer childhoods to have a significantly lower sense of control than those from wealthier childhoods. In addition, perceptions of control statistically mediated the effect of uncertainty on impulsive behavior. These studies contribute by demonstrating that sense of control is a psychological driver of behaviors associated with fast and slow life history strategies. We discuss the implications of this for theory and future research, including that environmental uncertainty might lead people who grew up poor to quit challenging tasks sooner than people who grew up wealthy.

  20. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
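
The simplest of the interval statistics mentioned can be made concrete in a minimal sketch (not the report's algorithms): the sample mean of interval-valued data is itself an interval, bounded by the means of the endpoints. Harder statistics such as the variance can be NP-hard to bound for heavily overlapping intervals, which is part of the computability question the report examines.

```python
# Minimal sketch: the sample mean of intervals [lo_i, hi_i] is an interval
# whose endpoints are the means of the lower and upper endpoints.
def interval_mean(intervals):
    n = len(intervals)
    return (sum(lo for lo, _ in intervals) / n,
            sum(hi for _, hi in intervals) / n)

measurements = [(1.0, 1.2), (0.9, 1.5), (1.1, 1.3)]  # hypothetical data
mean_lo, mean_hi = interval_mean(measurements)
```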

  1. Prediction uncertainty and optimal experimental design for learning dynamical systems.

    PubMed

    Letham, Benjamin; Letham, Portia A; Rudin, Cynthia; Browne, Edward P

    2016-06-01

    Dynamical systems are frequently used to model biological systems. When these models are fit to data, it is necessary to ascertain the uncertainty in the model fit. Here, we present prediction deviation, a metric of uncertainty that determines the extent to which observed data have constrained the model's predictions. This is accomplished by solving an optimization problem that searches for a pair of models that each provides a good fit for the observed data, yet has maximally different predictions. We develop a method for estimating a priori the impact that additional experiments would have on the prediction deviation, allowing the experimenter to design a set of experiments that would most reduce uncertainty. We use prediction deviation to assess uncertainty in a model of interferon-alpha inhibition of viral infection, and to select a sequence of experiments that reduces this uncertainty. Finally, we prove a theoretical result which shows that prediction deviation provides bounds on the trajectories of the underlying true model. These results show that prediction deviation is a meaningful metric of uncertainty that can be used for optimal experimental design.
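
The prediction-deviation idea can be sketched on a toy problem (illustrative data and tolerances, not the paper's algorithm): among all models whose fit to the observed data is within a tolerance of the best achievable fit, find the pair whose predictions at a new input differ the most; that spread is the uncertainty the data leave unconstrained. A grid search stands in for the paper's constrained optimization.

```python
import numpy as np

# Toy sketch of prediction deviation for a linear model y = a*x + b.
x = np.array([0.0, 1.0, 2.0])
y = np.array([0.1, 0.9, 2.1])

def loss(a, b):
    return float(np.sum((a * x + b - y) ** 2))

# All near-optimal fits: training loss within a tolerance of the best.
grid = [(a, b) for a in np.linspace(0.0, 2.0, 41)
               for b in np.linspace(-1.0, 1.0, 41)]
best = min(loss(a, b) for a, b in grid)
good = [(a, b) for a, b in grid if loss(a, b) <= best + 0.05]

# Prediction deviation: spread of the near-optimal models' predictions
# at a new, unobserved input.
x_new = 5.0
preds = [a * x_new + b for a, b in good]
prediction_deviation = max(preds) - min(preds)
```

Experiments that most shrink this spread are, in this framing, the most informative ones to run next.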

  2. Prediction uncertainty and optimal experimental design for learning dynamical systems

    NASA Astrophysics Data System (ADS)

    Letham, Benjamin; Letham, Portia A.; Rudin, Cynthia; Browne, Edward P.

    2016-06-01

    Dynamical systems are frequently used to model biological systems. When these models are fit to data, it is necessary to ascertain the uncertainty in the model fit. Here, we present prediction deviation, a metric of uncertainty that determines the extent to which observed data have constrained the model's predictions. This is accomplished by solving an optimization problem that searches for a pair of models that each provides a good fit for the observed data, yet has maximally different predictions. We develop a method for estimating a priori the impact that additional experiments would have on the prediction deviation, allowing the experimenter to design a set of experiments that would most reduce uncertainty. We use prediction deviation to assess uncertainty in a model of interferon-alpha inhibition of viral infection, and to select a sequence of experiments that reduces this uncertainty. Finally, we prove a theoretical result which shows that prediction deviation provides bounds on the trajectories of the underlying true model. These results show that prediction deviation is a meaningful metric of uncertainty that can be used for optimal experimental design.

  3. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithm of fertilization and P leaching contributed the largest output uncertainties. In comparison, the initiation of inorganic P in the soil layer and the transformation algorithm between P pools are less sensitive for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.
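
The coefficient of variation used above to summarize structure-induced uncertainty is simply the standard deviation of simulated outputs across structural alternatives divided by their mean; a minimal sketch with hypothetical NPS-P outputs:

```python
import statistics

# Minimal sketch: coefficient of variation (CV) across structural
# alternatives. The simulated outputs below are hypothetical.
def coefficient_of_variation(samples):
    return statistics.pstdev(samples) / statistics.mean(samples)

simulated_nps_p = [10.2, 10.8, 9.9, 10.5]  # hypothetical NPS-P outputs
cv = coefficient_of_variation(simulated_nps_p)
```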

  4. Laser triangulation: fundamental uncertainty in distance measurement.

    PubMed

    Dorsch, R G; Häusler, G; Herrmann, J M

    1994-03-01

    We discuss the uncertainty limit in distance sensing by laser triangulation. The uncertainty in distance measurement of laser triangulation sensors and other coherent sensors is limited by speckle noise. Speckle arises because of the coherent illumination in combination with rough surfaces. A minimum limit on the distance uncertainty is derived through speckle statistics. This uncertainty is a function of wavelength, observation aperture, and speckle contrast in the spot image. Surprisingly, it is the same distance uncertainty that we obtained from a single-photon experiment and from Heisenberg's uncertainty principle. Experiments confirm the theory. An uncertainty principle connecting lateral resolution and distance uncertainty is introduced. Design criteria for a sensor with minimum distance uncertainty are determined: small temporal coherence, small spatial coherence, and a large observation aperture.
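
A commonly quoted form of the speckle-limited distance uncertainty (assumed here for illustration; consult the paper for the exact expression and prefactor) is dz = C * lambda / (2*pi * sin(u) * sin(theta)), with speckle contrast C, observation aperture angle u, and triangulation angle theta:

```python
import math

# Hedged sketch: speckle-limited distance uncertainty in laser triangulation,
# using a commonly quoted form (the exact prefactor is an assumption here).
def triangulation_uncertainty(wavelength, contrast, aperture_angle, tri_angle):
    return contrast * wavelength / (
        2.0 * math.pi * math.sin(aperture_angle) * math.sin(tri_angle)
    )

# Example: 670 nm laser, full speckle contrast, 0.1 rad observation aperture,
# 30 degree triangulation angle -> an uncertainty on the order of microns.
dz = triangulation_uncertainty(670e-9, 1.0, 0.1, math.radians(30.0))
```

The scaling matches the abstract: the uncertainty grows with wavelength and speckle contrast and shrinks with a larger observation aperture.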

  5. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of reconstructed three-dimensional particle location that in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty that is estimated as an extension of the 2D PIV uncertainty framework. Finally the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases the variation of estimated uncertainty with the elemental volumetric PIV error sources are also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.

  6. Uncertainty in mapping urban air quality using crowdsourcing techniques

    NASA Astrophysics Data System (ADS)

    Schneider, Philipp; Castell, Nuria; Lahoz, William; Bartonova, Alena

    2016-04-01

    Small and low-cost sensors measuring various air pollutants have become available in recent years owing to advances in sensor technology. Such sensors have significant potential for improving high-resolution mapping of air quality in the urban environment as they can be deployed in comparatively large numbers and therefore are able to provide information at unprecedented spatial detail. However, such sensor devices are subject to significant and currently little understood uncertainties that affect their usability. Not only do these devices exhibit random errors and biases of occasionally substantial magnitudes, but these errors may also shift over time. In addition, there often tends to be significant inter-sensor variability even when supposedly identical sensors from the same manufacturer are used. We need to quantify these uncertainties accurately to make proper use of the information the sensors provide. Furthermore, when making use of the data and producing derived products such as maps, the measurement uncertainties that propagate throughout the analysis need to be clearly communicated to the scientific and non-scientific users of the map products. Based on recent experiences within the EU-funded projects CITI-SENSE and hackAIR we discuss the uncertainties along the entire processing chain when using crowdsourcing techniques for mapping urban air quality. Starting with the uncertainties exhibited by the sensors themselves, we present ways of quantifying the error characteristics of a network of low-cost microsensors and show suitable statistical metrics for summarizing them. Subsequently, we briefly present a data-fusion-based method for mapping air quality in the urban environment and illustrate how we propagate the uncertainties of the individual sensors throughout the mapping system, resulting in detailed maps that document the pixel-level uncertainty for each concentration field. Finally, we present methods for communicating the resulting spatial uncertainty
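
The kinds of statistical metrics mentioned can be sketched with hypothetical data (the projects' actual protocols are more involved): per-sensor bias and RMSE against a co-located reference instrument, plus the inter-sensor spread of biases across nominally identical units.

```python
import statistics

# Sketch with hypothetical data: summary metrics for a low-cost sensor
# network evaluated against a co-located reference instrument.
def bias(sensor, reference):
    return statistics.mean(s - r for s, r in zip(sensor, reference))

def rmse(sensor, reference):
    return statistics.mean((s - r) ** 2 for s, r in zip(sensor, reference)) ** 0.5

# Inter-sensor variability: spread of biases across "identical" units.
def inter_sensor_spread(sensors, reference):
    return statistics.pstdev(bias(s, reference) for s in sensors)

reference = [10.0, 12.0, 11.0]   # reference concentrations (hypothetical)
sensor_a = [10.5, 12.5, 11.5]    # unit with a +0.5 offset
sensor_b = [9.5, 11.5, 10.5]     # unit with a -0.5 offset
spread = inter_sensor_spread([sensor_a, sensor_b], reference)
```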

  7. Toward a definition of intolerance of uncertainty: a review of factor analytical studies of the Intolerance of Uncertainty Scale.

    PubMed

    Birrell, Jane; Meares, Kevin; Wilkinson, Andrew; Freeston, Mark

    2011-11-01

    Since its emergence in the early 1990s, a narrow but concentrated body of research has developed examining the role of intolerance of uncertainty (IU) in worry, and yet we still know little about its phenomenology. In an attempt to clarify our understanding of this construct, this paper traces the way in which our understanding and definition of IU have evolved throughout the literature. This paper also aims to further our understanding of IU by exploring the latent variables measured by the Intolerance of Uncertainty Scale (IUS; Freeston, Rheaume, Letarte, Dugas & Ladouceur, 1994). A review of the literature surrounding IU confirmed that the current definitions are categorical and lack specificity. A critical review of existing factor analytic studies was carried out in order to determine the underlying factors measured by the IUS. Systematic searches yielded 9 papers for review. Two factors with 12 consistent items emerged throughout the exploratory studies, and the stability of models containing these two factors was demonstrated in subsequent confirmatory studies. It is proposed that these factors represent (i) desire for predictability and an active engagement in seeking certainty, and (ii) paralysis of cognition and action in the face of uncertainty. It is suggested that these factors may represent approach and avoidance responses to uncertainty. Further research is required to confirm the construct validity of these factors and to determine the stability of this structure within clinical samples.

  8. Uncertainty Quantification for Nuclear Currents: A Bayesian χ-EFT view of the Triton and β- Decay

    NASA Astrophysics Data System (ADS)

    Wendt, Kyle

    2014-09-01

    Chiral Effective Field Theory (χ-EFT) provides a framework for the generation and systematic improvement of model independent inter-nucleon interaction Hamiltonians and nuclear current operators. Within χ-EFT, short and mid distance physics is encoded through a gradient expansion and multiple pion exchange parameterized by a set of low energy constants (LECs). The LECs are often constrained via non-linear least squares using nuclear bound state and scattering observables. This has produced reasonable low-energy descriptions in the past, but has been plagued by LECs that are unnaturally large. Additional issues manifest in medium mass nuclei where the χ-EFT Hamiltonians fail to adequately describe saturation properties. It has been suggested that Bayesian approaches may remedy the unnaturally large LECs using carefully selected priors. Other analyses have suggested that the inclusion and feedback of nuclear currents into the constraints of the LECs may improve saturation properties. We combine these approaches using Markov chain Monte Carlo (MCMC) to study and quantify uncertainties in the Triton and the χ-EFT axial-vector current, with the aim of providing a foundation for quantifying χ-EFT uncertainties for weak processes in nuclei.

  9. Uncertainty quantification for systems of conservation laws

    SciTech Connect

    Poette, Gael; Despres, Bruno; Lucor, Didier

    2009-04-20

    Uncertainty quantification through stochastic spectral methods has been recently applied to several kinds of non-linear stochastic PDEs. In this paper, we introduce a formalism based on kinetic theory to tackle uncertain hyperbolic systems of conservation laws with Polynomial Chaos (PC) methods. The idea is to introduce a new variable, the entropic variable, in bijection with our vector of unknowns, which we expand on the polynomial basis: by performing a Galerkin projection, we obtain a deterministic system of conservation laws. We state several properties of this deterministic system in the case of a general uncertain system of conservation laws. We then apply the method to the case of the inviscid Burgers' equation with random initial conditions and we present some preliminary results for the Euler system. We systematically compare results from our new approach to results from the stochastic Galerkin method. In the vicinity of discontinuities, the new method bounds the oscillations due to the Gibbs phenomenon to a certain range through the entropy of the system, without the use of any adaptive random-space discretizations. It is found to be more precise than the stochastic Galerkin method for smooth cases and, above all, for discontinuous cases.
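
One ingredient of such PC Galerkin projections can be sketched generically (this is not the paper's entropic formulation): projecting a quadratic nonlinearity like the Burgers flux onto a Legendre chaos basis requires the triple products <P_i P_j P_k>, which Gauss-Legendre quadrature computes exactly.

```python
import numpy as np

# Generic sketch: triple products <P_i P_j P_k> of Legendre polynomials under
# the uniform probability measure on [-1, 1], needed when Galerkin-projecting
# a quadratic term (as in Burgers' flux) onto a Polynomial Chaos basis.
order = 3
nodes, weights = np.polynomial.legendre.leggauss(2 * order + 1)
weights = weights / 2.0  # normalize to a probability measure on [-1, 1]

# Evaluate P_0..P_order at the quadrature nodes.
P = np.array([np.polynomial.legendre.Legendre.basis(i)(nodes)
              for i in range(order + 1)])
triple = np.einsum('in,jn,kn,n->ijk', P, P, P, weights)  # <P_i P_j P_k>
```

The quadrature with 2*order+1 nodes is exact for the degree-3*order integrands involved, so the tensor entries match the analytic values (e.g. <P_0 P_1 P_1> = 1/3).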

  10. Uncertainty in Measured Data and Model Predictions: Essential Components for Mobilizing Environmental Data and Modeling

    NASA Astrophysics Data System (ADS)

    Harmel, D.

    2014-12-01

    In spite of pleas for uncertainty analysis - such as Beven's (2006) "Should it not be required that every paper in both field and modeling studies attempt to evaluate the uncertainty in the results?" - the uncertainty associated with hydrology and water quality data is rarely quantified and rarely considered in model evaluation. This oversight, justified in the past by mainly tenuous philosophical concerns, diminishes the value of measured data and ignores the environmental and socio-economic benefits of improved decisions and policies based on data with estimated uncertainty. This oversight extends to researchers, who typically fail to estimate uncertainty in measured discharge and water quality data because of the additional effort required, a lack of adequate scientific understanding on the subject, and fear of negative perception if data with "high" uncertainty are reported; however, the benefits are certain. Furthermore, researchers have a responsibility for scientific integrity in reporting what is known and what is unknown, including the quality of measured data. In response we produced an uncertainty estimation framework and the first cumulative uncertainty estimates for measured water quality data (Harmel et al., 2006). From that framework, DUET-H/WQ was developed (Harmel et al., 2009). Application to several real-world data sets indicated that substantial uncertainty can be contributed by each data collection procedural category and that uncertainties typically increase in the order discharge < sediment < dissolved N and P < total N and P. Similarly, modelers address certain aspects of model uncertainty but ignore others, such as the impact of uncertainty in discharge and water quality data. Thus, we developed methods to incorporate prediction uncertainty as well as calibration/validation data uncertainty into model goodness-of-fit evaluation (Harmel and Smith, 2007; Harmel et al., 2010). These enhance model evaluation by: appropriately sharing burden with "data

  11. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
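
The propagation step in such adjoint-based uncertainty quantification is often the first-order "sandwich rule": with a sensitivity vector S of a figure of merit to the nuclear data and the data covariance matrix C, the relative variance of the figure of merit is S^T C S. A hedged sketch with illustrative numbers:

```python
import numpy as np

# Hedged sketch of the first-order "sandwich rule": relative variance of a
# figure of merit is S^T C S, with S the sensitivity vector to nuclear data
# and C the data covariance. All numbers below are illustrative.
S = np.array([0.5, -0.2, 0.1])      # relative sensitivities (illustrative)
C = np.diag([1e-4, 4e-4, 1e-4])     # relative covariance (illustrative)
rel_unc = float(np.sqrt(S @ C @ S))  # relative standard uncertainty
```

With these illustrative inputs the relative uncertainty comes out well under a percent, i.e. on the small (< 2%) scale quoted in the abstract.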

  12. A Stronger Multi-observable Uncertainty Relation

    NASA Astrophysics Data System (ADS)

    Song, Qiu-Cheng; Li, Jun-Li; Peng, Guang-Xiong; Qiao, Cong-Feng

    2017-03-01

    Uncertainty relation lies at the heart of quantum mechanics, characterizing the incompatibility of non-commuting observables in the preparation of quantum states. An important question is how to improve the lower bound of uncertainty relation. Here we present a variance-based sum uncertainty relation for N incompatible observables stronger than the simple generalization of an existing uncertainty relation for two observables. Further comparisons of our uncertainty relation with other related ones for spin-1/2 and spin-1 particles indicate that the obtained uncertainty relation gives a better lower bound.
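
A numerical illustration of the variance-based sum form (a generic qubit identity, not the paper's stronger bound): for the three Pauli observables and any pure qubit state, Var(sx) + Var(sy) + Var(sz) = 3 - |r|^2 = 2, because each Pauli squares to the identity and the Bloch vector r of a pure state has unit norm.

```python
import numpy as np

# Generic qubit illustration (not the paper's specific relation): the Pauli
# variances of any pure qubit state sum to 3 - |r|^2 = 2.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def variance(op, psi):
    mean = np.vdot(psi, op @ psi).real
    return np.vdot(psi, op @ (op @ psi)).real - mean ** 2

rng = np.random.default_rng(0)
for _ in range(100):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi = psi / np.linalg.norm(psi)
    total = sum(variance(op, psi) for op in (sx, sy, sz))
    assert abs(total - 2.0) < 1e-9  # the qubit sum-variance identity
```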

  13. Adjoint-Based Uncertainty Quantification with MCNP

    NASA Astrophysics Data System (ADS)

    Seifried, Jeffrey Edwin

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.

  14. A Stronger Multi-observable Uncertainty Relation

    PubMed Central

    Song, Qiu-Cheng; Li, Jun-Li; Peng, Guang-Xiong; Qiao, Cong-Feng

    2017-01-01

    Uncertainty relation lies at the heart of quantum mechanics, characterizing the incompatibility of non-commuting observables in the preparation of quantum states. An important question is how to improve the lower bound of uncertainty relation. Here we present a variance-based sum uncertainty relation for N incompatible observables stronger than the simple generalization of an existing uncertainty relation for two observables. Further comparisons of our uncertainty relation with other related ones for spin-1/2 and spin-1 particles indicate that the obtained uncertainty relation gives a better lower bound. PMID:28317917

  15. Video Scanning Hartmann Optical Tester (VSHOT) Uncertainty Analysis: Preprint

    SciTech Connect

    Lewandowski, A.; Gray, A.

    2010-10-01

    This purely analytical work is based primarily on the geometric optics of the system and shows sensitivities to various design and operational parameters. We discuss sources of error with measuring devices, instrument calibrations, and operator measurements for a parabolic trough test. In this paper, we include both the random (precision) and systematic (bias) errors for VSHOT testing and their contributions to the uncertainty. The contributing factors that we considered in this study are target tilt, target face to laser output distance, instrument vertical offset, scanner tilt, distance between the tool and the test piece, camera calibration, and scanner/calibration.

  16. Case studies of uncertainty analysis and explosives cleanup

    SciTech Connect

    Elmore, A.C.; Graff, T.

    1999-07-01

    Decision making for ground water restoration projects is sometimes difficult or complicated because of the large uncertainties inherent to evaluating hydrogeologic and contaminant conditions which cannot be directly observed over the large scale. Fiscal responsibility and professional ethics require that decision makers use best science approaches instead of relying on arbitrary selection processes. Three case studies illustrate how decision tree models and Monte Carlo analysis can be used by the practicing professional to systematically make important environmental restoration decisions. Each case study is based on formerly used defense site projects in the Midwest where explosives contamination is present in the ground water.

  17. Uncertainty relation for mutual information

    NASA Astrophysics Data System (ADS)

    Schneeloch, James; Broadbent, Curtis J.; Howell, John C.

    2014-12-01

    We postulate the existence of a universal uncertainty relation between the quantum and classical mutual informations between pairs of quantum systems. Specifically, we propose that the sum of the classical mutual information, determined by two mutually unbiased pairs of observables, never exceeds the quantum mutual information. We call this the complementary-quantum correlation (CQC) relation and prove its validity for pure states, for states with one maximally mixed subsystem, and for all states when one measurement is minimally disturbing. We provide results of a Monte Carlo simulation suggesting that the CQC relation is generally valid. Importantly, we also show that the CQC relation represents an improvement to an entropic uncertainty principle in the presence of a quantum memory, and that it can be used to verify an achievable secret key rate in the quantum one-time pad cryptographic protocol.

  18. MODEL VALIDATION AND UNCERTAINTY QUANTIFICATION.

    SciTech Connect

    Hemez, F.M.; Doebling, S.W.

    2000-10-01

    This session offers an open forum to discuss issues and directions of research in the areas of model updating, predictive quality of computer simulations, model validation and uncertainty quantification. Technical presentations review the state-of-the-art in nonlinear dynamics and model validation for structural dynamics. A panel discussion introduces the discussion on technology needs, future trends and challenges ahead, with an emphasis placed on soliciting participation of the audience. One of the goals is to show, through invited contributions, how other scientific communities are approaching and solving difficulties similar to those encountered in structural dynamics. The session also serves the purpose of presenting the on-going organization of technical meetings sponsored by the U.S. Department of Energy and dedicated to health monitoring, damage prognosis, model validation and uncertainty quantification in engineering applications. The session is part of the SD-2000 Forum, a forum to identify research trends, funding opportunities and to discuss the future of structural dynamics.

  19. Aspects of complementarity and uncertainty

    NASA Astrophysics Data System (ADS)

    Vathsan, Radhika; Qureshi, Tabish

    2016-08-01

    The two-slit experiment with quantum particles provides many insights into the behavior of quantum mechanics, including Bohr’s complementarity principle. Here, we analyze Einstein’s recoiling slit version of the experiment and show how the inevitable entanglement between the particle and the recoiling slit as a which-way detector is responsible for complementarity. We derive the Englert-Greenberger-Yasin duality from this entanglement, which can also be thought of as a consequence of sum-uncertainty relations between certain complementary observables of the recoiling slit. Thus, entanglement is an integral part of the which-way detection process, and so is uncertainty, though in a completely different way from that envisaged by Bohr and Einstein.

  20. Neutrino Scattering Uncertainties and their Role in Long Baseline Oscillation Experiments

    SciTech Connect

    D.A. Harris; G. Blazey; Arie Bodek; D. Boehnlein; S. Boyd; William Brooks; Antje Bruell; Howard S. Budd; R. Burnstein; D. Casper; A. Chakravorty; Michael Christy; Jesse Chvojka; M.A.C. Cummings; P. deBarbaro; D. Drakoulakos; J. Dunmore; Rolf Ent; Hugh Gallagher; David Gaskell; Ronald Gilman; Charles Glashausser; Wendy Hinton; Xiaodong Jiang; T. Kafka; O. Kamaev; Cynthia Keppel; M. Kostin; Sergey Kulagin; Gerfried Kumbartzki; Steven Manly; W.A. Mann; Kevin Mcfarland-porter; Wolodymyr Melnitchouk; Jorge Morfin; D. Naples; John Nelson; Gabriel Niculescu; Maria-ioana Niculescu; W. Oliver; Michael Paolone; Emmanuel Paschos; A. Pla-Dalmau; Ronald Ransome; C. Regis; P. Rubinov; V. Rykalin; Willis Sakumoto; P. Shanahan; N. Solomey; P. Spentzouris; P. Stamoulis; G. Tzanakos; Stephen Wood; F.X. Yumiceva; B. Ziemer; M. Zois

    2004-10-01

    The field of oscillation physics is about to make an enormous leap forward in statistical precision: first through the MINOS experiment in the coming year, and later through the NOvA and T2K experiments. Because of the relatively poor understanding of neutrino interactions in the energy ranges of these experiments, systematic errors can arise in interpreting far detector data that are as large as, or larger than, the expected statistical uncertainties. We describe how these systematic errors arise, and how specific measurements in a dedicated neutrino scattering experiment like MINERvA can reduce the cross section systematic errors to well below the statistical errors.

  1. Accounting for uncertainty in the quantification of the environmental impacts of Canadian pig farming systems.

    PubMed

    Mackenzie, S G; Leinonen, I; Ferguson, N; Kyriazakis, I

    2015-06-01

    The objective of the study was to develop a life cycle assessment (LCA) for pig farming systems that would account for uncertainty and variability in input data and allow systematic environmental impact comparisons between production systems. The environmental impacts of commercial pig production for 2 regions in Canada (Eastern and Western) were compared using a cradle-to-farm gate LCA. These systems had important contrasting characteristics such as typical feed ingredients used, herd performance, and expected emission factors from manure management. The study used detailed production data supplied by the industry and incorporated uncertainty/variation in all major aspects of the system including life cycle inventory data for feed ingredients, animal performance, energy inputs, and emission factors. The impacts were defined using 5 metrics (global warming potential, acidification potential, eutrophication potential (EP), abiotic resource use, and nonrenewable energy use) and were expressed per kilogram carcass weight at farm gate. Eutrophication potential was further separated into marine EP (MEP) and freshwater EP (FEP). Uncertainties in the model inputs were separated into 2 types: uncertainty in the data used to describe the system (α uncertainties) and uncertainty in impact calculations or background data that affects all systems equally (β uncertainties). The impacts of pig production in the 2 regions were systematically compared based on the differences in the systems (α uncertainties). The method of ascribing uncertainty influenced the outcomes. In eastern systems, EP, MEP, and FEP were lower (P < 0.05) when assuming that all uncertainty in the emission factors for leaching from manure application was β. This was mainly due to increased EP resulting from field emissions for typical ingredients in western diets. When uncertainty in these emission factors was assumed to be α, only FEP was lower in eastern systems (P < 0.05). The environmental impacts for

  2. Uncertainty analysis and robust trajectory linearization control of a flexible air-breathing hypersonic vehicle

    NASA Astrophysics Data System (ADS)

    Pu, Zhiqiang; Tan, Xiangmin; Fan, Guoliang; Yi, Jianqiang

    2014-08-01

    Flexible air-breathing hypersonic vehicles feature significant uncertainties which pose major challenges to robust controller designs. In this paper, four major categories of uncertainties are analyzed, that is, uncertainties associated with flexible effects, aerodynamic parameter variations, external environmental disturbances, and control-oriented modeling errors. A uniform nonlinear uncertainty model is explored for the first three uncertainties which lumps all uncertainties together and consequently is beneficial for controller synthesis. The fourth uncertainty is additionally considered in stability analysis. Based on these analyses, the starting point of the control design is to decompose the vehicle dynamics into five functional subsystems. Then a robust trajectory linearization control (TLC) scheme consisting of five robust subsystem controllers is proposed. In each subsystem controller, TLC is combined with the extended state observer (ESO) technique for uncertainty compensation. The stability of the overall closed-loop system with the four aforementioned uncertainties and additional singular perturbations is analyzed. Particularly, the stability of nonlinear ESO is also discussed from a Liénard system perspective. Finally, simulations demonstrate the control performance and uncertainty rejection ability of the robust scheme.

  3. Sonic Boom Pressure Signature Uncertainty Calculation and Propagation to Ground Noise

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Bretl, Katherine N.; Walker, Eric L.; Pinier, Jeremy T.

    2015-01-01

    The objective of this study was to outline an approach for the quantification of uncertainty in sonic boom measurements and to investigate the effect of various near-field uncertainty representation approaches on ground noise predictions. These approaches included a symmetric versus asymmetric uncertainty band representation and a dispersion technique based on a partial sum Fourier series that allows for the inclusion of random error sources in the uncertainty. The near-field uncertainty was propagated to the ground level, along with additional uncertainty in the propagation modeling. Estimates of perceived loudness were obtained for the various types of uncertainty representation in the near-field. Analyses were performed on three configurations of interest to the sonic boom community: the SEEB-ALR, the 69° Delta Wing, and the LM 1021-01. Results showed that representation of the near-field uncertainty plays a key role in ground noise predictions. Using a Fourier series based dispersion approach can double the amount of uncertainty in the ground noise compared to a pure bias representation. Compared to previous computational fluid dynamics results, uncertainty in ground noise predictions was greater when considering the near-field experimental uncertainty.

  4. Heisenberg uncertainty principles for an oscillatory integral operator

    NASA Astrophysics Data System (ADS)

    Castro, L. P.; Guerra, R. C.; Tuan, N. M.

    2017-01-01

    The main aim of this work is to obtain Heisenberg uncertainty principles for a specific oscillatory integral operator that exhibits different parameters in its sine and cosine phase components. Additionally, invertibility theorems, Parseval-type identities and Plancherel-type theorems are obtained.

  5. Uncertainty propagation in nuclear forensics.

    PubMed

    Pommé, S; Jerome, S M; Venchiarutti, C

    2014-07-01

    Uncertainty propagation formulae are presented for age dating in support of nuclear forensics. The age of radioactive material in this context refers to the time elapsed since a particular radionuclide was chemically separated from its decay product(s). The decay of the parent radionuclide and ingrowth of the daughter nuclide are governed by statistical decay laws. Mathematical equations allow calculation of the age of specific nuclear material through the atom ratio between parent and daughter nuclides, or through the activity ratio provided that the daughter nuclide is also unstable. The derivation of the uncertainty formulae of the age may present some difficulty to the user community and so the exact solutions, some approximations, a graphical representation and their interpretation are presented in this work. Typical nuclides of interest are actinides in the context of non-proliferation commitments. The uncertainty analysis is applied to a set of important parent-daughter pairs and the need for more precise half-life data is examined.
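    The kind of propagation discussed can be sketched numerically. In this minimal illustration (the nuclide pair, half-lives, and input uncertainties below are assumptions for demonstration, not values or formulae from the paper), the age is recovered from the daughter/parent atom ratio and its uncertainty is obtained by first-order propagation with numerical partial derivatives:

```python
import numpy as np

# Hedged sketch of radiochronometric age dating via the atom ratio
# R = N_daughter/N_parent, assuming a pure parent at separation
# (N_daughter(0) = 0). Nuclide pair and numbers are illustrative.
LN2 = np.log(2.0)

def age(R, hl_parent, hl_daughter):
    """Invert R(t) = lp/(ld-lp) * (1 - exp(-(ld-lp)*t)) for t."""
    lp, ld = LN2 / hl_parent, LN2 / hl_daughter
    return -np.log(1.0 - R * (ld - lp) / lp) / (ld - lp)

def age_uncertainty(R, uR, hlp, uhlp, hld, uhld, eps=1e-6):
    """GUM-style first-order propagation, partials by finite differences."""
    t0 = age(R, hlp, hld)
    total = 0.0
    for i, (x, ux) in enumerate([(R, uR), (hlp, uhlp), (hld, uhld)]):
        args = [R, hlp, hld]
        args[i] = x * (1.0 + eps)
        dtdx = (age(*args) - t0) / (x * eps)
        total += (dtdx * ux) ** 2
    return np.sqrt(total)

# Illustrative pair: 241Pu (T1/2 ~ 14.325 a) -> 241Am (T1/2 ~ 432.6 a)
t = age(0.5, 14.325, 432.6)
u = age_uncertainty(0.5, 0.005, 14.325, 0.006, 432.6, 0.6)
print(f"age = {t:.2f} +/- {u:.2f} a")
```

The same pattern extends to activity ratios when the daughter is also unstable; the exact analytic derivatives are what the paper derives in closed form.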

  6. Blade tip timing (BTT) uncertainties

    NASA Astrophysics Data System (ADS)

    Russhard, Pete

    2016-06-01

    Blade Tip Timing (BTT) is an alternative technique for characterising blade vibration in which non-contact timing probes (e.g. capacitance or optical probes), typically mounted on the engine casing (figure 1), are used to measure the time at which a blade passes each probe. This time is compared with the time at which the blade would have passed the probe had it been undergoing no vibration. For a number of years the aerospace industry has been sponsoring research into Blade Tip Timing technologies, developed as tools to obtain rotor blade tip deflections. These have been successful in demonstrating the potential of the technology, but have rarely produced quantitative data along with a demonstration of a traceable value for measurement uncertainty. BTT technologies have been developed under a cloak of secrecy by the gas turbine OEMs due to the competitive advantage offered if they could be shown to work. BTT measurements are sensitive to many variables, and there is a need to quantify the measurement uncertainty of the complete technology and to define a set of guidelines as to how BTT should be applied to different vehicles. The data shown in figure 2 were developed from a US government-sponsored program that brought together four different tip timing systems and a gas turbine engine test. Comparisons showed that they were just capable of obtaining measurements within a +/-25% uncertainty band when compared to strain gauges, even when using the same input data sets.

  7. Hierarchical Bayesian model averaging for hydrostratigraphic modeling: Uncertainty segregation and comparative evaluation

    NASA Astrophysics Data System (ADS)

    Tsai, Frank T.-C.; Elshall, Ahmed S.

    2013-09-01

    Analysts are often faced with competing propositions for each uncertain model component. How can we judge that we select a correct proposition(s) for an uncertain model component out of numerous possible propositions? We introduce the hierarchical Bayesian model averaging (HBMA) method as a multimodel framework for uncertainty analysis. The HBMA allows for segregating, prioritizing, and evaluating different sources of uncertainty and their corresponding competing propositions through a hierarchy of BMA models that forms a BMA tree. We apply the HBMA to conduct uncertainty analysis on the reconstructed hydrostratigraphic architectures of the Baton Rouge aquifer-fault system, Louisiana. Due to uncertainty in model data, structure, and parameters, multiple possible hydrostratigraphic models are produced and calibrated as base models. The study considers four sources of uncertainty. With respect to data uncertainty, the study considers two calibration data sets. With respect to model structure, the study considers three different variogram models, two geological stationarity assumptions and two fault conceptualizations. The base models are produced following a combinatorial design to allow for uncertainty segregation. Thus, these four uncertain model components with their corresponding competing model propositions result in 24 base models. The results show that the systematic dissection of the uncertain model components along with their corresponding competing propositions allows for detecting the robust model propositions and the major sources of uncertainty.
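    The averaging step at each level of such a BMA tree can be sketched as follows (a generic BMA illustration with invented numbers, not the HBMA implementation): model weights are derived from an information criterion, and the total predictive variance splits into a within-model term and a between-model term, the latter quantifying the uncertainty contributed by the competing propositions.

```python
import numpy as np

# Generic Bayesian model averaging sketch: weights from information
# criteria, and the standard BMA variance decomposition that a
# hierarchical BMA tree propagates level by level.

def bma(means, variances, ics):
    """means/variances: per-model predictions; ics: e.g. BIC values."""
    d = np.asarray(ics) - np.min(ics)
    w = np.exp(-0.5 * d)
    w /= w.sum()
    mean = np.sum(w * means)
    within = np.sum(w * variances)              # average model uncertainty
    between = np.sum(w * (means - mean) ** 2)   # model-choice uncertainty
    return mean, within, between, w

# Three competing propositions for one uncertain model component
mean, within, between, w = bma(
    means=np.array([10.0, 12.0, 11.0]),
    variances=np.array([1.0, 1.5, 1.2]),
    ics=np.array([100.0, 102.0, 101.0]))
print(mean, within + between)
```

Comparing the between-model terms across levels of the tree is what allows the major sources of uncertainty to be ranked, as the abstract describes.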

  8. Confronting dynamics and uncertainty in optimal decision making for conservation

    NASA Astrophysics Data System (ADS)

    Williams, Byron K.; Johnson, Fred A.

    2013-06-01

    The effectiveness of conservation efforts ultimately depends on the recognition that decision making, and the systems that it is designed to affect, are inherently dynamic and characterized by multiple sources of uncertainty. To cope with these challenges, conservation planners are increasingly turning to the tools of decision analysis, especially dynamic optimization methods. Here we provide a general framework for optimal, dynamic conservation and then explore its capacity for coping with various sources and degrees of uncertainty. In broadest terms, the dynamic optimization problem in conservation is choosing among a set of decision options at periodic intervals so as to maximize some conservation objective over the planning horizon. Planners must account for immediate objective returns, as well as the effect of current decisions on future resource conditions and, thus, on future decisions. Undermining the effectiveness of such a planning process are uncertainties concerning extant resource conditions (partial observability), the immediate consequences of decision choices (partial controllability), the outcomes of uncontrolled, environmental drivers (environmental variation), and the processes structuring resource dynamics (structural uncertainty). Where outcomes from these sources of uncertainty can be described in terms of probability distributions, a focus on maximizing the expected objective return, while taking state-specific actions, is an effective mechanism for coping with uncertainty. When such probability distributions are unavailable or deemed unreliable, a focus on maximizing robustness is likely to be the preferred approach. Here the idea is to choose an action (or state-dependent policy) that achieves at least some minimum level of performance regardless of the (uncertain) outcomes. 
We provide some examples of how the dynamic optimization problem can be framed for problems involving management of habitat for an imperiled species, conservation of a
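    The dynamic optimization problem described above is, in its simplest form, a finite-horizon stochastic dynamic program solved by backward induction: at each period, choose the action maximizing the immediate return plus the expected future value. A toy sketch (states, actions, rewards, and all transition probabilities are invented for illustration, not from the paper):

```python
import numpy as np

# Toy conservation MDP: resource states {poor, fair, good},
# actions {0: rest, 1: harvest}; all numbers are assumptions.
P = {  # P[a][s, s'] transition probabilities under each action
    0: np.array([[0.6, 0.4, 0.0],
                 [0.1, 0.6, 0.3],
                 [0.0, 0.2, 0.8]]),
    1: np.array([[0.9, 0.1, 0.0],
                 [0.5, 0.4, 0.1],
                 [0.1, 0.5, 0.4]]),
}
R = {0: np.array([0.0, 0.0, 0.0]),   # resting yields no immediate return
     1: np.array([0.2, 1.0, 2.0])}   # harvesting pays more in good states

def solve(horizon):
    """Backward induction: maximize expected return over the horizon."""
    V = np.zeros(3)
    policy = []
    for _ in range(horizon):
        Q = np.stack([R[a] + P[a] @ V for a in (0, 1)])
        policy.append(Q.argmax(axis=0))   # state-specific optimal action
        V = Q.max(axis=0)
    return V, policy[::-1]

V, policy = solve(horizon=10)
print(V, policy[0])
```

Partial observability, partial controllability, and structural uncertainty enter this framework by replacing the known state, transitions, or model with probability distributions over them, exactly the extensions the abstract enumerates.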

  9. Confronting dynamics and uncertainty in optimal decision making for conservation

    USGS Publications Warehouse

    Williams, Byron K.; Johnson, Fred A.

    2013-01-01

    The effectiveness of conservation efforts ultimately depends on the recognition that decision making, and the systems that it is designed to affect, are inherently dynamic and characterized by multiple sources of uncertainty. To cope with these challenges, conservation planners are increasingly turning to the tools of decision analysis, especially dynamic optimization methods. Here we provide a general framework for optimal, dynamic conservation and then explore its capacity for coping with various sources and degrees of uncertainty. In broadest terms, the dynamic optimization problem in conservation is choosing among a set of decision options at periodic intervals so as to maximize some conservation objective over the planning horizon. Planners must account for immediate objective returns, as well as the effect of current decisions on future resource conditions and, thus, on future decisions. Undermining the effectiveness of such a planning process are uncertainties concerning extant resource conditions (partial observability), the immediate consequences of decision choices (partial controllability), the outcomes of uncontrolled, environmental drivers (environmental variation), and the processes structuring resource dynamics (structural uncertainty). Where outcomes from these sources of uncertainty can be described in terms of probability distributions, a focus on maximizing the expected objective return, while taking state-specific actions, is an effective mechanism for coping with uncertainty. When such probability distributions are unavailable or deemed unreliable, a focus on maximizing robustness is likely to be the preferred approach. Here the idea is to choose an action (or state-dependent policy) that achieves at least some minimum level of performance regardless of the (uncertain) outcomes. 
We provide some examples of how the dynamic optimization problem can be framed for problems involving management of habitat for an imperiled species, conservation of a

  10. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  11. Effect of atmospheric flux uncertainties on the determination of the neutrino mass hierarchy

    NASA Astrophysics Data System (ADS)

    Sandroos, Joakim; Erhardt, Thomas; Arlen, Tim; Böser, Sebastian

    2016-04-01

    The next generation of large-volume neutrino telescopes will include low-energy subarrays which will be able to measure neutrinos with energies of a few GeV. In this energy range the primary signal below the horizon is neutrinos created by cosmic ray interactions in the atmosphere. The measured event rate will depend on the neutrino mass hierarchy, allowing determination of this quantity to a significance level of about 3.5 sigma within a 5-year period, mostly limited by systematic uncertainties. We present here the impact of uncertainties in the atmospheric neutrino flux normalization on the determination of the neutrino mass hierarchy. We suggest constraining the systematic uncertainties by including the downgoing neutrino sample, which will increase the significance. This work was performed using simulation data from the low-energy extension to the IceCube detector located at the geographic South Pole, PINGU, and is relevant to a wide range of other experiments.


  12. Absolute frequency measurement with uncertainty below 1× 10^{-15} using International Atomic Time

    NASA Astrophysics Data System (ADS)

    Hachisu, Hidekazu; Petit, Gérard; Ido, Tetsuya

    2017-01-01

    The absolute frequency of the ^{87}Sr clock transition measured in 2015 (Jpn J Appl Phys 54:112401, 2015) was reevaluated using an improved frequency link to the SI second. The scale interval of International Atomic Time (TAI) that we used as the reference was calibrated for an evaluation interval of 5 days instead of the conventional interval of 1 month which is regularly employed in Circular T. The calibration on a 5-day basis removed the uncertainty in assimilating the TAI scale of the 5-day mean to that of the 1-month mean. The reevaluation resulted, for the first time without local cesium fountains, in a total uncertainty at the 10^{-16} level. Since there are presumably no correlations among systematic shifts of cesium fountains worldwide, the measurement is not limited by the systematic uncertainty of a specific primary frequency standard.

  13. Uncertainties of responses calculated with a 'tuned' library - Geometrical and algebraic insights

    SciTech Connect

    Perel, R.L.

    2011-07-01

    The standard American evaluated nuclear data library ENDF/B-VII was 'tuned' based on simple measured critical assemblies. This tuning was not done according to a fully defined mathematical algorithm, such as the adjustment algorithm. In this work, we investigate how tuning affects the uncertainties (covariances) of the cross-section libraries. First, we analyze what happens to cross-section uncertainties as a result of adjustment. The effect of adjustment on the uncertainties is geometrically demonstrated for simple cases. For those parts of the sensitivities of the assembly to be calculated that are parallel to the sensitivities of the assemblies on which adjustment was based, there is significant reduction in the uncertainties. For orthogonal parts, there is no change in the uncertainties. These findings are algebraically proven based on the adjustment algorithm. Then we analyze the differences between tuned libraries and adjusted libraries. We conclude that for tuned libraries, the uncertainties in the direction of sensitivities on which adjustment or tuning were based are improved, similar to the improvement for an adjusted library. However, the displacement of the nominal values of the library parameters to their tuned value, instead of their adjusted value, adds an additional uncertainty. This additional uncertainty is typically small in the direction that was improved by adjustment. The magnitude of the additional uncertainty in perpendicular directions depends on the particular details of the tuning performed. (authors)

  14. Quantifying Global Uncertainties in a Simple Microwave Rainfall Algorithm

    NASA Technical Reports Server (NTRS)

    Kummerow, Christian; Berg, Wesley; Thomas-Stahle, Jody; Masunaga, Hirohiko

    2006-01-01

    While a large number of methods exist in the literature for retrieving rainfall from passive microwave brightness temperatures, little has been written about the quantitative assessment of the expected uncertainties in these rainfall products at various time and space scales. The latter is the result of two factors: sparse validation sites over most of the world's oceans, and algorithm sensitivities to rainfall regimes that cause inconsistencies against validation data collected at different locations. To make progress in this area, a simple probabilistic algorithm is developed. The algorithm uses an a priori database constructed from the Tropical Rainfall Measuring Mission (TRMM) radar data coupled with radiative transfer computations. Unlike efforts designed to improve rainfall products, this algorithm takes a step backward in order to focus on uncertainties. In addition to inversion uncertainties, the construction of the algorithm allows errors resulting from incorrect databases, incomplete databases, and time- and space-varying databases to be examined. These are quantified. Results show that the simple algorithm reduces errors introduced by imperfect knowledge of precipitation radar (PR) rain by a factor of 4 relative to an algorithm that is tuned to the PR rainfall. Database completeness does not introduce any additional uncertainty at the global scale, while climatologically distinct space/time domains add approximately 25% uncertainty that cannot be detected by a radiometer alone. Of this value, 20% is attributed to changes in cloud morphology and microphysics, while 5% is a result of changes in the rain/no-rain thresholds. All but 2%-3% of this variability can be accounted for by considering the implicit assumptions in the algorithm. Additional uncertainties introduced by the details of the algorithm formulation are not quantified in this study because of the need for independent measurements that are beyond the scope of this paper. 
A validation strategy

  15. Facility Measurement Uncertainty Analysis at NASA GRC

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin

    2016-01-01

    This presentation provides an overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. It includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties).

  16. Two new kinds of uncertainty relations

    NASA Technical Reports Server (NTRS)

    Uffink, Jos

    1994-01-01

    We review a statistical-geometrical and a generalized entropic approach to the uncertainty principle. Both approaches provide a strengthening and generalization of the standard Heisenberg uncertainty relations, but in different directions.

  17. Uncertainty of Rotating Shadowband Irradiometers and Si-Pyranometers Including the Spectral Irradiance Error

    SciTech Connect

    Wilbert, Stefan; Kleindiek, Stefan; Nouri, Bijan; Geuder, Norbert; Habte, Aron; Schwandt, Marko; Vignola, Frank

    2016-05-31

    Concentrating solar power projects require accurate direct normal irradiance (DNI) data including uncertainty specifications for plant layout and cost calculations. Ground measured data are necessary to obtain the required level of accuracy and are often obtained with Rotating Shadowband Irradiometers (RSI) that use photodiode pyranometers and correction functions to account for systematic effects. The uncertainty of Si-pyranometers has been investigated, but so far mostly empirical studies have been published, or decisive uncertainty influences had to be estimated from experience in analytical studies. One of the most crucial estimated influences is the spectral irradiance error, because Si-photodiode-pyranometers only detect visible and near-infrared radiation and have a spectral response that varies strongly within this wavelength interval. Furthermore, analytic studies did not discuss the role of correction functions and the uncertainty introduced by imperfect shading. In order to further improve the bankability of RSI and Si-pyranometer data, a detailed uncertainty analysis following the Guide to the Expression of Uncertainty in Measurement (GUM) has been carried out. The study defines a method for the derivation of the spectral error and spectral uncertainties and presents quantitative values of the spectral and overall uncertainties. Data from the PSA station in southern Spain were selected for the analysis. Average standard uncertainties for corrected 10 min data of 2% for global horizontal irradiance (GHI), and 2.9% for DNI (for GHI and DNI over 300 W/m2) were found for the 2012 yearly dataset when separate GHI and DHI calibration constants were used. The uncertainty at 1 min resolution was also analyzed. The effect of correction functions is significant. The uncertainties found in this study are consistent with results of previous empirical studies.
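    The GUM propagation underlying such a budget can be sketched generically (the input values and standard uncertainties below are invented, not the paper's budget), using the relation an RSI evaluates, DNI = (GHI - DHI) / cos(sza):

```python
import numpy as np

# GUM-style combined standard uncertainty:
#   u_c(y) = sqrt( sum_i (df/dx_i)^2 * u(x_i)^2 )
# Partial derivatives are taken by finite differences for generality.

def dni(ghi, dhi, sza):
    """DNI from global and diffuse horizontal irradiance [W/m2]."""
    return (ghi - dhi) / np.cos(sza)

def combined_u(f, x, u, eps=1e-6):
    """First-order combined standard uncertainty of y = f(*x)."""
    y0 = f(*x)
    total = 0.0
    for i in range(len(x)):
        xp = list(x)
        h = abs(x[i]) * eps or eps
        xp[i] = x[i] + h
        total += ((f(*xp) - y0) / h) ** 2 * u[i] ** 2
    return np.sqrt(total)

x = (800.0, 100.0, np.radians(30.0))   # GHI, DHI [W/m2], zenith [rad]
u = (16.0, 3.0, np.radians(0.1))       # assumed standard uncertainties
print(dni(*x), combined_u(dni, x, u))
```

In the full GUM analysis each input uncertainty (including the spectral irradiance error and the correction functions) would appear as one of these components, with correlations handled explicitly where present.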

  18. Uncertainty of rotating shadowband irradiometers and Si-pyranometers including the spectral irradiance error

    NASA Astrophysics Data System (ADS)

    Wilbert, Stefan; Kleindiek, Stefan; Nouri, Bijan; Geuder, Norbert; Habte, Aron; Schwandt, Marko; Vignola, Frank

    2016-05-01

    Concentrating solar power projects require accurate direct normal irradiance (DNI) data including uncertainty specifications for plant layout and cost calculations. Ground measured data are necessary to obtain the required level of accuracy and are often obtained with Rotating Shadowband Irradiometers (RSI) that use photodiode pyranometers and correction functions to account for systematic effects. The uncertainty of Si-pyranometers has been investigated, but so far mostly empirical studies have been published, or decisive uncertainty influences had to be estimated from experience in analytical studies. One of the most crucial estimated influences is the spectral irradiance error, because Si-photodiode-pyranometers only detect visible and near-infrared radiation and have a spectral response that varies strongly within this wavelength interval. Furthermore, analytic studies did not discuss the role of correction functions and the uncertainty introduced by imperfect shading. In order to further improve the bankability of RSI and Si-pyranometer data, a detailed uncertainty analysis following the Guide to the Expression of Uncertainty in Measurement (GUM) has been carried out. The study defines a method for the derivation of the spectral error and spectral uncertainties and presents quantitative values of the spectral and overall uncertainties. Data from the PSA station in southern Spain were selected for the analysis. Average standard uncertainties for corrected 10 min data of 2 % for global horizontal irradiance (GHI), and 2.9 % for DNI (for GHI and DNI over 300 W/m²) were found for the 2012 yearly dataset when separate GHI and DHI calibration constants were used. The uncertainty at 1 min resolution was also analyzed. The effect of correction functions is significant. The uncertainties found in this study are consistent with results of previous empirical studies.

  19. An update on the uncertainties of water vapor measurements using cryogenic frost point hygrometers

    NASA Astrophysics Data System (ADS)

    Vömel, Holger; Naebert, Tatjana; Dirksen, Ruud; Sommer, Michael

    2016-08-01

    Long time series of observations of essential climate variables in the troposphere and stratosphere are often impacted by inconsistencies in instrumentation and ambiguities in the interpretation of the data. To reduce these problems of long-term data series, all measurements should include an estimate of their uncertainty and a description of their sources. Here we present an update of the uncertainties for tropospheric and stratospheric water vapor observations using the cryogenic frost point hygrometer (CFH). The largest source of measurement uncertainty is the controller stability, which is discussed here in detail. We describe a method to quantify this uncertainty for each profile based on the measurements. We also show the importance of a manufacturer-independent ground check, which is an essential tool to continuously monitor the uncertainty introduced by instrument variability. A small bias, which has previously been indicated in lower tropospheric measurements, is described here in detail and has been rectified. Under good conditions, the total from all sources of uncertainty of frost point or dew point measurements using the CFH can be better than 0.2 K. Systematic errors, which are most likely to impact long-term climate series, are verified to be less than 0.1 K. The impact of the radiosonde pressure uncertainty on the mixing ratio for properly processed radiosondes is considered small. The mixing ratio uncertainty may be as low as 2 to 3 %. The impact of the ambient temperature uncertainty on relative humidity (RH) is generally larger than that of the frost point uncertainty. The relative RH uncertainty may be as low as 2 % in the lower troposphere and 5 % in the tropical tropopause region.

  20. Uncertainty Quantification in Climate Modeling

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
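    A one-dimensional Polynomial Chaos expansion illustrates the idea on a toy problem (an illustration, not the CLM study): expand f(ξ) = exp(ξ) with ξ ~ N(0,1) in probabilists' Hermite polynomials; the output mean and variance then follow directly from the coefficients, which is what makes PC surrogates cheap to propagate.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

# 1-D Polynomial Chaos sketch: f(xi) = exp(xi), xi ~ N(0,1),
# expanded in probabilists' Hermite polynomials He_k.
# Coefficients: c_k = E[f * He_k] / E[He_k^2], with E[He_k^2] = k!.
order, nq = 8, 30
x, w = He.hermegauss(nq)          # Gauss-Hermite-E nodes and weights
w = w / np.sqrt(2 * np.pi)        # normalize to the N(0,1) measure

def coeff(k):
    psi = He.hermeval(x, [0.0] * k + [1.0])   # He_k at the nodes
    return np.sum(w * np.exp(x) * psi) / factorial(k)

c = np.array([coeff(k) for k in range(order + 1)])
mean = c[0]
var = sum(c[k] ** 2 * factorial(k) for k in range(1, order + 1))
print(mean, var)   # compare: exp(0.5) = 1.6487..., e*(e-1) = 4.6708...
```

For a climate model the same machinery is applied with the expensive simulator in place of exp(ξ) and with sparse, Bayesian-regularized coefficient inference rather than dense quadrature, since only a handful of model runs are affordable.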

  1. Systematic review automation technologies.

    PubMed

    Tsafnat, Guy; Glasziou, Paul; Choong, Miew Keen; Dunn, Adam; Galgani, Filippo; Coiera, Enrico

    2014-07-09

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time.
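
Of the tasks listed, the meta-analysis calculation is the most mechanical to automate. A minimal fixed-effect (inverse-variance) pooling sketch, using two hypothetical trial effect sizes, is:

```python
import math

def fixed_effect_pool(effects, variances):
    """Fixed-effect meta-analysis: weight each trial's effect by 1/variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * y for w, y in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))  # standard error of the pooled effect
    return pooled, se

# Two hypothetical trials: effect estimates and their variances.
pooled, se = fixed_effect_pool([0.2, 0.5], [0.04, 0.01])
# pooled = (0.2/0.04 + 0.5/0.01) / (1/0.04 + 1/0.01) = 55/125 = 0.44
```

A random-effects model would add a between-trial variance component to each weight, but the pipeline shape (extract effects and variances, pool, report) is the same.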

  3. Associating uncertainty with datasets using Linked Data and allowing propagation via provenance chains

    NASA Astrophysics Data System (ADS)

    Car, Nicholas; Cox, Simon; Fitch, Peter

    2015-04-01

    With earth-science datasets increasingly being published to enable re-use in projects disassociated from the original data acquisition or generation, there is an urgent need for associated metadata to be connected, in order to guide their application. In particular, provenance traces should support the evaluation of data quality and reliability. However, while standards for describing provenance are emerging (e.g. PROV-O), these do not include the necessary statistical descriptors and confidence assessments. UncertML has a mature conceptual model that may be used to record uncertainty metadata. However, by itself UncertML does not support the representation of uncertainty of multi-part datasets, and provides no direct way of associating the uncertainty information (metadata in relation to a dataset) with dataset objects. We present a method to address both these issues by combining UncertML with PROV-O, and delivering resulting uncertainty-enriched provenance traces through the Linked Data API. UncertProv extends the PROV-O provenance ontology with an RDF formulation of the UncertML conceptual model elements, adds further elements to support uncertainty representation without a conceptual model and the integration of UncertML through links to documents. The Linked Data API provides a systematic way of navigating from dataset objects to their UncertProv metadata and back again. The Linked Data API's 'views' capability enables access to UncertML and non-UncertML uncertainty metadata representations for a dataset. With this approach, it is possible to access and navigate the uncertainty metadata associated with a published dataset using standard semantic web tools, such as SPARQL queries. Where the uncertainty data follows the UncertML model it can be automatically interpreted and may also support automatic uncertainty propagation.
Repositories wishing to enable uncertainty propagation for all datasets must ensure that all elements that are associated with uncertainty
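
The navigation pattern described (dataset object → uncertainty metadata → statistical descriptors) can be sketched with an in-memory triple store. The predicate and resource names below are illustrative stand-ins, not the actual PROV-O/UncertProv vocabulary:

```python
# Tiny in-memory RDF-style triple store; the predicates are hypothetical
# stand-ins for PROV-O / UncertProv terms, not the real vocabulary.
triples = [
    ("ex:dataset1", "prov:wasGeneratedBy",  "ex:activity1"),
    ("ex:dataset1", "un:hasUncertainty",    "ex:unc1"),
    ("ex:unc1",     "un:distribution",      "NormalDistribution"),
    ("ex:unc1",     "un:standardDeviation", "0.35"),
]

def objects(subject, predicate):
    """Follow one link: all objects matching (subject, predicate, ?o)."""
    return [o for s, p, o in triples if s == subject and p == predicate]

# Navigate from the dataset object to its uncertainty description.
unc = objects("ex:dataset1", "un:hasUncertainty")[0]
sd = float(objects(unc, "un:standardDeviation")[0])
```

In the real system this navigation is done with SPARQL over published Linked Data rather than a Python list, but the traversal logic is the same: hop from the dataset node to its uncertainty node, then read off machine-interpretable descriptors for propagation.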

  4. Measuring, Estimating, and Deciding under Uncertainty.

    PubMed

    Michel, Rolf

    2016-03-01

    The problem of uncertainty as a general consequence of incomplete information and the approach to quantify uncertainty in metrology is addressed. Then, this paper discusses some of the controversial aspects of the statistical foundation of the concepts of uncertainty in measurements. The basics of the ISO Guide to the Expression of Uncertainty in Measurement as well as of characteristic limits according to ISO 11929 are described and the needs for a revision of the latter standard are explained.

  5. Measurement Issues for Energy Efficient Commercial Buildings: Productivity and Performance Uncertainties

    SciTech Connect

    Jones, D.W.

    2002-05-16

    In previous reports, we have identified two potentially important issues, solutions to which would increase the attractiveness of DOE-developed technologies in commercial buildings energy systems. One issue concerns the fact that in addition to saving energy, many new technologies offer non-energy benefits that contribute to building productivity (firm profitability). The second issue is that new technologies are typically unproven in the eyes of decision makers and must bear risk premiums that offset cost advantages resulting from laboratory calculations. Even though a compelling case can be made for the importance of these issues, for building decision makers to incorporate them in business decisions and for DOE to use them in R&D program planning there must be robust empirical evidence of their existence and size. This paper investigates how such measurements could be made and offers recommendations as to preferred options. There is currently little systematic information on either of these concepts in the literature. Of the two there is somewhat more information on non-energy benefits, but little as regards office buildings. Office building productivity impacts can be observed casually, but must be estimated statistically, because buildings have many interacting attributes and observations based on direct behavior can easily confuse the process of attribution. For example, absenteeism can be easily observed. However, absenteeism may be down because a more healthy space conditioning system was put into place, because the weather was milder, or because firm policy regarding sick days had changed. There is also a general dearth of appropriate information for purposes of estimation. To overcome these difficulties, we propose developing a new data base and applying the technique of hedonic price analysis. This technique has been used extensively in the analysis of residential dwellings. There is also a literature on its application to commercial and industrial

  6. Constraining uncertainties in particle-wall deposition correction during SOA formation in chamber experiments

    NASA Astrophysics Data System (ADS)

    Nah, Theodora; McVay, Renee C.; Pierce, Jeffrey R.; Seinfeld, John H.; Ng, Nga L.

    2017-02-01

    The effect of vapor-wall deposition on secondary organic aerosol (SOA) formation has gained significant attention; however, uncertainties in experimentally derived SOA mass yields due to uncertainties in particle-wall deposition remain. Different approaches have been used to correct for particle-wall deposition in SOA formation studies, each having its own set of assumptions in determining the particle-wall loss rate. In volatile and intermediate-volatility organic compound (VOC and IVOC) systems in which SOA formation is governed by kinetically limited growth, the effect of vapor-wall deposition on SOA mass yields can be constrained by using high surface area concentrations of seed aerosol to promote the condensation of SOA-forming vapors onto seed aerosol instead of the chamber walls. However, under such high seed aerosol levels, the presence of significant coagulation may complicate the particle-wall deposition correction. Here, we present a model framework that accounts for coagulation in chamber studies in which high seed aerosol surface area concentrations are used. For the α-pinene ozonolysis system, we find that after accounting for coagulation, SOA mass yields remain approximately constant when high seed aerosol surface area concentrations (≥ 8000 µm² cm⁻³) are used, consistent with our prior study (Nah et al., 2016) showing that α-pinene ozonolysis SOA formation is governed by quasi-equilibrium growth. In addition, we systematically assess the uncertainties in the calculated SOA mass concentrations and yields between four different particle-wall loss correction methods over the series of α-pinene ozonolysis experiments. At low seed aerosol surface area concentrations (< 3000 µm² cm⁻³), the SOA mass yields at peak SOA growth obtained from the particle-wall loss correction methods agree within 14 %. However, at high seed aerosol surface area concentrations (≥ 8000 µm² cm⁻³), the SOA mass yields at peak SOA growth obtained from different particle
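
One common family of particle-wall loss corrections treats deposition as first order in the suspended concentration and adds the deposited material back by integrating k·C_susp over time. A minimal sketch, with a hypothetical size-independent loss rate k (the coagulation correction discussed above would modify k and the size distribution, which this toy omits):

```python
def wall_loss_corrected(times, c_susp, k):
    """First-order particle-wall loss correction:
    C_corr(t) = C_susp(t) + k * integral_0^t C_susp dt  (trapezoidal rule).

    times  -- sample times (s), ascending
    c_susp -- measured suspended concentrations (ug m^-3)
    k      -- first-order wall-loss rate (s^-1), assumed size-independent
    """
    corrected, deposited = [], 0.0
    for i, (t, c) in enumerate(zip(times, c_susp)):
        if i > 0:
            dt = t - times[i - 1]
            deposited += k * 0.5 * (c + c_susp[i - 1]) * dt
        corrected.append(c + deposited)
    return corrected

# Constant suspended concentration: the deposited mass grows linearly in time.
corr = wall_loss_corrected([0, 600, 1200], [10.0, 10.0, 10.0], k=1e-4)
```

With C_susp fixed at 10 µg m⁻³ and k = 10⁻⁴ s⁻¹, the correction adds 0.6 µg m⁻³ per 600 s, giving 10.0, 10.6, 11.2. The four methods compared in the paper differ precisely in how k is determined and whether deposited particles continue to exchange vapor.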

  7. Brief review of uncertainty quantification for particle image velocimetry

    NASA Astrophysics Data System (ADS)

    Farias, M. H.; Teixeira, R. S.; Koiller, J.; Santos, A. M.

    2016-07-01

    Metrological studies of particle image velocimetry (PIV) are recent in the literature, and attempts to evaluate the uncertainty quantification (UQ) of the PIV velocity field are in evidence. Therefore, a short review of the main sources of uncertainty in PIV and of the available methodologies for their quantification is presented. In addition, the potential of some mathematical techniques from the area of geometric mechanics and control that could interest the fluids UQ community is highlighted. “We must measure what is measurable and make measurable what cannot be measured” (Galileo)

  8. Uncertainty Analysis and Order-by-Order Optimization of Chiral Nuclear Interactions

    NASA Astrophysics Data System (ADS)

    Carlsson, B. D.; Ekström, A.; Forssén, C.; Strömberg, D. Fahlin; Jansen, G. R.; Lilja, O.; Lindby, M.; Mattsson, B. A.; Wendt, K. A.

    2016-01-01

    Chiral effective field theory (χEFT) provides a systematic approach to describe low-energy nuclear forces. Moreover, χEFT is able to provide well-founded estimates of statistical and systematic uncertainties, although this unique advantage has not yet been fully exploited. We fill this gap by performing an optimization and statistical analysis of all the low-energy constants (LECs) up to next-to-next-to-leading order. Our optimization protocol corresponds to a simultaneous fit to scattering and bound-state observables in the pion-nucleon, nucleon-nucleon, and few-nucleon sectors, thereby utilizing the full model capabilities of χEFT. Finally, we study the effect on other observables by demonstrating forward-error-propagation methods that can easily be adopted by future works. We employ mathematical optimization and implement automatic differentiation to attain efficient and machine-precise first- and second-order derivatives of the objective function with respect to the LECs. This is also vital for the regression analysis. We use power-counting arguments to estimate the systematic uncertainty that is inherent to χEFT, and we construct chiral interactions at different orders with quantified uncertainties. Statistical error propagation is compared with Monte Carlo sampling, showing that statistical errors are, in general, small compared to systematic ones. In conclusion, we find that a simultaneous fit to different sets of data is critical to (i) identify the optimal set of LECs, (ii) capture all relevant correlations, (iii) reduce the statistical uncertainty, and (iv) attain order-by-order convergence in χEFT. Furthermore, certain systematic uncertainties in the few-nucleon sector are shown to get substantially magnified in the many-body sector, in particular when varying the cutoff in the chiral potentials. The methodology and results presented in this paper open a new frontier for uncertainty quantification in ab initio nuclear theory.
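
The forward-error-propagation step can be sketched generically: linearize the observable around the optimal parameters and push the parameter covariance through the Jacobian, var(O) ≈ J Σ Jᵀ. This toy uses central differences in place of the paper's automatic differentiation, and an invented two-parameter observable:

```python
def propagated_variance(f, params, cov, h=1e-6):
    """Linear (first-order) error propagation: var(f) ~= J @ cov @ J^T,
    with the Jacobian J estimated by central finite differences."""
    n = len(params)
    jac = []
    for i in range(n):
        up = list(params); up[i] += h
        dn = list(params); dn[i] -= h
        jac.append((f(up) - f(dn)) / (2 * h))
    return sum(jac[i] * cov[i][j] * jac[j]
               for i in range(n) for j in range(n))

# Toy observable of two "LECs" with uncorrelated uncertainties.
var = propagated_variance(lambda p: p[0] + 2.0 * p[1],
                          params=[1.0, 0.5],
                          cov=[[0.01, 0.0], [0.0, 0.04]])
# Exact linear result: 1^2 * 0.01 + 2^2 * 0.04 = 0.17
```

For a linear observable the result is exact; for nonlinear observables this first-order estimate is what the Monte Carlo sampling mentioned in the abstract serves to validate.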

  9. Regarding Uncertainty in Teachers and Teaching

    ERIC Educational Resources Information Center

    Helsing, Deborah

    2007-01-01

    The literature on teacher uncertainty suggests that it is a significant and perhaps inherent feature of teaching. Yet there is disagreement about the effects of these uncertainties on teachers as well as about the ways that teachers should regard them. Recognition of uncertainties can be viewed alternatively as a liability or an asset to effective…

  10. Risk Analysis and Uncertainty: Implications for Counselling

    ERIC Educational Resources Information Center

    Hassenzahl, David

    2004-01-01

    Over the past two decades, the risk analysis community has made substantial advances in understanding and describing uncertainty. Uncertainty is ubiquitous, complex, both quantitative and qualitative in nature, and often irreducible. Uncertainty thus creates a challenge when using risk analysis to evaluate the rationality of group and individual…

  11. Identifying the Rhetoric of Uncertainty Reduction.

    ERIC Educational Resources Information Center

    Williams, David E.

    Offering a rhetorical perspective of uncertainty reduction, this paper (1) discusses uncertainty reduction theory and dramatism; (2) identifies rhetorical strategies inherent in C. W. Berger and R. J. Calabrese's theory; (3) extends predicted outcome value to influenced outcome value; and (4) argues that the goal of uncertainty reduction and…

  12. Quantum mechanics and the generalized uncertainty principle

    SciTech Connect

    Bang, Jang Young; Berger, Micheal S.

    2006-12-15

    The generalized uncertainty principle has been described as a general consequence of incorporating a minimal length from a theory of quantum gravity. We consider a simple quantum mechanical model where the operator corresponding to position has discrete eigenvalues and show how the generalized uncertainty principle results for minimum uncertainty wave packets.
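
Concretely, the generalized uncertainty principle usually discussed in this context follows from a deformed commutator; in a standard form (with deformation parameter β, not specific to this paper's discrete model):

```latex
[\hat{x}, \hat{p}] = i\hbar \left( 1 + \beta \hat{p}^2 \right)
\quad \Longrightarrow \quad
\Delta x \,\ge\, \frac{\hbar}{2} \left( \frac{1}{\Delta p} + \beta \, \Delta p \right),
```

and minimizing the right-hand side over Δp (at Δp = 1/√β, assuming ⟨p̂⟩ = 0) gives the minimal length Δx_min = ħ√β, which is the feature the minimum-uncertainty wave packets of the abstract realize.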

  13. WE-B-19A-01: SRT II: Uncertainties in SRT

    SciTech Connect

    Dieterich, S; Schlesinger, D; Geneser, S

    2014-06-15

    SRS delivery has undergone major technical changes in the last decade, transitioning from predominantly frame-based treatment delivery to image-guided, frameless SRS. It is important for medical physicists working in SRS to understand the magnitude and sources of uncertainty involved in delivering SRS treatments for a multitude of technologies (Gamma Knife, CyberKnife, linac-based SRS and protons). Sources of SRS planning and delivery uncertainty include dose calculation, dose fusion, and intra- and inter-fraction motion. Dose calculations for small fields are particularly difficult because of the lack of electronic equilibrium and the greater effect of inhomogeneities within and near the PTV. Going frameless introduces greater setup uncertainties and allows for potentially increased intra- and inter-fraction motion. The increased use of multiple imaging modalities to determine the tumor volume necessitates (deformable) image and contour fusion, and the resulting uncertainties introduced in the image registration process further contribute to overall treatment planning uncertainties. Each of these uncertainties must be quantified and their impact on treatment delivery accuracy understood. If necessary, the uncertainties may then be accounted for during treatment planning, either through techniques to make the uncertainty explicit or by the appropriate addition of PTV margins. Further complicating matters, the statistics of 1-5 fraction SRS treatments differ from traditional margin recipes relying on Poisson statistics. In this session, we will discuss uncertainties introduced during each step of the SRS treatment planning and delivery process and present margin recipes to appropriately account for such uncertainties. Learning Objectives: To understand the major contributors to the total delivery uncertainty in SRS for Gamma Knife, CyberKnife, and linac-based SRS.
Learn the various uncertainties introduced by image fusion, deformable image registration, and contouring
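
When the individual sources are independent, the overall delivery uncertainty is conventionally combined in quadrature before any margin recipe is applied. The source magnitudes below are hypothetical, chosen only to show the arithmetic:

```python
import math

# Hypothetical, independent SRS uncertainty sources (mm, 1 SD each).
sources = {
    "image fusion / registration": 0.8,
    "contouring": 1.0,
    "setup and intrafraction motion": 0.7,
    "dose calculation / mechanical": 0.5,
}

# Quadrature sum: sigma_total = sqrt(sum of sigma_i^2).
total_sd = math.sqrt(sum(s * s for s in sources.values()))
```

Population margin recipes built on top of such sums (e.g. the widely cited 2.5Σ + 0.7σ form, with Σ the systematic and σ the random component) assume conventionally fractionated statistics; as the abstract notes, they do not carry over directly to 1-5 fraction SRS, which is exactly why dedicated recipes are presented in the session.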

  14. mu analysis with real parametric uncertainty

    NASA Technical Reports Server (NTRS)

    Young, Peter M.; Newlin, Matthew P.; Doyle, John C.

    1991-01-01

    The authors give a broad overview, from an LFT (linear fractional transformation)/mu perspective, of some of the theoretical and practical issues associated with robustness in the presence of real parametric uncertainty, with a focus on computation. Recent results on the properties of mu in the mixed case are reviewed, including issues of NP completeness, continuity, computation of bounds, the equivalence of mu and its bounds, and some direct comparisons with Kharitonov-type analysis methods. In addition, some advances in the computational aspects of the problem, including a novel branch and bound algorithm, are briefly presented together with numerical results. The results suggest that while the mixed mu problem may have inherently combinatoric worst-case behavior, practical algorithms with modest computational requirements can be developed for problems of medium size (less than 100 parameters) that are of engineering interest.

  15. Optimization of environmental water purchases with uncertainty

    NASA Astrophysics Data System (ADS)

    Hollinshead, Sarah P.; Lund, Jay R.

    2006-08-01

    Water managers are turning increasingly to market solutions to meet new environmental demands for water in fully allocated systems. This paper presents a three-stage probabilistic optimization model that identifies least cost strategies for staged seasonal water purchases for an environmental water acquisition program given hydrologic, operational, and biological uncertainties. Multistage linear programming is used to minimize the expected cost of long-term, spot, and option water purchases used to meet uncertain environmental demands. Results prescribe the location, timing, and type of optimal water purchases and illustrate how least cost strategies change as information becomes available during the year. Results also provide sensitivity analysis, including shadow values that estimate the expected cost of additional dedicated environmental water. The model's application to California's Environmental Water Account is presented with a discussion of its utility for planning and policy purposes. Model limitations and sensitivity analysis are discussed, as are operational and research recommendations.
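
The staged-purchase logic can be illustrated with a toy two-stage version of the problem: commit to long-term water now at a known price, then cover any shortfall on the spot market, whose price varies by hydrologic scenario; the expected cost is minimized over the first-stage decision. All prices, demands, and probabilities here are invented, and enumeration stands in for the paper's multistage linear program:

```python
# Hypothetical scenarios: (probability, demand in AF, spot price $/AF).
scenarios = [(0.3, 100.0, 50.0),   # wet: low demand, cheap spot water
             (0.5, 200.0, 120.0),  # normal
             (0.2, 300.0, 400.0)]  # dry: high demand, expensive spot water
LONG_TERM_PRICE = 100.0  # $/AF, committed before hydrology is known

def expected_cost(long_term_af):
    """First-stage purchase cost plus expected spot cost of any shortfall."""
    cost = LONG_TERM_PRICE * long_term_af
    for prob, demand, spot in scenarios:
        shortfall = max(0.0, demand - long_term_af)
        cost += prob * spot * shortfall
    return cost

# First-stage decision by enumeration (an LP solver would do this exactly).
best = min(range(0, 301, 10), key=expected_cost)
```

Here the optimum is to buy 200 AF long-term: each AF up to 200 saves more in expected spot cost than its $100 price, while covering the dry scenario beyond that is cheaper to leave to the spot market. The shadow value discussed in the abstract is the derivative of this expected cost with respect to the demand target.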

  16. Calculating Measurement Uncertainties for Mass Spectrometry Data

    NASA Astrophysics Data System (ADS)

    Essex, R. M.; Goldberg, S. A.

    2006-12-01

    A complete and transparent characterization of measurement uncertainty is fundamentally important to the interpretation of analytical results. We have observed that the calculation and reporting of uncertainty estimates for isotopic measurement from a variety of analytical facilities are inconsistent, making it difficult to compare and evaluate data. Therefore, we recommend an approach to uncertainty estimation that has been adopted by both US national metrology facilities and is becoming widely accepted within the analytical community. This approach is outlined in the ISO "Guide to the Expression of Uncertainty in Measurement" (GUM). The GUM approach to uncertainty estimation includes four major steps: 1) Specify the measurand; 2) Identify uncertainty sources; 3) Quantify components by determining the standard uncertainty (u) for each component; and 4) Calculate combined standard uncertainty (u_c) by using established propagation laws to combine the various components. To obtain a desired confidence level, the combined standard uncertainty is multiplied by a coverage factor (k) to yield an expanded uncertainty (U). To be consistent with the GUM principles, it is also necessary to create an uncertainty budget, which is a listing of all the components comprising the uncertainty and their relative contribution to the combined standard uncertainty. In mass spectrometry, Step 1 is normally the determination of an isotopic ratio for a particular element. Step 2 requires the identification of the many potential sources of measurement variability and bias including: gain, baseline, cup efficiency, Schottky noise, counting statistics, CRM uncertainties, yield calibrations, linearity calibrations, run conditions, and filament geometry. Then an equation expressing the relationship of all of the components to the measurement value must be written. To complete Step 3, these potential sources of uncertainty must be characterized (Type A or Type B) and quantified. 
This information
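
Steps 3-4 above reduce to a simple calculation once each component is quantified: combine the standard uncertainties in quadrature (here assuming uncorrelated components with unit sensitivity coefficients) and scale by a coverage factor. The component values are hypothetical:

```python
import math

# Hypothetical uncertainty budget for an isotope-ratio measurement
# (relative standard uncertainties u_i, uncorrelated, sensitivity = 1).
budget = {
    "counting statistics": 2.0e-4,  # Type A
    "gain calibration":    1.5e-4,  # Type B
    "baseline":            0.5e-4,  # Type B
    "CRM certificate":     1.0e-4,  # Type B
}

# Combined standard uncertainty: u_c = sqrt(sum of u_i^2).
u_c = math.sqrt(sum(u * u for u in budget.values()))

# Expanded uncertainty with coverage factor k = 2 (~95 % level).
U = 2.0 * u_c

# The "uncertainty budget": each component's share of u_c^2.
shares = {name: (u * u) / (u_c * u_c) for name, u in budget.items()}
```

Correlated components or non-unit sensitivity coefficients require the full GUM propagation law (partial derivatives and covariance terms), but the budget table, in which the shares sum to 1, is built the same way.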

  17. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    uncertainties of the final risk estimate will be helpful to decision makers to make better informed decisions and attributing this uncertainty to the input parameters helps to identify which parameters are most important when it comes to uncertainty in the final estimate and should therefore deserve additional attention in further research.

  18. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    USGS Publications Warehouse

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. 
The uncertainty analysis may help select appropriate likelihood
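
The MCMC machinery referred to above can be sketched in miniature with a random-walk Metropolis sampler (a much simpler relative of DREAM(zs)), drawing the posterior of a single mean parameter under a Gaussian likelihood with known noise. The data and tuning constants are invented:

```python
import math
import random

def metropolis_mean(data, sigma, n_iter=20000, step=0.5, seed=1):
    """Random-walk Metropolis for the mean of N(theta, sigma^2) data, flat prior."""
    rng = random.Random(seed)

    def log_like(theta):
        return -sum((y - theta) ** 2 for y in data) / (2 * sigma * sigma)

    theta, ll = 0.0, log_like(0.0)
    chain = []
    for _ in range(n_iter):
        prop = theta + rng.gauss(0.0, step)      # symmetric proposal
        ll_prop = log_like(prop)
        if math.log(rng.random()) < ll_prop - ll:  # Metropolis accept/reject
            theta, ll = prop, ll_prop
        chain.append(theta)
    return chain[n_iter // 2:]  # discard the first half as burn-in

data = [2.1, 1.7, 2.4, 2.0, 1.9, 2.3]
posterior = metropolis_mean(data, sigma=0.3)
post_mean = sum(posterior) / len(posterior)
```

For this conjugate toy the posterior mean sits near the sample mean; the point of generalized likelihoods, as in the study above, is that `log_like` is replaced by a function whose residual model admits non-Gaussian, heteroscedastic, correlated errors, while the sampler itself is unchanged.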

  19. The uncertainty principle - A simplified review of the four versions

    NASA Astrophysics Data System (ADS)

    Jijnasu, Vasudeva

    2016-08-01

    The complexity of the historical confusions around different versions of the uncertainty principle, in addition to the increasing technicality of physics in general, has made its affairs predominantly accessible only to specialists. Consequently, the clarity that has dawned upon physicists over the decades regarding quantum uncertainty remains mostly imperceptible for general readers, students, philosophers and even non-expert scientists. In an attempt to weaken this barrier, the article presents a summary of this technical subject, focusing on the prime case of the position-momentum pair, as modestly and informatively as possible. This includes a crisp analysis of the historical as well as of the latest developments. In the process the article provides arguments to show that the usually sidelined version of uncertainty, the intrinsic 'unsharpness' or 'indeterminacy', forms the basis for all the other three versions, and subsequently presents its hard philosophical implications.

  20. Managing the uncertainties of low-level radioactive waste disposal.

    PubMed

    Bullard, C W; Weger, H T; Wagner, J

    1998-08-01

    The disposal of low-level radioactive waste (LLRW) entails financial and safety risks not common to most market commodities. This manifests as debilitating uncertainty regarding future waste volume and disposal technology performance in the market for waste disposal services. Dealing with the publicly perceived risks of LLRW disposal increases the total cost of the technology by an order of magnitude, relative to traditional shallow land burial. Therefore, this analysis first examines five proposed disposal facility designs and quantifies the costs associated with these two important sources of uncertainty. Based upon this analysis, a marketable disposal permit mechanism is proposed and analyzed for the purpose of reducing market uncertainty and thereby facilitating a market solution to the waste disposal problem. In addition to quantifying the costs, the results illustrate the ways in which the design of a technology is influenced by its institutional environment, and vice versa.

  1. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
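
The central claim can be reproduced numerically for the Gaussian (location-scale) case: set the threshold at the nominal 99th percentile using parameters estimated from a small sample, and the expected exceedance (failure) frequency comes out above the nominal 1%. This sketch uses invented settings (n = 10 observations, standard-normal risk factor):

```python
import math
import random

def normal_sf(x):
    """Upper-tail (survival) probability of the standard normal."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

rng = random.Random(42)
Z99 = 2.326      # nominal 99th-percentile z-score
n, reps = 10, 2000

total = 0.0
for _ in range(reps):
    sample = [rng.gauss(0.0, 1.0) for _ in range(n)]
    mu = sum(sample) / n
    sd = math.sqrt(sum((y - mu) ** 2 for y in sample) / (n - 1))
    threshold = mu + Z99 * sd       # control set from estimated parameters
    total += normal_sf(threshold)   # true exceedance prob of that threshold
avg_failure = total / reps          # expected failure frequency
```

The averaged failure frequency lands well above 0.01 (roughly 2-3% for n = 10, by a scaled Student-t argument), illustrating why the paper's approach (1), tightening the nominal level as a function of sample size, is needed to hit the required failure probability.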

  2. Medical Need, Equality, and Uncertainty.

    PubMed

    Horne, L Chad

    2016-10-01

    Many hold that distributing healthcare according to medical need is a requirement of equality. Most egalitarians believe, however, that people ought to be equal on the whole, by some overall measure of well-being or life-prospects; it would be a massive coincidence if distributing healthcare according to medical need turned out to be an effective way of promoting equality overall. I argue that distributing healthcare according to medical need is important for reducing individuals' uncertainty surrounding their future medical needs. In other words, distributing healthcare according to medical need is a natural feature of healthcare insurance; it is about indemnity, not equality.

  3. Uncertainty Relation and Inseparability Criterion

    NASA Astrophysics Data System (ADS)

    Goswami, Ashutosh K.; Panigrahi, Prasanta K.

    2017-02-01

    We investigate the Peres-Horodecki positive partial transpose criterion in the context of conserved quantities and derive a condition of inseparability for a composite bipartite system depending only on the dimensions of its subsystems, which leads to a bi-linear entanglement witness for the two-qubit system. A separability inequality, derived from the generalized Schrödinger-Robertson uncertainty relation with suitably chosen operators, proves to be stronger than the bi-linear entanglement witness operator. In the case of mixed density matrices, it identically distinguishes the separable and non-separable Werner states.
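
For reference, the Schrödinger-Robertson relation underlying the separability inequality reads, for observables A and B,

```latex
\sigma_A^2 \, \sigma_B^2 \;\ge\;
\left| \tfrac{1}{2} \langle \{A, B\} \rangle - \langle A \rangle \langle B \rangle \right|^2
+ \left| \tfrac{1}{2i} \langle [A, B] \rangle \right|^2 ,
```

which reduces to the ordinary Robertson bound when the first (covariance) term is dropped; separability criteria of this type follow by evaluating the bound on partially transposed states.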

  4. Development and Uncertainty Analysis of an Automatic Testing System for Diffusion Pump Performance

    NASA Astrophysics Data System (ADS)

    Zhang, S. W.; Liang, W. S.; Zhang, Z. J.

    A newly developed automatic testing system used in the laboratory for diffusion pump performance measurement is introduced in this paper. By using two optical fiber sensors to indicate the oil level in a glass buret and a needle valve driven by a stepper motor to regulate the pressure in the test dome, the system can automatically test the ultimate pressure and pumping speed of a diffusion pump in accordance with ISO 1608. Uncertainty analysis theory is applied to the pumping speed measurement results. Based on the test principle and system structure, the contribution of each component and test step to the final uncertainty is studied. Using the differential method, a mathematical model for the systematic uncertainty transfer function is established. Finally, by case study, the combined uncertainties of manual operation and automatic operation are compared with each other (6.11% and 5.87%, respectively). The reasonableness and practicality of this newly developed automatic testing system are demonstrated.

  5. IAEA CRP on HTGR Uncertainty Analysis: Benchmark Definition and Test Cases

    SciTech Connect

    Gerhard Strydom; Frederik Reitsma; Hans Gougar; Bismark Tyobeka; Kostadin Ivanov

    2012-11-01

    Uncertainty and sensitivity studies are essential elements of the reactor simulation code verification and validation process. Although several international uncertainty quantification activities have