Sample records for initial condition uncertainty

  1. Oceanic ensemble forecasting in the Gulf of Mexico: An application to the case of the Deep Water Horizon oil spill

    NASA Astrophysics Data System (ADS)

    Khade, Vikram; Kurian, Jaison; Chang, Ping; Szunyogh, Istvan; Thyng, Kristen; Montuoro, Raffaele

    2017-05-01

    This paper demonstrates the potential of ocean ensemble forecasting in the Gulf of Mexico (GoM). The Bred Vector (BV) technique with a one-week rescaling frequency is implemented on a 9 km resolution version of the Regional Ocean Modelling System (ROMS). Numerical experiments are carried out using the HYCOM analysis products to define the initial conditions and the lateral boundary conditions. The growth rate of the forecast uncertainty is estimated to be about 10% of the initial amplitude per week. By carrying out ensemble forecast experiments with and without perturbed surface forcing, it is demonstrated that in the coastal regions accounting for uncertainties in the atmospheric forcing is more important than accounting for uncertainties in the ocean initial conditions. In the Loop Current region, the initial condition uncertainties are the dominant source of the forecast uncertainty. The root-mean-square error of the Lagrangian track forecasts at the 15-day forecast lead time can be reduced by about 10-50 km by using the ensemble mean Eulerian forecast of the oceanic flow, instead of the single-initial-condition Eulerian forecast, for the computation of the tracks.
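    The breeding cycle the abstract describes (run a perturbed and an unperturbed forecast side by side, then rescale the difference back to its initial amplitude) can be sketched on a toy system. The sketch below uses the Lorenz-63 equations as a stand-in for ROMS; the amplitude, step size, and cycle lengths are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def step(state, dt=0.01):
    # One fourth-order Runge-Kutta step.
    k1 = lorenz63(state)
    k2 = lorenz63(state + 0.5 * dt * k1)
    k3 = lorenz63(state + 0.5 * dt * k2)
    k4 = lorenz63(state + dt * k3)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def breed(state, amplitude=0.1, n_cycles=20, steps_per_cycle=100, rng=None):
    """One bred vector: perturb, integrate control and perturbed runs,
    take the difference, rescale it to the initial amplitude, repeat."""
    if rng is None:
        rng = np.random.default_rng(0)
    bv = rng.standard_normal(3)
    bv *= amplitude / np.linalg.norm(bv)
    for _ in range(n_cycles):
        ctrl, pert = state, state + bv
        for _ in range(steps_per_cycle):
            ctrl, pert = step(ctrl), step(pert)
        bv = pert - ctrl
        bv *= amplitude / np.linalg.norm(bv)  # rescale to the breeding amplitude
        state = ctrl
    return state, bv
```

    After a few cycles the bred vector aligns with the locally fastest-growing error direction, which is what the GoM experiments use as the initial-condition perturbation.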

  2. Assimilation of water temperature and discharge data for ensemble water temperature forecasting

    NASA Astrophysics Data System (ADS)

    Ouellet-Proulx, Sébastien; Chimi Chiadjeu, Olivier; Boucher, Marie-Amélie; St-Hilaire, André

    2017-11-01

    Recent work demonstrated the value of water temperature forecasts for improving water resources allocation and highlighted the importance of quantifying their uncertainty adequately. In this study, we perform a multisite cascading ensemble assimilation of discharge and water temperature on the Nechako River (Canada) using particle filters. Hydrological and thermal initial conditions were provided to a rainfall-runoff model coupled to a thermal module, using ensemble meteorological forecasts as inputs to produce 5-day ensemble thermal forecasts. Results show good performance of the particle filters, with the accuracy of initial conditions improved by more than 65% compared to simulations without data assimilation for both the hydrological and the thermal component. All thermal forecasts returned continuous ranked probability scores under 0.8 °C when using a set of 40 initial conditions and meteorological forecasts comprising 20 members. A greater contribution of the initial conditions to the total uncertainty of the system is observed for 1-day forecasts (mean ensemble spread = 1.1 °C) compared to the meteorological forcings (mean ensemble spread = 0.6 °C). The inclusion of meteorological uncertainty is critical to maintain reliable forecasts and proper ensemble spread for lead times of 2 days and more. This work demonstrates the ability of the particle filters to properly update the initial conditions of a coupled hydrological and thermal model and offers insights regarding the contribution of two major sources of uncertainty to the overall uncertainty in thermal forecasts.
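    The continuous ranked probability score used to evaluate these ensembles has a closed kernel form for a finite ensemble, CRPS = E|X − y| − ½E|X − X′|. The thresholds and member counts above are the paper's; the function below is a generic estimator, not the authors' code.

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS of an ensemble forecast against a scalar observation,
    using the kernel (energy-score) form CRPS = E|X - y| - 0.5 * E|X - X'|."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return term1 - term2
```

    For a single deterministic member the score reduces to the absolute error, which is why CRPS is often read as a probabilistic generalization of mean absolute error (here in °C).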

  3. Predicting Ice Sheet and Climate Evolution at Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heimbach, Patrick

    2016-02-06

    A main research objective of PISCEES is the development of formal methods for quantifying uncertainties in ice sheet modeling. Uncertainties in simulating and projecting mass loss from the polar ice sheets arise primarily from initial conditions, surface and basal boundary conditions, and model parameters. In general terms, two main chains of uncertainty propagation may be identified: 1. inverse propagation of observation and/or prior uncertainties onto posterior control variable uncertainties; 2. forward propagation of prior or posterior control variable uncertainties onto those of target output quantities of interest (e.g., climate indices or ice sheet mass loss). A related goal is the development of computationally efficient methods for producing initial conditions for an ice sheet that are close to available present-day observations and essentially free of artificial model drift, which is required in order to be useful for model projections (the “initialization problem”). To be of maximum value, such optimal initial states should be accompanied by useful uncertainty estimates that account for the different sources of uncertainty, as well as the degree to which the optimum state is constrained by available observations. The PISCEES proposal outlined two approaches for quantifying uncertainties. The first targets the full exploration of the uncertainty in model projections with sampling-based methods and a workflow managed by DAKOTA (the main delivery vehicle for software developed under QUEST). This is feasible for low-dimensional problems, e.g., those with a handful of global parameters to be inferred. This approach can benefit from derivative/adjoint information, but does not require it, which is why it is often referred to as “non-intrusive”. The second approach makes heavy use of derivative information from model adjoints to address quantifying uncertainty in high dimensions (e.g., basal boundary conditions in ice sheet models). 
The use of local gradient, or Hessian, information (i.e., second derivatives of the cost function) requires additional code development and implementation, and is thus often referred to as an “intrusive” approach. Within PISCEES, MIT has been tasked to develop methods for derivative-based UQ, the “intrusive” approach discussed above. These methods rely on the availability of first (adjoint) and second (Hessian) derivative code, developed through intrusive methods such as algorithmic differentiation (AD). While representing a significant burden in terms of code development, derivative-based UQ is able to cope with very high-dimensional uncertainty spaces. That is, unlike sampling methods (all variations of Monte Carlo), the computational burden is independent of the dimension of the uncertainty space. This is a significant advantage for spatially distributed uncertainty fields, such as three-dimensional initial conditions, three-dimensional parameter fields, or two-dimensional surface and basal boundary conditions. Importantly, uncertainty fields for ice sheet models generally fall into this category.

  4. A new Method for the Estimation of Initial Condition Uncertainty Structures in Mesoscale Models

    NASA Astrophysics Data System (ADS)

    Keller, J. D.; Bach, L.; Hense, A.

    2012-12-01

    The estimation of fast-growing error modes of a system is a key interest of ensemble data assimilation when assessing uncertainty in initial conditions. Over the last two decades three methods (and variations of these methods) have evolved for global numerical weather prediction models: the ensemble Kalman filter, singular vectors, and breeding of growing modes (or now ensemble transform). While the former incorporates a priori model error information and observation error estimates to determine ensemble initial conditions, the latter two techniques directly address the error structures associated with Lyapunov vectors. However, in global models these structures are mainly associated with transient global wave patterns. When assessing initial condition uncertainty in mesoscale limited area models, several problems regarding the aforementioned techniques arise: (a) additional sources of uncertainty on the smaller scales contribute to the error and (b) error structures from the global scale may quickly move through the model domain (depending on the size of the domain). To address the latter problem, perturbation structures from global models are often included in the mesoscale predictions as perturbed boundary conditions. However, the initial perturbations (when used) are often generated with a variant of an ensemble Kalman filter, which does not necessarily focus on the large-scale error patterns. In the framework of the European regional reanalysis project of the Hans-Ertel-Center for Weather Research we use a mesoscale model with an implemented nudging data assimilation scheme, which does not support ensemble data assimilation at all. In preparation for an ensemble-based regional reanalysis and for the estimation of three-dimensional atmospheric covariance structures, we implemented a new method for the assessment of fast-growing error modes in mesoscale limited area models. The so-called self-breeding method is a development of the breeding of growing modes technique. 
Initial perturbations are integrated forward for a short time period and then rescaled and added to the initial state again. Iterating this rapid breeding cycle provides estimates of the initial uncertainty structure (or local Lyapunov vectors) for a given norm. To prevent all ensemble perturbations from converging towards the leading local Lyapunov vector, we apply an ensemble transform variant to orthogonalize the perturbations in the sub-space spanned by the ensemble. By choosing different kinds of norms to measure perturbation growth, this technique allows for estimating uncertainty patterns targeted at specific sources of errors (e.g. convection, turbulence). With case study experiments we show applications of the self-breeding method for different sources of uncertainty and different horizontal scales.
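    The orthogonalization step that keeps the bred perturbations from collapsing onto the leading local Lyapunov vector can be approximated by a QR factorization of the perturbation matrix, with each column then rescaled in a chosen (possibly weighted) norm. This is a sketch under that assumption; the paper's actual ensemble transform may differ.

```python
import numpy as np

def orthogonalize(perts, amplitude):
    """Orthonormalize the columns of `perts` (state_dim x n_ens) via QR,
    then rescale each to the breeding amplitude, so the perturbations
    span the ensemble sub-space instead of all aligning."""
    q, _ = np.linalg.qr(perts)
    return amplitude * q[:, : perts.shape[1]]

def rescale(pert, amplitude, weights=None):
    """Rescale a perturbation in a weighted Euclidean norm; `weights`
    selects the error source the norm targets (e.g. moisture variables
    when breeding convection-related uncertainty)."""
    w = np.ones_like(pert) if weights is None else weights
    return pert * (amplitude / np.sqrt(np.sum(w * pert**2)))
```

    Choosing `weights` amounts to choosing the norm in which growth is measured, which is how the abstract's targeting of specific error sources would enter such a scheme.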

  5. Effects of Moist Convection on Hurricane Predictability

    NASA Technical Reports Server (NTRS)

    Zhang, Fuqing; Sippel, Jason A.

    2008-01-01

    This study exemplifies inherent uncertainties in deterministic prediction of hurricane formation and intensity. Such uncertainties could ultimately limit the predictability of hurricanes at all time scales. In particular, this study highlights the predictability limit due to the effects on moist convection of initial-condition errors with amplitudes far smaller than those of any observation or analysis system. Not only can small and arguably unobservable differences in the initial conditions result in different routes to tropical cyclogenesis, but they can also determine whether or not a tropical disturbance will significantly develop. The details of how the initial vortex is built can depend on chaotic interactions of mesoscale features, such as cold pools from moist convection, whose timing and placement may significantly vary with minute initial differences. Inherent uncertainties in hurricane forecasts illustrate the need for developing advanced ensemble prediction systems to provide event-dependent probabilistic forecasts and risk assessment.

  6. Stormwater quality modelling in combined sewers: calibration and uncertainty analysis.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

    Estimating the level of uncertainty in urban stormwater quality models is vital for their utilization. This paper presents the results of the application of a Markov Chain Monte Carlo method, based on Bayesian theory, for the calibration and uncertainty analysis of a stormwater quality model commonly used in available software. The tested model uses a hydrologic/hydrodynamic scheme to estimate the accumulation, erosion, and transport of pollutants on surfaces and in sewers. It was calibrated for four different initial conditions of in-sewer deposits. Calibration results showed large variability in the model's responses as a function of the initial conditions. They demonstrated that the model's predictive capacity is very low.
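    A Markov Chain Monte Carlo calibration of this kind can be sketched with a random-walk Metropolis sampler; the log-posterior below is a placeholder for the stormwater model's likelihood plus prior, not the authors' implementation.

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=5000, step=0.1, rng=None):
    """Random-walk Metropolis sampler over model parameters.
    `log_post` is the unnormalized log-posterior (log-likelihood + log-prior)."""
    if rng is None:
        rng = np.random.default_rng(0)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.shape)
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)
```

    The spread of the post-burn-in chain is the calibrated parameter uncertainty; repeating the calibration for each of the four in-sewer deposit initial conditions would expose the variability the abstract reports.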

  7. Uncertainty Estimation in Tsunami Initial Condition From Rapid Bayesian Finite Fault Modeling

    NASA Astrophysics Data System (ADS)

    Benavente, R. F.; Dettmer, J.; Cummins, P. R.; Urrutia, A.; Cienfuegos, R.

    2017-12-01

    It is well known that kinematic rupture models for a given earthquake can present discrepancies even when similar datasets are employed in the inversion process. While quantifying this variability can be critical when making early estimates of the earthquake and triggered tsunami impact, "most likely models" are normally used for this purpose. In this work, we quantify the uncertainty of the tsunami initial condition for the great Illapel earthquake (Mw = 8.3, 2015, Chile). We focus on utilizing data and inversion methods that are suitable for rapid source characterization yet provide meaningful and robust results. Rupture models from teleseismic body and surface waves as well as the W-phase are derived and accompanied by Bayesian uncertainty estimates from linearized inversion under positivity constraints. We show that robust and consistent features of the rupture kinematics appear when working within this probabilistic framework. Moreover, by using static dislocation theory, we translate the probabilistic slip distributions into seafloor deformation, which we interpret as a tsunami initial condition. After considering uncertainty, our probabilistic seafloor deformation models obtained from different data types appear consistent with each other, providing meaningful results. We also show that selecting just a single "representative" solution from the ensemble of initial conditions for tsunami propagation may lead to overestimating the information content in the data. Our results suggest that rapid, probabilistic rupture models can play a significant role during emergency response by providing robust information about the extent of the disaster.

  8. Autonomous choices among deterministic evolution-laws as source of uncertainty

    NASA Astrophysics Data System (ADS)

    Trujillo, Leonardo; Meyroneinc, Arnaud; Campos, Kilver; Rendón, Otto; Sigalotti, Leonardo Di G.

    2018-03-01

    We provide evidence of an extreme form of sensitivity to initial conditions in a family of one-dimensional self-ruling dynamical systems. We prove that some hyperchaotic sequences are closed-form expressions of the orbits of these pseudo-random dynamical systems. Each chaotic system in this family exhibits a sensitivity to initial conditions that encompasses the sequence of choices of the evolution rule in some collection of maps. This opens a possibility to extend current theories of complex behaviors on the basis of intrinsic uncertainty in deterministic chaos.

  9. Local Sensitivity of Predicted CO 2 Injectivity and Plume Extent to Model Inputs for the FutureGen 2.0 site

    DOE PAGES

    Zhang, Z. Fred; White, Signe K.; Bonneville, Alain; ...

    2014-12-31

    Numerical simulations have been used for estimating CO2 injectivity, CO2 plume extent, pressure distribution, and Area of Review (AoR), and for the design of CO2 injection operations and monitoring networks for the FutureGen project. The simulation results are affected by uncertainties associated with numerous input parameters, the conceptual model, initial and boundary conditions, and factors related to injection operations. Furthermore, the uncertainties in the simulation results also vary in space and time. The key need is to identify those uncertainties that critically impact the simulation results and quantify their impacts. We introduce an approach to determine the local sensitivity coefficient (LSC), defined as the response of the output in percent, to rank the importance of model inputs on outputs. The uncertainty of an input with higher sensitivity has a larger impact on the output. The LSC is scalable by the error of an input parameter. The composite sensitivity of an output to a subset of inputs can be calculated by summing the individual LSC values. We propose a local sensitivity coefficient method and apply it to the FutureGen 2.0 site in Morgan County, Illinois, USA, to investigate the sensitivity of input parameters and initial conditions. The conceptual model for the site consists of 31 layers, each of which has a unique set of input parameters. The sensitivity of 11 parameters for each layer and 7 inputs as initial conditions is then investigated. For CO2 injectivity and plume size, about half of the uncertainty is due to only 4 or 5 of the 348 inputs, and 3/4 of the uncertainty is due to about 15 of the inputs. The initial conditions and the properties of the injection layer and its neighbour layers contribute most of the sensitivity. Overall, the simulation outputs are very sensitive to only a small fraction of the inputs. 
However, the parameters that are important for controlling CO2 injectivity are not the same as those controlling the plume size. The three most sensitive inputs for injectivity were the horizontal permeability of Mt Simon 11 (the injection layer), the initial fracture-pressure gradient, and the residual aqueous saturation of Mt Simon 11, while those for the plume area were the initial salt concentration, the initial pressure, and the initial fracture-pressure gradient. The advantages of requiring only a single set of simulation results, scalability to the proper parameter errors, and easy calculation of the composite sensitivities make this approach very cost-effective for estimating AoR uncertainty and guiding cost-effective site characterization, injection well design, and monitoring network design for CO2 storage projects.
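    The local sensitivity coefficient described above (the percent response of an output to a small relative perturbation of one input, with composite sensitivities obtained by summing) can be sketched with one-sided finite differences. The model function here is a hypothetical stand-in for the reservoir simulator, and the exact normalization is an assumption consistent with the abstract's definition.

```python
import numpy as np

def lsc(model, inputs, index, rel_step=0.01):
    """LSC: output response (in percent) to a 1% perturbation of input
    `index`, all other inputs held at base values. The 0.01/rel_step
    factor normalizes to a 1% perturbation, so the result is scalable
    to an input's actual error."""
    base = model(inputs)
    pert = inputs.copy()
    pert[index] *= 1.0 + rel_step
    return 100.0 * (model(pert) - base) / base * (0.01 / rel_step)

def composite_lsc(model, inputs, indices, rel_step=0.01):
    # Composite sensitivity of the output to a subset of inputs.
    return sum(abs(lsc(model, inputs, i, rel_step)) for i in indices)
```

    One forward run per input suffices, which is the single-set-of-simulations advantage the abstract highlights over sampling-based global sensitivity analysis.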

  10. Dynamical attribution of oceanic prediction uncertainty in the North Atlantic: application to the design of optimal monitoring systems

    NASA Astrophysics Data System (ADS)

    Sévellec, Florian; Dijkstra, Henk A.; Drijfhout, Sybren S.; Germe, Agathe

    2017-11-01

    In this study, the relation between two approaches to assess ocean predictability on interannual to decadal time scales is investigated. The first, pragmatic approach consists of sampling the initial condition uncertainty and assessing the predictability through the divergence of this ensemble in time. The second approach is provided by a theoretical framework to determine error growth by estimating optimal linear growing modes. In this paper, it is shown that under the assumption of linearized dynamics and normal distributions of the uncertainty, the exact quantitative spread of the ensemble can be determined from the theoretical framework. This spread is at least an order of magnitude less expensive to compute than the approximate solution given by the pragmatic approach. This result is applied to a state-of-the-art Ocean General Circulation Model to assess the predictability in the North Atlantic of four typical oceanic metrics: the strength of the Atlantic Meridional Overturning Circulation (AMOC), the intensity of its heat transport, the two-dimensional spatially-averaged Sea Surface Temperature (SST) over the North Atlantic, and the three-dimensional spatially-averaged temperature in the North Atlantic. For all tested metrics except SST, ~75% of the total uncertainty on interannual time scales can be attributed to oceanic initial condition uncertainty rather than atmospheric stochastic forcing. The theoretical method also provides the sensitivity pattern to the initial condition uncertainty, allowing for targeted measurements to improve the skill of the prediction. It is suggested that a relatively small fleet of autonomous underwater vehicles could reduce the uncertainty in AMOC strength prediction by 70% for 1-5 year lead times.
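    The equivalence this study exploits is that, under linearized dynamics and Gaussian initial uncertainty, the ensemble spread of a metric g follows exactly from covariance propagation, var(gᵀx) = gᵀ M C₀ Mᵀ g, with no ensemble needed. This can be checked on a toy linear model; the propagator, covariance, and metric below are illustrative assumptions standing in for the tangent-linear OGCM.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linearized dynamics (propagator M) and initial-condition covariance C0;
# both are toy stand-ins for the tangent-linear model of an OGCM.
M = np.array([[0.9, 0.4], [0.0, 0.8]])
C0 = np.diag([1.0, 0.25])

# Theoretical spread of a metric g'x after one step: var = g' M C0 M' g.
g = np.array([1.0, 1.0])  # the "metric" (e.g. a spatially averaged temperature)
var_theory = g @ M @ C0 @ M.T @ g

# Pragmatic approach: sample the IC uncertainty and propagate the ensemble.
ens = rng.multivariate_normal(np.zeros(2), C0, size=200_000) @ M.T
var_ensemble = np.var(ens @ g)
```

    The two numbers agree up to sampling noise, while the theoretical value costs a handful of matrix products instead of hundreds of thousands of model integrations, which is the order-of-magnitude saving the abstract refers to.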

  11. Uncertainty analysis for the steady-state flows in a dual throat nozzle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Q.-Y.; Gottlieb, David; Hesthaven, Jan S.

    2005-03-20

    It is well known that the steady state of an isentropic flow in a dual-throat nozzle with equal throat areas is not unique. In particular there is a possibility that the flow contains a shock wave, whose location is determined solely by the initial condition. In this paper, we consider cases with uncertainty in this initial condition and use generalized polynomial chaos methods to study the steady-state solutions for stochastic initial conditions. Special interest is given to the statistics of the shock location. The polynomial chaos (PC) expansion modes are shown to be smooth functions of the spatial variable x, although each solution realization is discontinuous in the spatial variable x. When the variance of the initial condition is small, the probability density function of the shock location is computed with high accuracy. Otherwise, many terms are needed in the PC expansion to produce reasonable results due to the slow convergence of the PC expansion, caused by non-smoothness in random space.
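    The polynomial chaos expansion referred to here projects a quantity of interest onto orthogonal polynomials of the random input; for a Gaussian initial condition the natural basis is the probabilists' Hermite polynomials, with coefficients c_n = E[f(ξ)He_n(ξ)]/n!. A sketch using Gauss-Hermite quadrature follows; the quantity of interest is a placeholder, not the nozzle solution.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def pc_coefficients(qoi, order, n_quad=40):
    """Project qoi(xi), xi ~ N(0, 1), onto probabilists' Hermite
    polynomials He_n: c_n = E[qoi(xi) * He_n(xi)] / n!
    (the n! is the squared norm E[He_n^2])."""
    x, w = He.hermegauss(n_quad)   # nodes/weights for weight exp(-x^2/2)
    w = w / np.sqrt(2.0 * np.pi)   # normalize to the standard normal pdf
    return np.array([
        np.sum(w * qoi(x) * He.hermeval(x, [0.0] * n + [1.0])) / factorial(n)
        for n in range(order + 1)
    ])
```

    For a smooth quantity of interest the coefficients decay rapidly; for a discontinuous one (e.g. `np.sign`, mimicking the dependence of the solution on the shock position) they decay slowly, which is the convergence issue the abstract notes for large initial-condition variance.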

  12. On the generation of climate model ensembles

    NASA Astrophysics Data System (ADS)

    Haughton, Ned; Abramowitz, Gab; Pitman, Andy; Phipps, Steven J.

    2014-10-01

    Climate model ensembles are used to estimate uncertainty in future projections, typically by interpreting the ensemble distribution for a particular variable probabilistically. There are, however, different ways to produce climate model ensembles that yield different results, and therefore different probabilities for a future change in a variable. Perhaps equally importantly, there are different approaches to interpreting the ensemble distribution that lead to different conclusions. Here we use a reduced-resolution climate system model to compare three common ways to generate ensembles: initial conditions perturbation, physical parameter perturbation, and structural changes. Despite these three approaches conceptually representing very different categories of uncertainty within a modelling system, when comparing simulations to observations of surface air temperature they can be very difficult to separate. Using the twentieth century CMIP5 ensemble for comparison, we show that initial conditions ensembles, in theory representing internal variability, significantly underestimate observed variance. Structural ensembles, perhaps less surprisingly, exhibit over-dispersion in simulated variance. We argue that future climate model ensembles may need to include parameter or structural perturbation members in addition to perturbed initial conditions members to ensure that they sample uncertainty due to internal variability more completely. We note that where ensembles are over- or under-dispersive, such as for the CMIP5 ensemble, estimates of uncertainty need to be treated with care.

  13. Overall uncertainty study of the hydrological impacts of climate change for a Canadian watershed

    NASA Astrophysics Data System (ADS)

    Chen, Jie; Brissette, François P.; Poulin, Annie; Leconte, Robert

    2011-12-01

    General circulation models (GCMs) and greenhouse gas emissions scenarios (GGES) are generally considered to be the two major sources of uncertainty in quantifying climate change impacts on hydrology. Other sources of uncertainty have been given less attention. This study considers overall uncertainty by combining results from an ensemble of two GGES, six GCMs, five GCM initial conditions, four downscaling techniques, three hydrological model structures, and 10 sets of hydrological model parameters. Each climate projection is equally weighted to predict the hydrology of a Canadian watershed for the 2081-2100 horizon. The results show that the choice of GCM is consistently a major contributor to uncertainty. However, other sources, such as the choice of downscaling method and the GCM initial conditions, contribute comparable or even larger uncertainty for some hydrological variables. Uncertainties linked to GGES and the hydrological model structure are somewhat less than those related to GCMs and downscaling techniques. Uncertainty due to hydrological model parameter selection makes the least important contribution among all the variables considered. Overall, this research underlines the importance of adequately covering all sources of uncertainty; a failure to do so may result in moderately to severely biased climate change impact studies. Results further indicate that the major contributors to uncertainty vary depending on the hydrological variables selected, and that the methodology presented in this paper is successful at identifying the key sources of uncertainty to consider for a climate change impact study.

  14. Constraining the inferred paleohydrologic evolution of a deep unsaturated zone in the Amargosa Desert

    USGS Publications Warehouse

    Walvoord, Michelle Ann; Stonestrom, David A.; Andraski, Brian J.; Striegl, Robert G.

    2004-01-01

    Natural flow regimes in deep unsaturated zones of arid interfluvial environments are rarely in hydraulic equilibrium with near-surface boundary conditions imposed by present-day plant–soil–atmosphere dynamics. Nevertheless, assessments of water resources and contaminant transport require realistic estimates of gas, water, and solute fluxes under past, present, and projected conditions. Multimillennial transients that are captured in current hydraulic, chemical, and isotopic profiles can be interpreted to constrain alternative scenarios of paleohydrologic evolution following climatic and vegetational shifts from pluvial to arid conditions. However, interpreting profile data with numerical models presents formidable challenges in that boundary conditions must be prescribed throughout the entire Holocene, when we have at most a few decades of actual records. Models of profile development at the Amargosa Desert Research Site include substantial uncertainties from imperfectly known initial and boundary conditions when simulating flow and solute transport over millennial timescales. We show how multiple types of profile data, including matric potentials, porewater Cl− concentrations, and δD and δ18O values, can be used in multiphase heat, flow, and transport models to expose and reduce uncertainty in paleohydrologic reconstructions. Results indicate that a dramatic shift in the near-surface water balance occurred approximately 16,000 yr ago, but that transitions in precipitation, temperature, and vegetation were not necessarily synchronous. The timing of the hydraulic transition imparts the largest uncertainty to model-predicted contemporary fluxes. In contrast, the uncertainties associated with initial (late Pleistocene) conditions and boundary conditions during the Holocene impart only small uncertainties to model-predicted contemporaneous fluxes.

  15. Deterministic physical systems under uncertain initial conditions: the case of maximum entropy applied to projectile motion

    NASA Astrophysics Data System (ADS)

    Montecinos, Alejandra; Davis, Sergio; Peralta, Joaquín

    2018-07-01

    The kinematics and dynamics of deterministic physical systems have been a foundation of our understanding of the world since Galileo and Newton. For real systems, however, uncertainty is largely present via external forces such as friction or lack of precise knowledge about the initial conditions of the system. In this work we focus on the latter case and describe the use of inference methodologies in solving the statistical properties of classical systems subject to uncertain initial conditions. In particular we describe the application of the formalism of maximum entropy (MaxEnt) inference to the problem of projectile motion, given information about the average horizontal range over many realizations. By using MaxEnt we can invert the problem and use the provided information on the average range to reduce the original uncertainty in the initial conditions. Also, additional insight into the initial condition's probabilities, and the projectile path distribution itself, can be achieved based on the value of the average horizontal range. The wide applicability of this procedure, as well as its ease of use, reveals a useful tool with which to revisit a large number of physics problems, from classrooms to frontier research.
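    The MaxEnt inversion sketched in this abstract (constrain the distribution of initial conditions by the observed average range) yields an exponential-family form p(v) ∝ exp(−λR(v)), with the multiplier λ fixed by the range constraint. Below is a discretized sketch with a fixed 45° launch angle and a uniform prior over a speed grid; the grid bounds and target range are illustrative assumptions.

```python
import numpy as np

G, THETA = 9.81, np.pi / 4          # gravity (m/s^2), fixed 45-deg launch angle
v = np.linspace(1.0, 50.0, 2000)    # discretized launch speeds, uniform prior
R = v**2 * np.sin(2 * THETA) / G    # projectile range for each speed

def mean_range(lam):
    """Mean range under the MaxEnt weights p(v) ~ exp(-lam * R(v))."""
    e = -lam * R
    w = np.exp(e - e.max())          # shift the exponent for numerical stability
    return np.sum(w * R) / np.sum(w)

def maxent_dist(target):
    """Solve for the multiplier lam by bisection (mean_range is strictly
    decreasing in lam), then return the MaxEnt distribution on the grid."""
    lo, hi = -1.0, 1.0
    while mean_range(lo) < target:
        lo *= 2.0
    while mean_range(hi) > target:
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_range(mid) > target:
            lo = mid
        else:
            hi = mid
    e = -0.5 * (lo + hi) * R
    w = np.exp(e - e.max())
    return w / np.sum(w)
```

    The resulting distribution over launch speeds (and hence over projectile paths) is the least-committal one consistent with the observed average range, which is the sense in which the extra information reduces the original initial-condition uncertainty.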

  16. A probabilistic drought forecasting framework: A combined dynamical and statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Hongxiang; Moradkhani, Hamid; Zarekarizi, Mahkameh

    In order to improve drought forecasting skill, this study develops a probabilistic drought forecasting framework comprised of dynamical and statistical modeling components. The novelty of this study is the use of data assimilation to quantify initial condition uncertainty with Monte Carlo ensemble members, rather than relying entirely on the hydrologic model or land surface model to generate a single deterministic initial condition, as currently implemented in operational drought forecasting systems. Next, the initial condition uncertainty is quantified through data assimilation and coupled with a newly developed probabilistic drought forecasting model using a copula function. The initial conditions at each forecast start date are sampled from the data assimilation ensembles for forecast initialization. Finally, seasonal drought forecasting products are generated with the updated initial conditions. This study introduces the theory behind the proposed drought forecasting system, with an application to the Columbia River Basin, Pacific Northwest, United States. Results from both synthetic and real case studies suggest that the proposed drought forecasting system significantly improves seasonal drought forecasting skill and can facilitate state drought preparation and declaration at least three months before the official state drought declaration.

  17. Effect of initial conditions of a catchment on seasonal streamflow prediction using ensemble streamflow prediction (ESP) technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar; Zammit, Christian; Hreinsson, Einar; Woods, Ross; Clark, Martyn; Hamlet, Alan

    2013-04-01

    Increased access to water is a key pillar of the New Zealand government's plan for economic growth. Variable climatic conditions, coupled with market drivers and increased demand on water resources, result in critical decisions being made by water managers on the basis of climate and streamflow forecasts. Because many of these decisions have serious economic implications, accurate forecasts of climate and streamflow are of paramount importance (e.g. for irrigated agriculture and electricity generation). New Zealand currently does not have a centralized, comprehensive, and state-of-the-art system in place for providing operational seasonal to interannual streamflow forecasts to guide water resources management decisions. As a pilot effort, we implement and evaluate an experimental ensemble streamflow forecasting system for the Waitaki and Rangitata River basins on New Zealand's South Island using a hydrologic simulation model (TopNet) and the familiar ensemble streamflow prediction (ESP) paradigm for estimating forecast uncertainty. To provide a comprehensive database for evaluation of the forecasting system, a set of retrospective model states simulated by the hydrologic model on the first day of each month were first archived for 1972-2009. Then, using the hydrologic simulation model, each of these historical model states was paired with the retrospective temperature and precipitation time series from each historical water year to create a database of retrospective hindcasts. Using the resulting database, the relative importance of initial state variables (such as soil moisture and snowpack) as fundamental drivers of uncertainty in the forecasts was evaluated for different seasons and lead times. The analysis indicates that the sensitivity of flow forecasts to initial condition uncertainty depends on the hydrological regime and the season of the forecast. However, initial conditions do not have a large impact on seasonal flow uncertainties for snow-dominated catchments. 
    Further analysis indicates that this result remains valid when the hindcast database is conditioned on ENSO classification. As a result, hydrological forecasts based on the ESP technique, in which present initial conditions are combined with historical forcing data, may be plausible for New Zealand catchments.
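    The ESP paradigm described above can be sketched as follows: the current model state is paired with the meteorological forcing from each archived historical year, yielding one ensemble member per year. The `run_model` function below is a hypothetical stand-in for a hydrologic model such as TopNet, with a toy linear response.

```python
def run_model(state, forcing):
    # Placeholder hydrologic model: flow is a simple function of
    # storage (the initial condition) plus the forcing at each step.
    return [state["storage"] * 0.1 + f for f in forcing]

def esp_forecast(initial_state, historical_forcings):
    """Ensemble streamflow prediction: one flow trace per historical
    forcing year, all starting from the same present-day state."""
    return {year: run_model(initial_state, forcing)
            for year, forcing in historical_forcings.items()}

state = {"storage": 50.0}                        # current initial condition
forcings = {1972: [1.0, 2.0], 1973: [0.5, 1.5]}  # toy historical forcing traces
ensemble = esp_forecast(state, forcings)
print(len(ensemble))  # one ensemble member per historical year
```

    The spread across members then reflects forcing uncertainty given fixed initial conditions, which is exactly the quantity the hindcast database above is designed to evaluate.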

  18. Role of Perturbing Ocean Initial Condition in Simulated Regional Sea Level Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Aixue; Meehl, Gerald; Stammer, Detlef

    Multiple lines of observational evidence indicate that the global climate has been getting warmer since the early 20th century. This warmer climate has led to a global mean sea level rise of about 18 cm during the 20th century, and over 6 cm for the first 15 years of the 21st century. Regionally the sea level rise is not uniform due in large part to internal climate variability. To better serve the community, the uncertainties of predicting/projecting regional sea level changes associated with internal climate variability need to be quantified. Previous research on this topic has used single-model large ensembles with perturbed atmospheric initial conditions (ICs). Here we compare uncertainties associated with perturbing ICs in just the atmosphere and just the ocean using a state-of-the-art coupled climate model. We find that by perturbing the oceanic ICs, the uncertainties in regional sea level changes increase compared to those with perturbed atmospheric ICs. In order to better assess the full spectrum of the impacts of such internal climate variability on regional and global sea level rise, approaches that involve perturbing both atmospheric and oceanic initial conditions are thus necessary.

  19. Role of Perturbing Ocean Initial Condition in Simulated Regional Sea Level Change

    DOE PAGES

    Hu, Aixue; Meehl, Gerald; Stammer, Detlef; ...

    2017-06-05

    Multiple lines of observational evidence indicate that the global climate has been getting warmer since the early 20th century. This warmer climate has led to a global mean sea level rise of about 18 cm during the 20th century, and over 6 cm for the first 15 years of the 21st century. Regionally the sea level rise is not uniform due in large part to internal climate variability. To better serve the community, the uncertainties of predicting/projecting regional sea level changes associated with internal climate variability need to be quantified. Previous research on this topic has used single-model large ensembles with perturbed atmospheric initial conditions (ICs). Here we compare uncertainties associated with perturbing ICs in just the atmosphere and just the ocean using a state-of-the-art coupled climate model. We find that by perturbing the oceanic ICs, the uncertainties in regional sea level changes increase compared to those with perturbed atmospheric ICs. In order to better assess the full spectrum of the impacts of such internal climate variability on regional and global sea level rise, approaches that involve perturbing both atmospheric and oceanic initial conditions are thus necessary.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mink, S. E. de; Belczynski, K., E-mail: S.E.deMink@uva.nl, E-mail: kbelczyn@astrouw.edu.pl

    The initial mass function (IMF), binary fraction, and distributions of binary parameters (mass ratios, separations, and eccentricities) are indispensable inputs for simulations of stellar populations. It is often claimed that these are poorly constrained, significantly affecting evolutionary predictions. Recently, dedicated observing campaigns have provided new constraints on the initial conditions for massive stars. Findings include a larger close binary fraction and a stronger preference for very tight systems. We investigate the impact on the predicted merger rates of neutron stars and black holes. Despite the changes from previous assumptions, we only find an increase of less than a factor of 2 (insignificant compared with evolutionary uncertainties of typically a factor of 10–100). We further show that the uncertainties in the new initial binary properties do not significantly affect (within a factor of 2) our predictions of double compact object merger rates. An exception is the uncertainty in the IMF (variations by a factor of 6 up and down). No significant changes in the distributions of final component masses, mass ratios, chirp masses, and delay times are found. We conclude that the predictions are, for practical purposes, robust against uncertainties in the initial conditions concerning binary parameters, with the exception of the IMF. This eliminates an important layer of the many uncertain assumptions affecting the predictions of merger detection rates with the gravitational wave detectors aLIGO/aVirgo.

  1. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS III: BOUNDARY AND INITIAL CONDITIONS, MODEL GRID RESOLUTION, AND HG(II) REDUCTION MECHANISMS

    EPA Science Inventory

    In this study we investigate the CMAQ model response in terms of simulated mercury concentration and deposition to boundary/initial conditions (BC/IC), model grid resolution (12- versus 36-km), and two alternative Hg(II) reduction mechanisms. The model response to the change of g...

  2. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain the model parameters, it may lead to inaccurate uncertainty quantification. While in this case the data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help understand the cause and characteristics of the parametric uncertainty, and thus turn the data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include the fractions of the total surface site density of the two sites and the surface complex formation constants of the three reactions. A total of seven experiments were conducted under different geochemical conditions to estimate these parameters. The experiments with low initial concentrations of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because those experiments cannot distinguish the relative adsorption affinities of the strong and weak sites. 
    Experiments with high initial concentrations of U(VI) are therefore needed, because in those experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial concentration help modelers turn a curse into a blessing. The data impacts on uncertainty quantification and reduction are quantified using probability density functions of model parameters obtained from Markov chain Monte Carlo simulation with the DREAM algorithm. This study provides insights into model calibration, uncertainty quantification, experiment design, and data collection in groundwater reactive transport modeling and other environmental modeling.
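    The posterior sampling step above can be illustrated with a minimal Metropolis sampler. This is a simplified stand-in for the DREAM algorithm, not the study's actual code: the one-parameter toy model, likelihood, and data below are all illustrative.

```python
import math
import random

def log_posterior(theta, obs, sigma=0.5):
    # Gaussian likelihood around a toy model prediction y = 2 * theta,
    # with a flat prior on [0, 10]; stands in for the SCM likelihood.
    if not 0.0 <= theta <= 10.0:
        return float("-inf")
    return sum(-((y - 2.0 * theta) ** 2) / (2.0 * sigma ** 2) for y in obs)

def metropolis(obs, n=5000, step=0.3, seed=1):
    """Random-walk Metropolis chain over the single parameter theta."""
    random.seed(seed)
    theta = 5.0
    lp = log_posterior(theta, obs)
    samples = []
    for _ in range(n):
        prop = theta + random.gauss(0.0, step)
        lp_prop = log_posterior(prop, obs)
        if math.log(random.random()) < lp_prop - lp:   # accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta)
    return samples

obs = [2.1, 1.9, 2.0]                 # toy observations implying theta near 1
samples = metropolis(obs)
post = samples[1000:]                 # discard burn-in
print(sum(post) / len(post))          # posterior mean estimate
```

    The spread of the post-burn-in samples plays the role of the parametric uncertainty discussed above: well-constraining data narrow it, poorly constraining data leave it wide.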

  3. Quantifying model uncertainty in seasonal Arctic sea-ice forecasts

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, Edward; Barthélemy, Antoine; Chevallier, Matthieu; Cullather, Richard; Fučkar, Neven; Massonnet, François; Posey, Pamela; Wang, Wanqiu; Zhang, Jinlun; Ardilouze, Constantin; Bitz, Cecilia; Vernieres, Guillaume; Wallcraft, Alan; Wang, Muyin

    2017-04-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or post-processing techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
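    The forecast post-processing referred to above can be as simple as removing each model's mean hindcast bias. The sketch below is a generic anomaly-style bias correction with made-up numbers, not the study's actual procedure.

```python
def bias_correct(forecast, hindcasts, observations):
    """Subtract the model's mean bias, estimated over hindcast years,
    from its raw forecast."""
    bias = sum(h - o for h, o in zip(hindcasts, observations)) / len(hindcasts)
    return forecast - bias

# Toy September sea-ice extents (million km^2): this model runs
# about 0.8 high on average over three hindcast years.
hindcasts = [5.8, 5.3, 4.9]
observations = [5.0, 4.5, 4.1]
print(bias_correct(4.6, hindcasts, observations))
```

    After such a correction, systematic model offsets are removed, which is why model uncertainty drops sharply and irreducible error growth dominates, as reported above.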

  4. Mayer control problem with probabilistic uncertainty on initial positions

    NASA Astrophysics Data System (ADS)

    Marigonda, Antonio; Quincampoix, Marc

    2018-03-01

    In this paper we introduce and study an optimal control problem in Mayer form in the space of probability measures on Rn endowed with the Wasserstein distance. Our aim is to study optimality conditions when knowledge of the initial state and velocity is subject to some uncertainty, modeled by a probability measure on Rd and by a vector-valued measure on Rd, respectively. We provide a characterization of the value function of such a problem as the unique solution of a Hamilton-Jacobi-Bellman equation in the space of measures, in a suitable viscosity sense. An application to a pursuit-evasion game with uncertainty in the state space is also discussed, proving the existence of a value for the game.
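    A Mayer-form problem with uncertain initial positions can be stated schematically as follows. This is a hedged sketch in generic notation, not the authors' precise formulation:

```latex
% Minimize a terminal (Mayer) cost over measure-valued trajectories:
% the uncertain initial position is a probability measure \bar\mu,
% transported by admissible velocity fields v_t via the continuity equation.
\min_{v(\cdot)} \; \int_{\mathbb{R}^n} g(x)\, d\mu_T(x)
\quad\text{subject to}\quad
\partial_t \mu_t + \operatorname{div}(v_t\,\mu_t) = 0,
\qquad \mu_0 = \bar\mu .
```

    The value function of such a problem is a function of the measure itself, which is why the associated Hamilton-Jacobi-Bellman equation mentioned above lives in the space of measures rather than in Rn.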

  5. A new framework for quantifying uncertainties in modelling studies for future climates - how more certain are CMIP5 precipitation and temperature simulations compared to CMIP3?

    NASA Astrophysics Data System (ADS)

    Sharma, A.; Woldemeskel, F. M.; Sivakumar, B.; Mehrotra, R.

    2014-12-01

    We outline a new framework for assessing uncertainties in model simulations, be they hydro-ecological simulations for known scenarios, or climate simulations for assumed scenarios representing the future. This framework is illustrated here using GCM projections of future climates for hydrologically relevant variables (precipitation and temperature), with the uncertainty segregated into three dominant components: model uncertainty, scenario uncertainty (representing greenhouse gas emission scenarios), and ensemble uncertainty (representing uncertain initial conditions and states). A novel uncertainty metric, the Square Root Error Variance (SREV), is used to quantify the uncertainties involved. The SREV requires: (1) interpolating raw and corrected GCM outputs to a common grid; (2) converting these to percentiles; (3) estimating SREV for model, scenario, initial condition and total uncertainty at each percentile; and (4) transforming SREV to a time series. The outcome is a spatially varying series of SREVs associated with each model that can be used to assess how uncertain the system is at each simulated point or time. This framework, while illustrated in a climate change context, is applicable for assessing the uncertainties any modelling framework may be subject to. The proposed method is applied to monthly precipitation and temperature from 6 CMIP3 and 13 CMIP5 GCMs across the world. For CMIP3, the B1, A1B and A2 scenarios are considered, whereas for CMIP5 the RCP2.6, RCP4.5 and RCP8.5 scenarios, representing low, medium and high emissions, are used. For both CMIP3 and CMIP5, model structure is the largest source of uncertainty, which reduces significantly after correcting for biases. Scenario uncertainty increases in the future, especially for temperature, due to divergence of the three emission scenarios analysed. While CMIP5 precipitation simulations exhibit a small reduction in total uncertainty over CMIP3, there is almost no reduction observed for temperature projections. 
    Estimation of uncertainty in both space and time sheds light on the spatial and temporal patterns of uncertainties in GCM outputs, providing an effective platform for risk-based assessments of any alternative plans or decisions that may be formulated using GCM simulations.
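    A component of the SREV computation sketched in steps (1)-(4) above might look as follows. This is a simplified reading of the method, not the paper's exact estimator: model uncertainty is taken here as the square root of the variance across models, averaged over scenarios, for percentile-transformed values.

```python
import statistics

def srev_model(values):
    """Model-uncertainty component: spread across models at each
    scenario, averaged over scenarios. `values` maps
    (model, scenario) -> percentile-transformed output."""
    scenarios = {s for (_, s) in values}
    spreads = []
    for s in scenarios:
        column = [v for (m, sc), v in values.items() if sc == s]
        spreads.append(statistics.pvariance(column) ** 0.5)
    return sum(spreads) / len(spreads)

# Toy percentile values for 3 hypothetical models x 2 scenarios.
vals = {("m1", "rcp26"): 0.40, ("m2", "rcp26"): 0.50, ("m3", "rcp26"): 0.60,
        ("m1", "rcp85"): 0.45, ("m2", "rcp85"): 0.55, ("m3", "rcp85"): 0.65}
print(srev_model(vals))
```

    Scenario and initial-condition components would follow the same pattern with the roles of the indices swapped, and step (4) would map the result back from percentile space to a time series.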

  6. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    NASA Astrophysics Data System (ADS)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2017-08-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.

  7. Mesoscale Predictability and Error Growth in Short Range Ensemble Forecasts

    NASA Astrophysics Data System (ADS)

    Gingrich, Mark

    Although it was originally suggested that small-scale, unresolved errors corrupt forecasts at all scales through an inverse error cascade, some authors have proposed that mesoscale circulations resulting from stationary forcing on the larger scale may inherit the predictability of the large-scale motions. Further, the relative contributions of large- and small-scale uncertainties to error growth in the mesoscales remain largely unknown. Here, 100-member ensemble forecasts are initialized from an ensemble Kalman filter (EnKF) to simulate two winter storms that impacted the East Coast of the United States in 2010. Four verification metrics are considered: the local snow water equivalent, total liquid water, and 850 hPa temperature, representing mesoscale features; and the sea level pressure field, representing a synoptic feature. It is found that while the predictability of the mesoscale features can be tied to the synoptic forecast, significant uncertainty existed on the synoptic scale at lead times as short as 18 hours. Therefore, mesoscale details remained uncertain in both storms due to uncertainties at the large scale. Additionally, the ensemble perturbation kinetic energy did not show an appreciable upscale propagation of error for either case. Instead, the initial condition perturbations from the cycling EnKF were maximized at large scales and immediately amplified at all scales, without requiring initial upscale propagation. This suggests that relatively small errors in the synoptic-scale initialization may be more important in limiting predictability than errors in the unresolved, small-scale initial conditions.

  8. Existence and Optimality Conditions for Risk-Averse PDE-Constrained Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kouri, Drew Philip; Surowiec, Thomas M.

    Uncertainty is ubiquitous in virtually all engineering applications, and, for such problems, it is inadequate to simulate the underlying physics without quantifying the uncertainty in unknown or random inputs, boundary and initial conditions, and modeling assumptions. In this paper, we introduce a general framework for analyzing risk-averse optimization problems constrained by partial differential equations (PDEs). In particular, we postulate conditions on the random variable objective function as well as the PDE solution that guarantee existence of minimizers. Furthermore, we derive optimality conditions and apply our results to the control of an environmental contaminant. Lastly, we introduce a new risk measure, called the conditional entropic risk, that fuses desirable properties from both the conditional value-at-risk and the entropic risk measures.

  9. Existence and Optimality Conditions for Risk-Averse PDE-Constrained Optimization

    DOE PAGES

    Kouri, Drew Philip; Surowiec, Thomas M.

    2018-06-05

    Uncertainty is ubiquitous in virtually all engineering applications, and, for such problems, it is inadequate to simulate the underlying physics without quantifying the uncertainty in unknown or random inputs, boundary and initial conditions, and modeling assumptions. In this paper, we introduce a general framework for analyzing risk-averse optimization problems constrained by partial differential equations (PDEs). In particular, we postulate conditions on the random variable objective function as well as the PDE solution that guarantee existence of minimizers. Furthermore, we derive optimality conditions and apply our results to the control of an environmental contaminant. Lastly, we introduce a new risk measure, called the conditional entropic risk, that fuses desirable properties from both the conditional value-at-risk and the entropic risk measures.

  10. Initial uncertainty in Pavlovian reward prediction persistently elevates incentive salience and extends sign-tracking to normally unattractive cues.

    PubMed

    Robinson, Mike J F; Anselme, Patrick; Fischer, Adam M; Berridge, Kent C

    2014-06-01

    Uncertainty is a component of many gambling games and may play a role in incentive motivation and cue attraction. Uncertainty can increase the attractiveness for predictors of reward in the Pavlovian procedure of autoshaping, visible as enhanced sign-tracking (or approach and nibbles) by rats of a metal lever whose sudden appearance acts as a conditioned stimulus (CS+) to predict sucrose pellets as an unconditioned stimulus (UCS). Here we examined how reward uncertainty might enhance incentive salience as sign-tracking both in intensity and by broadening the range of attractive CS+s. We also examined whether initially induced uncertainty enhancements of CS+ attraction can endure beyond uncertainty itself, and persist even when Pavlovian prediction becomes 100% certain. Our results show that uncertainty can broaden incentive salience attribution to make CS cues attractive that would otherwise not be (either because they are too distal from reward or too risky to normally attract sign-tracking). In addition, uncertainty enhancement of CS+ incentive salience, once induced by initial exposure, persisted even when Pavlovian CS-UCS correlations later rose toward 100% certainty in prediction. Persistence suggests an enduring incentive motivation enhancement potentially relevant to gambling, which in some ways resembles incentive-sensitization. Higher motivation to uncertain CS+s leads to more potent attraction to these cues when they predict the delivery of uncertain rewards. In humans, those cues might possibly include the sights and sounds associated with gambling, which contribute a major component of the play immersion experienced by problematic gamblers. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Initial uncertainty in Pavlovian reward prediction persistently elevates incentive salience and extends sign-tracking to normally unattractive cues

    PubMed Central

    Robinson, Mike J. F.; Anselme, Patrick; Fischer, Adam M.; Berridge, Kent C.

    2014-01-01

    Uncertainty is a component of many gambling games and may play a role in incentive motivation and cue attraction. Uncertainty can increase the attractiveness for predictors of reward in the Pavlovian procedure of autoshaping, visible as enhanced sign-tracking (or approach and nibbles) by rats of a metal lever whose sudden appearance acts as a conditioned stimulus (CS+) to predict sucrose pellets as an unconditioned stimulus (UCS). Here we examined how reward uncertainty might enhance incentive salience as sign-tracking both in intensity and by broadening the range of attractive CS+s. We also examined whether initially-induced uncertainty enhancements of CS+ attraction can endure beyond uncertainty itself, and persist even when Pavlovian prediction becomes 100% certain. Our results show that uncertainty can broaden incentive salience attribution to make CS cues attractive that would otherwise not be (either because they are too distal from reward or too risky to normally attract sign-tracking). In addition, uncertainty enhancement of CS+ incentive salience, once induced by initial exposure, persisted even when Pavlovian CS-UCS correlations later rose toward 100% certainty in prediction. Persistence suggests an enduring incentive motivation enhancement potentially relevant to gambling, which in some ways resembles incentive-sensitization. Higher motivation to uncertain CS+s leads to more potent attraction to these cues when they predict the delivery of uncertain rewards. In humans, those cues might possibly include the sights and sounds associated with gambling, which contribute a major component of the play immersion experienced by problematic gamblers. PMID:24631397

  12. Validation of the Small Hot Jet Acoustic Rig for Jet Noise Research

    NASA Technical Reports Server (NTRS)

    Bridges, James; Brown, Clifford A.

    2005-01-01

    The development and acoustic validation of the Small Hot Jet Aeroacoustic Rig (SHJAR) is documented. Originally conceived to support fundamental research in jet noise, the rig has been designed and developed using the best practices of the industry. While validating the rig for acoustic work, a method of characterizing all extraneous rig noise was developed. With this in hand, the researcher can know when the jet data being measured are contaminated and design the experiment around this limitation. Also considered is the question of uncertainty, where it is shown that there is a fundamental uncertainty of 0.5 dB or so in the best experiments, confirmed by repeatability studies. One area not generally accounted for in the uncertainty analysis is the variation that can result from differences in the initial condition of the nozzle shear layer. This initial condition was modified and the differences in both flow and sound were documented. The bottom line is that extreme caution must be applied when working on small jet rigs, but highly accurate results can be obtained independent of scale.

  13. On the long-term memory of the Greenland Ice Sheet

    NASA Astrophysics Data System (ADS)

    Rogozhina, I.; Martinec, Z.; Hagedoorn, J. M.; Thomas, M.; Fleming, K.

    2011-03-01

    In this study, the memory of the Greenland Ice Sheet (GIS) with respect to its past states is analyzed. According to ice core reconstructions, the present-day GIS reflects former climatic conditions dating back to at least 250 thousand years before the present (kyr BP). This fact must be considered when initializing an ice sheet model. The common initialization techniques are paleoclimatic simulations driven by atmospheric forcing inferred from ice core records and steady state simulations driven by the present-day or past climatic conditions. When paleoclimatic simulations are used, the information about the past climatic conditions is partly reflected in the resulting present-day state of the GIS. However, there are several important questions that need to be clarified. First, for how long does the model remember its initial state? Second, it is generally acknowledged that, prior to 100 kyr BP, the longest Greenland ice core record (GRIP) is distorted by ice-flow irregularities. The question arises as to what extent do the uncertainties inherent in the GRIP-based forcing influence the resulting GIS? Finally, how is the modeled thermodynamic state affected by the choice of initialization technique (paleo or steady state)? To answer these questions, a series of paleoclimatic and steady state simulations is carried out. We conclude that (1) the choice of an ice-covered initial configuration shortens the initialization simulation time to 100 kyr, (2) the uncertainties in the GRIP-based forcing affect present-day modeled ice-surface topographies and temperatures only slightly, and (3) the GIS forced by present-day climatic conditions is overall warmer than that resulting from a paleoclimatic simulation.

  14. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
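    The RBDO idea above can be illustrated with a toy loop: pick the cheapest design whose failure probability, estimated by Monte Carlo, meets a reliability target. The limit state (capacity minus a random load) and all numbers below are invented for illustration.

```python
import random

def failure_probability(d, n=20000, seed=0):
    """Monte Carlo estimate of P(failure) for design variable d.
    Toy limit state: capacity d * 10 must exceed a normal load N(8, 1)."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if d * 10.0 < rng.gauss(8.0, 1.0))
    return failures / n

def rbdo(candidates, p_target=0.01):
    """Smallest (assumed cheapest) design meeting the reliability target."""
    for d in sorted(candidates):          # cost assumed increasing in d
        if failure_probability(d) <= p_target:
            return d
    return None                            # no candidate is reliable enough

best = rbdo([0.7, 0.9, 1.1, 1.3])
print(best)
```

    In a real RBDO setting the inner reliability estimate is usually replaced by FORM/SORM approximations rather than brute-force sampling, and the Bayesian extension discussed above would let the load distribution itself be updated from incomplete data.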

  15. Variance reduction through robust design of boundary conditions for stochastic hyperbolic systems of equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nordström, Jan, E-mail: jan.nordstrom@liu.se; Wahlsten, Markus, E-mail: markus.wahlsten@liu.se

    We consider a hyperbolic system with uncertainty in the boundary and initial data. Our aim is to show that different boundary conditions give different convergence rates for the variance of the solution. This means that, with the same knowledge of the data, we can obtain a more or less accurate description of the uncertainty in the solution. A variety of boundary conditions are compared, and both analytical and numerical estimates of the variance of the solution are presented. As an application, we study the effect of this technique on Maxwell's equations as well as on a subsonic outflow boundary for the Euler equations.

  16. CFD Modeling of Superheated Fuel Sprays

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    2008-01-01

    An understanding of fuel atomization and vaporization behavior at superheat conditions has been identified as a topic of importance in the design of modern supersonic engines. As part of the NASA aeronautics initiative, we have undertaken an assessment study to establish the baseline accuracy of existing CFD models used in the evaluation of a flashing jet. In a first attempt toward attaining this goal, we have incorporated an existing superheat vaporization model into our spray solution procedure, with some improvements to combine the existing models valid at superheated conditions with models valid at stable (non-superheated) evaporating conditions. The paper also reports some validation results based on experimental data obtained from the literature for a superheated spray generated by the sudden release of pressurized R134A from a cylindrical nozzle. The predicted profiles for both gas and droplet velocities show reasonable agreement with the measured data and exhibit a self-similar pattern similar to the correlation reported in the literature. Because of the uncertainty involved in the specification of the initial conditions, we have investigated the effect of the initial droplet size distribution on the validation results. The predicted results were found to be sensitive to the initial conditions used for the droplet size specification. However, it was shown that decent droplet size comparisons could be achieved with properly selected initial conditions. For the case considered, it is reasonable to assume that the present vaporization models are capable of providing a reasonable qualitative description of the two-phase jet characteristics generated by a flashing jet. 
    However, some uncertainty remains with regard to the specification of certain initial spray conditions, and there is a need for experimental data on separate gas and liquid temperatures in order to validate the vaporization models based on the Adachi correlation for a liquid involving R134A.

  17. Assessing concentration uncertainty estimates from passive microwave sea ice products

    NASA Astrophysics Data System (ADS)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable, and passive-microwave-derived estimates of concentration form one of the longest satellite-derived climate records. However, until recently, uncertainty estimates were not provided. Numerous validation studies have provided insight into general error characteristics, but these studies found that concentration error varies greatly depending on sea ice conditions. Thus, an uncertainty estimate for each observation is desired, particularly for the initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  18. Trends and uncertainties in budburst projections of Norway spruce in Northern Europe.

    PubMed

    Olsson, Cecilia; Olin, Stefan; Lindström, Johan; Jönsson, Anna Maria

    2017-12-01

    Budburst is regulated by temperature conditions, and a warming climate is associated with earlier budburst. A range of phenology models has been developed to assess climate change effects, and they tend to produce different results. This is mainly caused by different model representations of tree physiology processes, selection of observational data for model parameterization, and selection of climate model data to generate future projections. In this study, we applied (i) Bayesian inference to estimate model parameter values, addressing uncertainties associated with the selection of observational data, (ii) selection of climate model data representative of a larger dataset, and (iii) ensemble modeling over multiple initial conditions, model classes, model parameterizations, and boundary conditions to generate future projections and uncertainty estimates. The ensemble projection indicated that the budburst of Norway spruce in northern Europe will on average take place 10.2 ± 3.7 days earlier in 2051-2080 than in 1971-2000, given climate conditions corresponding to RCP 8.5. Three provenances were assessed separately (one early and two late), and the projections indicated that the ranking among provenances will be preserved in a warmer climate. Structurally complex models were more likely than simple models to fail in predicting budburst for some combinations of site and year. However, they contributed to the overall picture of current understanding of climate impacts on tree phenology by capturing additional aspects of temperature response, for example, chilling. Model parameterizations based on single sites were more likely to result in model failure than parameterizations based on multiple sites, highlighting that model parameterization is sensitive to initial conditions and may not perform well under other climate conditions, whether the change is due to a shift in space or over time. 
By addressing a range of uncertainties, this study showed that ensemble modeling provides a more robust impact assessment than would a single phenology model run.
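
The Bayesian parameter estimation step described above can be illustrated with a minimal sketch. The degree-day budburst model, the synthetic temperature series, the flat prior, and the Gaussian observation error below are all assumptions for illustration, not the authors' actual model:

```python
import math, random

random.seed(1)

def budburst_day(temps, base=5.0, gdd_req=120.0):
    """Day of year when accumulated degree-days above `base` reach `gdd_req`."""
    acc = 0.0
    for day, t in enumerate(temps, start=1):
        acc += max(0.0, t - base)
        if acc >= gdd_req:
            return day
    return len(temps)

# Synthetic "observations": ten years of temperatures and noisy budburst days.
years = [[random.gauss(8.0, 4.0) for _ in range(150)] for _ in range(10)]
obs = [budburst_day(t, gdd_req=120.0) + random.gauss(0, 2) for t in years]

def log_post(gdd_req, sigma=3.0):
    if not (50.0 <= gdd_req <= 300.0):          # flat prior on a plausible range
        return -math.inf
    return sum(-0.5 * ((o - budburst_day(t, gdd_req=gdd_req)) / sigma) ** 2
               for t, o in zip(years, obs))

# Random-walk Metropolis sampler for the degree-day requirement.
chain, cur = [], 150.0
cur_lp = log_post(cur)
for _ in range(4000):
    prop = cur + random.gauss(0, 5.0)
    lp = log_post(prop)
    if math.log(random.random()) < lp - cur_lp:
        cur, cur_lp = prop, lp
    chain.append(cur)

post = chain[1000:]                              # discard burn-in
mean = sum(post) / len(post)
print(f"posterior mean GDD requirement: {mean:.1f}")
```

The spread of the retained chain is a direct estimate of the parameter uncertainty that would then propagate into budburst projections under future climate forcing.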

  19. ICE CONTROL - Towards optimizing wind energy production during icing events

    NASA Astrophysics Data System (ADS)

    Dorninger, Manfred; Strauss, Lukas; Serafin, Stefano; Beck, Alexander; Wittmann, Christoph; Weidle, Florian; Meier, Florian; Bourgeois, Saskia; Cattin, René; Burchhart, Thomas; Fink, Martin

    2017-04-01

    Forecasts of wind power production loss caused by icing weather conditions are produced by a chain of physical models. The model chain consists of a numerical weather prediction model, an icing model and a production loss model. Each element of the model chain is affected by significant uncertainty, which can be quantified using targeted observations and a probabilistic forecasting approach. In this contribution, we present preliminary results from the recently launched project ICE CONTROL, an Austrian research initiative on measurements, probabilistic forecasting, and verification of icing on wind turbine blades. ICE CONTROL includes an experimental field phase, consisting of measurement campaigns in a wind park in Rhineland-Palatinate, Germany, in the winters 2016/17 and 2017/18. Instruments deployed during the campaigns consist of a conventional icing detector on the turbine hub and newly devised ice sensors (eologix Sensor System) on the turbine blades, as well as meteorological sensors for wind, temperature, humidity, visibility, and precipitation type and spectra. Liquid water content and spectral characteristics of super-cooled water droplets are measured using a Fog Monitor FM-120. Three cameras document the icing conditions on the instruments and on the blades. Different modelling approaches are used to quantify the components of the model-chain uncertainties. The uncertainty related to the initial conditions of the weather prediction is evaluated using the existing global ensemble prediction system (EPS) of the European Centre for Medium-Range Weather Forecasts (ECMWF). Furthermore, observation system experiments are conducted with the AROME model and its 3D-Var data assimilation to investigate the impact of additional observations (such as Mode-S aircraft data, SCADA data and MSG cloud mask initialization) on the numerical icing forecast. 
The uncertainty related to model formulation is estimated from multi-physics ensembles based on the Weather Research and Forecasting model (WRF) by perturbing parameters in the physical parameterization schemes. In addition, uncertainties of the icing model and of its adaptations to the rotating turbine blade are addressed. The model forecasts combined with the suite of instruments and their measurements make it possible to conduct a step-wise verification of all the components of the model chain - a novel aspect compared to similar ongoing and completed forecasting projects.

  20. Treatment of uncertainties in atmospheric chemical systems: A combined modeling and experimental approach

    NASA Astrophysics Data System (ADS)

    Pun, Betty Kong-Ling

    1998-12-01

    Uncertainty is endemic in modeling. This thesis is a two-phase program to understand the uncertainties in urban air pollution model predictions and in the field data used to validate them. Part I demonstrates how to improve atmospheric models by analyzing the uncertainties in these models and using the results to guide new experimentation. Part II presents an experiment designed to characterize atmospheric fluctuations, which have significant implications for the model validation process. A systematic study was undertaken to investigate the effects of uncertainties in the SAPRC mechanism for gas-phase chemistry in polluted atmospheres. The uncertainties of more than 500 parameters were compiled, including reaction rate constants, product coefficients, organic composition, and initial conditions. Uncertainty propagation using the Deterministic Equivalent Modeling Method (DEMM) revealed that the uncertainties in ozone predictions can be up to 45% based on these parametric uncertainties. The key parameters found to dominate the uncertainties of the predictions include the photolysis rates of NO2, O3, and formaldehyde; the rate constant for nitric acid formation; and the initial amounts of NOx and VOC. Similar uncertainty analysis procedures applied to two other mechanisms used in regional air quality models led to the conclusion that, in the presence of parametric uncertainties, the mechanisms cannot be discriminated. Research efforts should focus on reducing parametric uncertainties in photolysis rates, reaction rate constants, and source terms. A new tunable diode laser (TDL) infrared spectrometer was designed and constructed to measure multiple pollutants simultaneously in the same ambient air parcels. The sensitivities of the one-hertz measurements were 2 ppb for ozone, 1 ppb for NO, and 0.5 ppb for NO2. Meteorological data were also collected for wind, temperature, and UV intensity. 
    The field data showed clear correlations between ozone, NO, and NO2 on the one-second time scale. Fluctuations in pollutant concentrations were found to be strongly dependent on meteorological conditions. Deposition fluxes calculated using the eddy correlation technique were found to be small on concrete surfaces. These high-time-resolution measurements were used to develop an understanding of the variability in atmospheric measurements, which would be useful in determining the acceptable discrepancy between model and observations. (Copies available exclusively from MIT Libraries, Rm. 14-0551, Cambridge, MA 02139-4307. Ph. 617-253-5668; Fax 617-253-1690.)
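
A brute-force analogue of the parametric uncertainty propagation described above can be sketched with Monte Carlo sampling (DEMM itself is a collocation-based method and far cheaper; the toy NO/NO2/O3 box chemistry and the 30% lognormal rate uncertainties below are invented for illustration):

```python
import math, random

random.seed(0)

def ozone_box(j_no2, k_no_o3, hours=10, dt=1/60):
    """Toy photostationary box model (ppb units): NO2 photolysis produces O3,
    NO + O3 titration removes it. Purely illustrative chemistry."""
    no2, no, o3 = 40.0, 20.0, 30.0
    for _ in range(int(hours / dt)):
        prod = j_no2 * no2
        loss = k_no_o3 * no * o3
        o3 += (prod - loss) * dt
        no2 += (loss - prod) * dt
        no += (prod - loss) * dt
    return o3

# Nominal rate parameters with assumed 30% (1-sigma) lognormal uncertainty.
samples = []
for _ in range(500):
    j = 0.30 * math.exp(random.gauss(0, 0.3))    # photolysis rate, 1/h
    k = 0.012 * math.exp(random.gauss(0, 0.3))   # titration rate, 1/(ppb h)
    samples.append(ozone_box(j, k))

mean = sum(samples) / len(samples)
sd = math.sqrt(sum((s - mean) ** 2 for s in samples) / (len(samples) - 1))
print(f"O3 after 10 h: {mean:.1f} +/- {sd:.1f} ppb")
```

The sample spread plays the role of the prediction uncertainty that the thesis attributes to the parametric uncertainties in the mechanism.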

  1. Stochastic Forcing for High-Resolution Regional and Global Ocean and Atmosphere-Ocean Coupled Ensemble Forecast System

    NASA Astrophysics Data System (ADS)

    Rowley, C. D.; Hogan, P. J.; Martin, P.; Thoppil, P.; Wei, M.

    2017-12-01

    An extended-range ensemble forecast system is being developed in the US Navy Earth System Prediction Capability (ESPC), and a global ocean ensemble generation capability to represent uncertainty in the ocean initial conditions has been developed. At extended forecast times, uncertainty due to model error overtakes initial-condition uncertainty as the primary source of forecast uncertainty. Recently, stochastic parameterization or stochastic forcing techniques have been applied to represent model error in research and operational atmospheric, ocean, and coupled ensemble forecasts. A simple stochastic forcing technique has been developed for application to US Navy high-resolution regional and global ocean models, for use in ocean-only and coupled atmosphere-ocean-ice-wave ensemble forecast systems. Perturbation forcing is added to the tendency equations for the state variables, with the forcing defined by random 3- or 4-dimensional fields whose horizontal, vertical, and temporal correlations are specified to characterize different possible kinds of error. Here, we demonstrate the stochastic forcing in regional and global ensemble forecasts with varying perturbation amplitudes and length and time scales, and assess the change in ensemble skill as measured by a range of deterministic and probabilistic metrics.
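
A minimal sketch of this kind of perturbation forcing, on a 1-D periodic toy model: the forcing field is kept red in time with an AR(1) update and given spatial correlation with a moving-average smoother. The amplitudes, scales, and the trivial "model" are all assumptions for illustration, not the Navy system:

```python
import math, random

random.seed(42)

NX, NSTEPS = 64, 200
DT, TAU, L, SIGMA = 600.0, 6 * 3600.0, 5, 0.05   # step (s), decorrelation time,
                                                 # smoothing half-width (cells),
                                                 # forcing amplitude

def smooth(field, width):
    """Crude spatial correlation: periodic moving average over neighbours."""
    n = len(field)
    return [sum(field[(i + k) % n] for k in range(-width, width + 1))
            / (2 * width + 1) for i in range(n)]

# AR(1) update keeps the perturbation field red (correlated) in time:
#   f_new = a * f_old + sqrt(1 - a^2) * spatially_smoothed_white_noise
a = math.exp(-DT / TAU)
forcing = [0.0] * NX
state = [0.0] * NX                               # toy prognostic variable

for _ in range(NSTEPS):
    white = smooth([random.gauss(0, SIGMA) for _ in range(NX)], L)
    forcing = [a * f + math.sqrt(1 - a * a) * w for f, w in zip(forcing, white)]
    # tendency = physics + stochastic perturbation; physics omitted here
    state = [s + DT * f for s, f in zip(state, forcing)]

rms = math.sqrt(sum(s * s for s in state) / NX)
print(f"RMS of stochastically forced state: {rms:.3f}")
```

Running several instances with different seeds would give an ensemble whose spread reflects the assumed model-error statistics, which is the role the forcing plays in the full system.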

  2. Ensemble Analysis of Variational Assimilation of Hydrologic and Hydrometeorological Data into Distributed Hydrologic Model

    NASA Astrophysics Data System (ADS)

    Lee, H.; Seo, D.; Koren, V.

    2008-12-01

    A prototype 4DVAR (four-dimensional variational) data assimilator for the gridded Sacramento soil-moisture accounting and kinematic-wave routing models in the Hydrology Laboratory's Research Distributed Hydrologic Model (HL-RDHM) has been developed. The prototype assimilates streamflow and in-situ soil moisture data and adjusts gridded precipitation and climatological potential evaporation data to reduce uncertainty in the model initial conditions for improved monitoring and prediction of streamflow and soil moisture at the outlet and at interior locations within the catchment. Due to the large number of degrees of freedom involved, data assimilation (DA) into distributed hydrologic models is complex. To understand and assess the sensitivity of DA performance to uncertainties in the model initial conditions and in the data, two synthetic experiments were carried out in an ensemble framework. Results from the synthetic experiments shed much light on the potential and limitations of DA into distributed models. For an initial real-world assessment, the prototype has also been applied to the headwater basin at Eldon near the Oklahoma-Arkansas border. We present these results and describe the next steps.
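
The essence of variational assimilation of streamflow can be sketched on a toy linear-reservoir model: choose the initial storage that minimizes a background-departure term plus an observation-misfit term over the assimilation window. The model, error variances, and descent scheme below are illustrative assumptions, not HL-RDHM:

```python
import math

# Toy model: linear reservoir S' = -K*S, streamflow q = K*S.
K, DT, NOBS = 0.1, 1.0, 8
S_TRUE, S_BG = 100.0, 60.0                  # true and background initial storage
B, R = 400.0, 4.0                           # background / observation error vars

def forecast_q(s0):
    """Modelled streamflow at observation times 1..NOBS for initial storage s0."""
    return [K * s0 * math.exp(-K * DT * t) for t in range(1, NOBS + 1)]

obs = forecast_q(S_TRUE)                    # perfect synthetic observations

def cost(s0):
    """Variational cost: background term + observation misfit term."""
    jb = (s0 - S_BG) ** 2 / B
    jo = sum((q - y) ** 2 / R for q, y in zip(forecast_q(s0), obs))
    return jb + jo

# Steepest descent with a finite-difference gradient.
s = S_BG
for _ in range(200):
    g = (cost(s + 1e-4) - cost(s - 1e-4)) / 2e-4
    s -= 5.0 * g

print(f"analysed initial storage: {s:.1f} (truth {S_TRUE}, background {S_BG})")
```

The analysis lands between background and truth, weighted by the assumed error variances; operational 4DVAR does the same minimization with an adjoint model instead of finite differences.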

  3. Effects of Parameter Uncertainty on Long-Term Simulations of Lake Alkalinity

    NASA Astrophysics Data System (ADS)

    Lee, Sijin; Georgakakos, Konstantine P.; Schnoor, Jerald L.

    1990-03-01

    A first-order second-moment uncertainty analysis has been applied to two lakes in the Adirondack Park, New York, to assess the long-term response of lakes to acid deposition. Uncertainty due to parameter error and initial condition error was considered. Because the enhanced trickle-down (ETD) model is calibrated with only 3 years of field data and is used to simulate a 50-year period, the uncertainty in the lake alkalinity prediction is relatively large. When a best estimate of parameter uncertainty is used, the annual average alkalinity is predicted to be −11 ± 28 μeq/L for Lake Woods and 142 ± 139 μeq/L for Lake Panther after 50 years. Hydrologic parameters and chemical weathering rate constants contributed most to the uncertainty of the simulations. Results indicate that the uncertainty in long-range predictions of lake alkalinity increased significantly over a 5- to 10-year period and then reached a steady state.
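
A first-order second-moment (FOSM) analysis linearizes the model around the best parameter estimates and sums the variance contributions of each uncertain parameter. The toy alkalinity function and parameter values below are invented for illustration and are not the ETD model:

```python
import math

def alkalinity_50yr(w, q):
    """Toy steady-state alkalinity (ueq/L): weathering supply w (ueq/m2/yr)
    diluted by runoff q (m/yr), minus a fixed acid load. Purely illustrative."""
    acid_load = 60.0
    return w / q - acid_load

# Best estimates and 1-sigma parameter uncertainties (assumed independent).
w0, sw = 80.0, 20.0
q0, sq = 0.5, 0.1

# FOSM: var(y) ~= (dy/dw)^2 var(w) + (dy/dq)^2 var(q), via finite differences.
eps = 1e-6
dy_dw = (alkalinity_50yr(w0 + eps, q0) - alkalinity_50yr(w0 - eps, q0)) / (2 * eps)
dy_dq = (alkalinity_50yr(w0, q0 + eps) - alkalinity_50yr(w0, q0 - eps)) / (2 * eps)

mean = alkalinity_50yr(w0, q0)
var = dy_dw ** 2 * sw ** 2 + dy_dq ** 2 * sq ** 2
print(f"alkalinity: {mean:.0f} +/- {math.sqrt(var):.0f} ueq/L")
# Variance decomposition identifies which parameter dominates the uncertainty:
print(f"weathering share of variance: {dy_dw**2 * sw**2 / var:.0%}")
```

The variance decomposition in the last line is what lets a FOSM study attribute most of the predictive uncertainty to, e.g., hydrologic parameters and weathering rate constants.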

  4. Initiative-Oriented Training.

    DTIC Science & Technology

    1998-06-05

    Conditions, and Standards Prior to STXs 70 13. Knowledge of the Enemy Situation Prior to STXs 71 14. Frequency of MILES Free-Play Exercises 72 ... in addition to realistic sights, sounds, and smells. With regard to force-on-force, or free-play, MILES exercises, uncertainty and initiative can ... sounds, and (to some degree) smells of the battlefield; use MILES force-on-force, free-play exercises to incorporate cognitive mental stressors and

  5. Effects of Heterogeneity and Uncertainties in Sources and Initial and Boundary Conditions on Spatiotemporal Variations of Groundwater Levels

    NASA Astrophysics Data System (ADS)

    Zhang, Y. K.; Liang, X.

    2014-12-01

    Effects of aquifer heterogeneity and of uncertainties in source/sink and in initial and boundary conditions of a groundwater flow model on the spatiotemporal variations of groundwater level, h(x,t), were investigated. Analytical solutions for the variance and covariance of h(x,t) in an unconfined aquifer described by a linearized Boussinesq equation with a white-noise source/sink and a random transmissivity field were derived. It was found that in a typical aquifer the error in h(x,t) at early time is mainly caused by the random initial condition, and that this error decreases with time, approaching a constant level at later times. The period during which the effect of the random initial condition is significant may last a few hundred days in most aquifers. The constant error in groundwater level at later times is due to the combined effects of the uncertain source/sink and flux boundary: the closer to the flux boundary, the larger the error. The error caused by the uncertain head boundary is confined to a narrow zone near that boundary but remains more or less constant over time. The effect of heterogeneity is to increase the variation of the groundwater level, with the maximum effect occurring close to the constant-head boundary because of the linear mean hydraulic gradient. The correlation of groundwater level decreases with temporal interval and spatial distance. In addition, heterogeneity enhances the correlation of groundwater level, especially at large time intervals and small spatial distances.

  6. Intercomparison of model response and internal variability across climate model ensembles

    NASA Astrophysics Data System (ADS)

    Kumar, Devashish; Ganguly, Auroop R.

    2017-10-01

    Characterization of climate uncertainty at regional scales over near-term planning horizons (0-30 years) is crucial for climate adaptation. Climate internal variability (CIV) dominates climate uncertainty over decadal prediction horizons at stakeholders' scales (regional to local). In the literature, CIV has been characterized indirectly using projections of climate change from multi-model ensembles (MME) rather than directly using projections from multiple initial condition ensembles (MICE), primarily because an adequate number of initial condition (IC) runs was not available for any climate model. Nevertheless, the recent availability of a large number of IC runs from one climate model makes it possible, for the first time, to characterize CIV directly from climate model projections and to perform a sensitivity analysis of the dominance of CIV relative to model response variability (MRV). Here, we measure relative agreement (a dimensionless number ranging between 0 and 1, inclusive; a high value indicates low variability and vice versa) among MME and MICE and find that CIV is lower than MRV for all projection time horizons and spatial resolutions for precipitation and temperature. However, CIV exhibits greater dominance over MRV for seasonal and annual mean precipitation at higher latitudes, where signals of climate change are expected to emerge sooner. Furthermore, precipitation exhibits large uncertainties and a rapid decline in relative agreement from global to continental, regional, or local scales for MICE compared to MME. The fractional contribution of uncertainty due to CIV is invariant for precipitation and decreases for temperature as lead time progresses towards the end of the century.

  7. Simulated forecast error and climate drift resulting from the omission of the upper stratosphere in numerical models

    NASA Technical Reports Server (NTRS)

    Boville, Byron A.; Baumhefner, David P.

    1990-01-01

    Using the NCAR Community Climate Model, Version 1, the forecast error growth and the climate drift resulting from the omission of the upper stratosphere are investigated. In the experiment, the control simulation is a seasonal integration of a general circulation model of medium horizontal resolution with 30 levels extending from the surface to the upper mesosphere, while the main experiment uses an identical model except that only the bottom 15 levels (below 10 mb) are retained. It is shown that both random and systematic errors develop rapidly in the lower stratosphere, with some local propagation into the troposphere in the 10-30-day time range. The random error growth rate in the troposphere in the case of the altered upper boundary was found to be slightly faster than that for initial-condition uncertainty alone. However, this is not likely to make a significant impact in operational forecast models, because the initial-condition uncertainty is very large.

  8. Model Forecast Skill and Sensitivity to Initial Conditions in the Seasonal Sea Ice Outlook

    NASA Technical Reports Server (NTRS)

    Blanchard-Wrigglesworth, E.; Cullather, R. I.; Wang, W.; Zhang, J.; Bitz, C. M.

    2015-01-01

    We explore the skill of predictions of September Arctic sea ice extent from dynamical models participating in the Sea Ice Outlook (SIO). Forecasts submitted in August, at roughly 2-month lead times, are skillful. However, skill is lower in forecasts submitted to SIO, which began in 2008, than in hindcasts (retrospective forecasts) of the last few decades. The multimodel mean SIO predictions offer slightly higher skill than the single-model SIO predictions, but neither beats a damped persistence forecast at lead times longer than 2 months. The models are largely unsuccessful at predicting each other, indicating a large difference in model physics and/or initial conditions. Motivated by this, we perform an initial condition sensitivity experiment with four SIO models, applying a fixed −1 m perturbation to the initial sea ice thickness. The significant range of the response among the models suggests that differences in model physics make a significant contribution to forecast uncertainty.

  9. Uncertainties in Future Regional Sea Level Trends: How to Deal with the Internal Climate Variability?

    NASA Astrophysics Data System (ADS)

    Becker, M.; Karpytchev, M.; Hu, A.; Deser, C.; Lennartz-Sassinek, S.

    2017-12-01

    Climate models (CMs) are today the main tools for forecasting sea level rise (SLR) at global and regional scales. CM forecasts are accompanied by inherent uncertainties, and understanding and reducing these uncertainties is a matter of increasing urgency in order to provide robust estimates of SLR impact on coastal societies, which must make sustainable choices of climate adaptation strategy. These CM uncertainties are linked to structural model formulation, initial conditions, emission scenario, and internal variability. The internal variability arises from complex nonlinear interactions within the Earth's climate system and can induce diverse quasi-periodic oscillatory modes and long-term persistence. To quantify the effects of internal variability, most studies have used multi-model ensembles or sea level projections from a single model run with perturbed initial conditions. However, large ensembles are generally unavailable or too small, and are computationally expensive to produce. In this study, we use a power-law scaling of sea level fluctuations, as observed in many other geophysical signals and natural systems, to characterize the internal climate variability. Within this statistical framework, we (1) use the pre-industrial control run of the National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM) to test the robustness of the power-law scaling hypothesis; (2) employ power-law statistics as a tool for assessing the spread of 21st-century regional sea level projections due to internal climate variability in NCAR-CCSM; (3) compare the uncertainties in predicted sea level changes obtained from NCAR-CCSM multi-member ensemble simulations with estimates derived for power-law processes; and (4) explore the sensitivity of the spatial patterns of internal variability and its effects on regional sea level projections.
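
Power-law scaling of a time series is commonly diagnosed with detrended fluctuation analysis (DFA), whose exponent characterizes the long-term persistence invoked above. A self-contained sketch of first-order DFA, illustrated on white noise (which should yield an exponent near 0.5; persistent power-law records give larger values):

```python
import math, random

random.seed(7)

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return b, my - b * mx

def dfa_exponent(x, window_sizes):
    """DFA (order 1): slope of log F(n) versus log n for the series profile."""
    mean = sum(x) / len(x)
    prof, acc = [], 0.0
    for v in x:                                  # cumulative-sum profile
        acc += v - mean
        prof.append(acc)
    logs_n, logs_f = [], []
    for n in window_sizes:
        resid2, count = 0.0, 0
        for start in range(0, len(prof) - n + 1, n):
            seg = prof[start:start + n]
            t = list(range(n))
            b, a = linfit(t, seg)                # linear detrend per window
            resid2 += sum((s - (a + b * ti)) ** 2 for ti, s in zip(t, seg))
            count += n
        logs_n.append(math.log(n))
        logs_f.append(0.5 * math.log(resid2 / count))
    slope, _ = linfit(logs_n, logs_f)
    return slope

noise = [random.gauss(0, 1) for _ in range(4096)]
alpha = dfa_exponent(noise, [8, 16, 32, 64, 128, 256])
print(f"DFA scaling exponent: {alpha:.2f}")
```

Applied to a control-run sea level series instead of white noise, the fitted exponent is the scaling parameter from which the spread of long-horizon projections can be extrapolated.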

  10. On the appropriate definition of soil profile configuration and initial conditions for land surface-hydrology models in cold regions

    NASA Astrophysics Data System (ADS)

    Sapriza-Azuri, Gonzalo; Gamazo, Pablo; Razavi, Saman; Wheater, Howard S.

    2018-06-01

    Arctic and subarctic regions are amongst the most susceptible regions on Earth to global warming and climate change. Understanding and predicting the impact of climate change in these regions require a proper process representation of the interactions between climate, carbon cycle, and hydrology in Earth system models. This study focuses on land surface models (LSMs) that represent the lower boundary condition of general circulation models (GCMs) and regional climate models (RCMs), which simulate climate change evolution at the global and regional scales, respectively. LSMs typically utilize a standard soil configuration with a depth of no more than 4 m, whereas for cold, permafrost regions, field experiments show that attention to deep soil profiles is needed to understand and close the water and energy balances, which are tightly coupled through the phase change. To address this gap, we design and run a series of model experiments with a one-dimensional LSM, called CLASS (Canadian Land Surface Scheme), as embedded in the MESH (Modélisation Environmentale Communautaire - Surface and Hydrology) modelling system, to (1) characterize the effect of soil profile depth under different climate conditions and in the presence of parameter uncertainty; (2) assess the effect of including or excluding the geothermal flux in the LSM at the bottom of the soil column; and (3) develop a methodology for temperature profile initialization in permafrost regions, where the system has an extended memory, by the use of paleo-records and bootstrapping. Our study area is in Norman Wells, Northwest Territories of Canada, where measurements of soil temperature profiles and historical reconstructed climate data are available. Our results demonstrate a dominant role for parameter uncertainty, which is often neglected in LSMs. 
    Considering such high sensitivity to parameter values and dependency on climate conditions, we show that a minimum depth of 20 m is essential to adequately represent the temperature dynamics. We further show that our proposed initialization procedure is effective and robust to uncertainty in paleo-climate reconstructions and that more than 300 years of reconstructed climate time series are needed for proper model initialization.

  11. Sensitivity of Emissions to Uncertainties in Residual Gas Fraction Measurements in Automotive Engines: A Numerical Study

    DOE PAGES

    Aithal, S. M.

    2018-01-01

    Initial conditions of the working fluid (air-fuel mixture) within an engine cylinder, namely, mixture composition and temperature, greatly affect the combustion characteristics and emissions of an engine. In particular, the percentage of residual gas fraction (RGF) in the engine cylinder can significantly alter the temperature and composition of the working fluid as compared with the air-fuel mixture inducted into the engine, thus affecting engine-out emissions. Accurate measurement of the RGF is cumbersome and expensive, making it hard to accurately characterize the initial mixture composition and temperature in any given engine cycle. This uncertainty can lead to challenges in accurately interpreting experimental emissions data and in implementing real-time control strategies. Quantifying the effects of the RGF can have important implications for the diagnostics and control of internal combustion engines. This paper reports on the use of a well-validated, two-zone quasi-dimensional model to compute the engine-out NO and CO emissions in a gasoline engine. The effect of varying the RGF on the emissions under lean, near-stoichiometric, and rich engine conditions was investigated. Numerical results show that small uncertainties (~2-4%) in the measured/computed values of the RGF can significantly affect the engine-out NO/CO emissions.

  13. Long-time uncertainty propagation using generalized polynomial chaos and flow map composition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luchtenburg, Dirk M., E-mail: dluchten@cooper.edu; Brunton, Steven L.; Rowley, Clarence W.

    2014-10-01

    We present an efficient and accurate method for long-time uncertainty propagation in dynamical systems. Uncertain initial conditions and parameters are both addressed. The method approximates the intermediate short-time flow maps by spectral polynomial bases, as in the generalized polynomial chaos (gPC) method, and uses flow map composition to construct the long-time flow map. In contrast to the gPC method, this approach has spectral error convergence for both short and long integration times. The short-time flow map is characterized by small stretching and folding of the associated trajectories and hence can be well represented by a relatively low-degree basis. The composition of these low-degree polynomial bases then accurately describes the uncertainty behavior for long integration times. The key to the method is that the degree of the resulting polynomial approximation increases exponentially in the number of time intervals, while the number of polynomial coefficients either remains constant (for an autonomous system) or increases linearly in the number of time intervals (for a non-autonomous system). The findings are illustrated on several numerical examples including a nonlinear ordinary differential equation (ODE) with an uncertain initial condition, a linear ODE with an uncertain model parameter, and a two-dimensional, non-autonomous double gyre flow.
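
The core idea, replacing one expensive long-time map by many compositions of a cheap polynomial short-time map, can be sketched on a 1-D ODE. Here the short-time flow map of dx/dt = -x^3 is approximated by a Chebyshev-node Lagrange interpolant (a simple stand-in for the spectral basis of the paper) and composed 50 times; the analytic solution provides a check:

```python
import math

def rk4_step(x, dt, f):
    """One classical Runge-Kutta step for an autonomous scalar ODE."""
    k1 = f(x); k2 = f(x + dt/2*k1); k3 = f(x + dt/2*k2); k4 = f(x + dt*k3)
    return x + dt / 6 * (k1 + 2*k2 + 2*k3 + k4)

f = lambda x: -x ** 3                    # toy nonlinear dynamics

# Chebyshev nodes on [-1, 1] and the short-time flow map evaluated there.
N, DT, SUBSTEPS = 9, 0.1, 10
nodes = [math.cos(math.pi * (2*i + 1) / (2*N)) for i in range(N)]

def short_map_exact(x):
    for _ in range(SUBSTEPS):
        x = rk4_step(x, DT / SUBSTEPS, f)
    return x

vals = [short_map_exact(xn) for xn in nodes]

def short_map_poly(x):
    """Lagrange interpolant of the short-time flow map (the cheap surrogate)."""
    total = 0.0
    for i in range(N):
        li = 1.0
        for j in range(N):
            if j != i:
                li *= (x - nodes[j]) / (nodes[i] - nodes[j])
        total += vals[i] * li
    return total

def long_map(x, n_compositions):
    """Long-time map via repeated composition of the polynomial short map."""
    for _ in range(n_compositions):
        x = short_map_poly(x)
    return x

# Propagate an uncertain initial condition x0 in [0.2, 0.8] to t = 50*DT.
samples = [0.2 + 0.6 * k / 99 for k in range(100)]
approx = [long_map(x0, 50) for x0 in samples]
exact = [x0 / math.sqrt(1 + 2 * 50 * DT * x0 * x0) for x0 in samples]  # analytic
err = max(abs(a - e) for a, e in zip(approx, exact))
print(f"max error after 50 compositions: {err:.2e}")
```

Because each short-time map involves little stretching and folding, a degree-8 interpolant captures it almost exactly, and the composed surrogate stays accurate over the full horizon, which is the mechanism the abstract describes.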

  14. Stochastic Ocean Predictions with Dynamically-Orthogonal Primitive Equations

    NASA Astrophysics Data System (ADS)

    Subramani, D. N.; Haley, P., Jr.; Lermusiaux, P. F. J.

    2017-12-01

    The coastal ocean is a prime example of multiscale nonlinear fluid dynamics. Ocean fields in such regions are complex and intermittent, with nonstationary heterogeneous statistics. Because of the limited measurements, there are multiple sources of uncertainty, including the initial conditions, boundary conditions, forcing, parameters, and even the model parameterizations and equations themselves. For efficient and rigorous quantification and prediction of these uncertainties, the stochastic Dynamically Orthogonal (DO) PDEs for a primitive-equation ocean modeling system with a nonlinear free surface are derived, and numerical schemes for their space-time integration are obtained. Detailed numerical studies with idealized-to-realistic regional ocean dynamics are completed. These include consistency checks for the numerical schemes and comparisons with ensemble realizations. As an illustrative example, we simulate the 4-d multiscale uncertainty in the Middle Atlantic/New York Bight region during January to March 2017. To provide initial conditions for the uncertainty subspace, uncertainties in the region were objectively analyzed using historical data. The DO primitive equations were subsequently integrated in space and time. The probability distribution function (pdf) of the ocean fields is compared to in-situ, remote sensing, and opportunity data collected during the coincident POSYDON experiment. Results show that our probabilistic predictions had skill and were 3 to 4 orders of magnitude faster than classic ensemble schemes.

  15. A New Approach in Generating Meteorological Forecasts for Ensemble Streamflow Forecasting using Multivariate Functions

    NASA Astrophysics Data System (ADS)

    Khajehei, S.; Madadgar, S.; Moradkhani, H.

    2014-12-01

    The reliability and accuracy of hydrological predictions are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model parameters, and model structure. To reduce the total uncertainty in hydrological applications, one approach is to reduce the uncertainty in the meteorological forcing using statistical methods based on conditional probability density functions (pdfs). However, one requirement of current methods is to assume a Gaussian distribution for the marginal distributions of the observed and modeled meteorology. Here we propose a Bayesian approach based on copula functions to develop the conditional distribution of the precipitation forecast needed to drive a hydrologic model for a sub-basin in the Columbia River Basin. Copula functions are introduced as an alternative approach for capturing the uncertainties related to meteorological forcing. Copulas are multivariate joint distributions built from univariate marginal distributions, capable of modeling the joint behavior of variables with any level of correlation and dependency. The method is applied to the monthly CPC forecast at 0.25 x 0.25 degree resolution to reproduce the PRISM dataset over 1970-2000. Results are compared with the Ensemble Pre-Processor approach, a common procedure used by National Weather Service River Forecast Centers, in reproducing the observed climatology during a ten-year verification period (2000-2010).

  16. Multi-parametric variational data assimilation for hydrological forecasting

    NASA Astrophysics Data System (ADS)

    Alvarado-Montero, R.; Schwanenberg, D.; Krahe, P.; Helmke, P.; Klein, B.

    2017-12-01

    Ensemble forecasting is increasingly applied in flow forecasting systems to provide users with a better understanding of forecast uncertainty and, consequently, to support better-informed decisions. A common practice in probabilistic streamflow forecasting is to force a deterministic hydrological model with an ensemble of numerical weather predictions. This approach represents meteorological uncertainty but neglects the uncertainty of the hydrological model as well as of its initial conditions. Complementary approaches use probabilistic data assimilation techniques to obtain a variety of initial states, or represent model uncertainty by model pools instead of single deterministic models. This paper introduces a novel approach that extends a variational data assimilation based on Moving Horizon Estimation to enable the assimilation of observations into multi-parametric model pools. It results in a probabilistic estimate of initial model states that takes into account the parametric model uncertainty in the data assimilation. The assimilation technique is applied to the uppermost area of the River Main in Germany. We use different parametric pools, each with five parameter sets, to assimilate streamflow data as well as remotely sensed data from the H-SAF project. We assess the impact of the assimilation on the lead-time performance of perfect forecasts (i.e., observed data as forcing variables) as well as of deterministic and probabilistic forecasts from ECMWF. The multi-parametric assimilation shows an improvement of up to 23% in CRPS performance and approximately 20% in Brier skill scores with respect to the deterministic approach. It also improves the skill of the forecast in terms of the rank histogram and produces a narrower ensemble spread.
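
The CRPS used above to score the ensembles has a convenient sample-based form for an m-member ensemble, CRPS = E|X - y| - (1/2)E|X - X'|. A minimal sketch with two synthetic ensembles (the values are invented for illustration):

```python
import random

def crps_ensemble(members, y):
    """Sample-based CRPS: E|X - y| - 0.5 * E|X - X'| (lower is better)."""
    m = len(members)
    term1 = sum(abs(x - y) for x in members) / m
    term2 = sum(abs(a - b) for a in members for b in members) / (m * m)
    return term1 - 0.5 * term2

random.seed(11)
obs = 3.0
sharp = [random.gauss(3.0, 0.5) for _ in range(50)]   # accurate, narrow ensemble
broad = [random.gauss(5.0, 2.0) for _ in range(50)]   # biased, wide ensemble
print(f"CRPS sharp ensemble: {crps_ensemble(sharp, obs):.2f}")
print(f"CRPS broad ensemble: {crps_ensemble(broad, obs):.2f}")
```

The score rewards both accuracy and sharpness, which is why a well-centered, tight ensemble scores lower (better) than a biased, diffuse one.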

  17. Initial conditions for cosmological perturbations

    NASA Astrophysics Data System (ADS)

    Ashtekar, Abhay; Gupt, Brajesh

    2017-02-01

    Penrose proposed that the big bang singularity should be constrained by requiring that the Weyl curvature vanishes there. The idea behind this past hypothesis is attractive because it constrains the initial conditions for the universe in geometric terms and is not confined to a specific early universe paradigm. However, the precise statement of Penrose’s hypothesis is tied to classical space-times and furthermore restricts only the gravitational degrees of freedom. These are encapsulated only in the tensor modes of the commonly used cosmological perturbation theory. Drawing inspiration from the underlying idea, we propose a quantum generalization of Penrose’s hypothesis using the Planck regime in place of the big bang, and simultaneously incorporating tensor as well as scalar modes. Initial conditions selected by this generalization constrain the universe to be as homogeneous and isotropic in the Planck regime as permitted by the Heisenberg uncertainty relations.

  18. Robust fixed-time synchronization of delayed Cohen-Grossberg neural networks.

    PubMed

    Wan, Ying; Cao, Jinde; Wen, Guanghui; Yu, Wenwu

    2016-01-01

    The fixed-time master-slave synchronization of Cohen-Grossberg neural networks with parameter uncertainties and time-varying delays is investigated. Compared with finite-time synchronization, where the convergence time depends on the initial synchronization errors, the settling time of fixed-time synchronization can be adjusted to desired values regardless of initial conditions. A novel synchronization control strategy for the slave neural network is proposed. By utilizing Filippov discontinuity theory and Lyapunov stability theory, sufficient conditions are provided for selecting the control parameters that ensure synchronization within the required convergence time, even in the presence of parameter uncertainties. Corresponding criteria for tuning the control inputs are also derived for finite-time synchronization. Finally, two numerical examples are given to illustrate the validity of the theoretical results.
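    For context, the fixed-time property described above is commonly established with a Lyapunov bound of the following textbook form (a generic result, not necessarily the authors' exact criterion):

```latex
% If a Lyapunov function V for the synchronization error satisfies
\dot{V}(t) \le -\alpha V(t)^{p} - \beta V(t)^{q},
\qquad \alpha, \beta > 0, \quad 0 < p < 1 < q,
% then the error reaches zero within a settling time bounded
% independently of the initial conditions:
T_{\max} = \frac{1}{\alpha(1-p)} + \frac{1}{\beta(q-1)} .
```

    The bound depends only on the controller gains, which is why the settling time can be tuned to a desired value regardless of the initial synchronization errors.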

  19. Insights into the deterministic skill of air quality ensembles from the analysis of AQMEII data

    EPA Science Inventory

    Simulations from chemical weather models are subject to uncertainties in the input data (e.g. emission inventory, initial and boundary conditions) as well as those intrinsic to the model (e.g. physical parameterization, chemical mechanism). Multi-model ensembles can improve the f...

  20. Monitoring Top-of-Atmosphere Radiative Energy Imbalance for Climate Prediction

    NASA Technical Reports Server (NTRS)

    Lin, Bing; Chambers, Lin H.; Stackhouse, Paul W., Jr.; Minnis, Patrick

    2009-01-01

    Large climate feedback uncertainties limit the prediction accuracy of the Earth's future climate under an increased-CO2 atmosphere. One potential approach to reducing the feedback uncertainties, using satellite observations of the top-of-atmosphere (TOA) radiative energy imbalance, is explored. Instead of solving the initial condition problem as in previous energy balance analyses, the current study focuses on the boundary condition problem, with further consideration of climate system memory and deep ocean heat transport, which is more applicable to the climate system. Along with surface temperature measurements of the present climate, the climate feedbacks are obtained based on the constraints of the TOA radiation imbalance. Compared to the feedback factor of 3.3 W/sq m/K for a neutral climate system, the estimated feedback factor for the current climate system ranges from -1.3 to -1.0 W/sq m/K with an uncertainty of +/-0.26 W/sq m/K. That is, a positive climate feedback is found because of the measured TOA net radiative heating (0.85 W/sq m) of the climate system. The uncertainty is caused by the uncertainties in the climate memory length. The estimated time constant of the climate is large (70 to approx. 120 years), implying that the climate is not in an equilibrium state under the increasing CO2 forcing of the last century.
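    The kind of constraint used here can be illustrated with generic zero-dimensional energy-balance bookkeeping (a sketch of the standard relation N = F - lambda * dT, with assumed example values, not the authors' exact method or numbers):

```python
# Generic energy-balance bookkeeping (illustration only): given radiative
# forcing F, measured TOA imbalance N, and observed surface warming dT,
# the feedback parameter follows from N = F - lam * dT.

def feedback_parameter(F, N, dT):
    """Feedback parameter lam = (F - N) / dT, in W/sq m/K."""
    return (F - N) / dT

# Assumed example values: F = 2.3 W/sq m of forcing, N = 0.85 W/sq m
# measured imbalance (as quoted above), dT = 0.8 K of observed warming.
lam = feedback_parameter(2.3, 0.85, 0.8)
assert abs(lam - 1.8125) < 1e-9
```

    The measured imbalance N thus constrains the feedback once the forcing and warming are specified; uncertainty in the climate memory length enters through how dT is attributed to the forcing history.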

  1. Understanding Climate Uncertainty with an Ocean Focus

    NASA Astrophysics Data System (ADS)

    Tokmakian, R. T.

    2009-12-01

    Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions describe the state of the weather or climate at the beginning of the simulation and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and methods to quantify the reliability of a simulation. If quantitative uncertainty questions about the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007].
The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P. and D. Stephenson, Climate Change: A Changing Climate for Prediction, 2007, Science 317 (5835), 207, DOI: 10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations, Climatic Change, 81, 247-264. Smith L., 2002, What might we learn from climate forecasts? Proc. Nat’l Academy of Sciences, Vol. 99, suppl. 1, 2487-2492 doi:10.1073/pnas.012580599.

  2. Effects of Ensemble Configuration on Estimates of Regional Climate Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldenson, N.; Mauger, G.; Leung, L. R.

    Internal variability in the climate system can contribute substantial uncertainty in climate projections, particularly at regional scales. Internal variability can be quantified using large ensembles of simulations that are identical but for perturbed initial conditions. Here we compare methods for quantifying internal variability. Our study region spans the west coast of North America, which is strongly influenced by El Niño and other large-scale dynamics through their contribution to large-scale internal variability. Using a statistical framework to simultaneously account for multiple sources of uncertainty, we find that internal variability can be quantified consistently using a large ensemble or an ensemble of opportunity that includes small ensembles from multiple models and climate scenarios. The latter also produces estimates of uncertainty due to model differences. We conclude that projection uncertainties are best assessed using small single-model ensembles from as many model-scenario pairings as computationally feasible, which has implications for ensemble design in large modeling efforts.

  3. The Canadian Hydrological Model (CHM): A multi-scale, variable-complexity hydrological model for cold regions

    NASA Astrophysics Data System (ADS)

    Marsh, C.; Pomeroy, J. W.; Wheater, H. S.

    2016-12-01

    There is a need for hydrological land surface schemes that can link to atmospheric models, provide hydrological prediction at multiple scales, and guide the development of multiple-objective water prediction systems. Distributed raster-based models suffer from an overrepresentation of topography, leading to wasted computational effort that increases uncertainty due to greater numbers of parameters and initial conditions. The Canadian Hydrological Model (CHM) is a modular, multiphysics, spatially distributed modelling framework designed for representing hydrological processes, including those that operate in cold regions. Unstructured meshes permit variable spatial resolution, allowing coarse resolutions where spatial variability is low and fine resolutions where required. Model uncertainty is reduced by decreasing the number of computational elements relative to high-resolution rasters. CHM uses a novel multi-objective approach to unstructured triangular mesh generation that fulfills hydrologically important constraints (e.g., basin boundaries, water bodies, soil classification, land cover, elevation, and slope/aspect). This provides an efficient spatial representation of parameters and initial conditions, as well as well-formed and well-graded triangles that are suitable for numerical discretization. CHM uses high-quality open-source libraries and high-performance computing paradigms to provide a framework that allows for integrating current state-of-the-art process algorithms. The impact of changes to model structure, including individual algorithms, parameters, initial conditions, driving meteorology, and spatial/temporal discretization, can be easily tested. Initial testing of CHM compared spatial scales and model complexity for a spring melt period at a sub-arctic mountain basin. The meshing algorithm reduced the total number of computational elements and preserved the spatial heterogeneity of predictions.

  4. Robust fixed order dynamic compensation for large space structure control

    NASA Technical Reports Server (NTRS)

    Calise, Anthony J.; Byrns, Edward V., Jr.

    1989-01-01

    A simple formulation for designing fixed order dynamic compensators which are robust to both uncertainty at the plant input and structured uncertainty in the plant dynamics is presented. The emphasis is on designing low order compensators for systems of high order. The formulation is done in an output feedback setting which exploits an observer canonical form to represent the compensator dynamics. The formulation also precludes the use of direct feedback of the plant output. The main contribution lies in defining a method for penalizing the states of the plant and of the compensator, and for choosing the distribution on initial conditions so that the loop transfer matrix approximates that of a full state design. To improve robustness to parameter uncertainty, the formulation avoids the introduction of sensitivity states, which has led to complex formulations in earlier studies where only structured uncertainty has been considered.

  5. Internal Variability-Generated Uncertainty in East Asian Climate Projections Estimated with 40 CCSM3 Ensembles.

    PubMed

    Yao, Shuai-Lei; Luo, Jing-Jia; Huang, Gang

    2016-01-01

    Regional climate projections are challenging because of large uncertainty, particularly that stemming from unpredictable, internal variability of the climate system. Here, we examine the internal variability-induced uncertainty in precipitation and surface air temperature (SAT) trends during 2005-2055 over East Asia based on 40-member ensemble projections of the Community Climate System Model Version 3 (CCSM3). The model ensembles are generated from a suite of different atmospheric initial conditions using the same SRES A1B greenhouse gas scenario. We find that projected precipitation trends are subject to considerably larger internal uncertainty, and hence have lower confidence, compared to the projected SAT trends in both boreal winter and summer. Projected SAT trends in winter have relatively higher uncertainty than those in summer. In addition, the lower-level atmospheric circulation has larger uncertainty than the mid-level circulation. Based on k-means cluster analysis, we demonstrate that a substantial portion of the internally induced precipitation and SAT trends arises from internal large-scale atmospheric circulation variability. These results highlight the importance of internal climate variability in affecting regional climate projections on multi-decadal timescales.

  6. Signal to noise quantification of regional climate projections

    NASA Astrophysics Data System (ADS)

    Li, S.; Rupp, D. E.; Mote, P.

    2016-12-01

    One of the biggest challenges in interpreting climate model outputs for impacts studies and adaptation planning is understanding the sources of disagreement among models (which is often used, imperfectly, as a stand-in for system uncertainty). Internal variability is a primary source of uncertainty in climate projections, especially for precipitation, for which models disagree about even the sign of changes in large areas like the continental US. Taking advantage of a large initial-condition ensemble of regional climate simulations, this study quantifies the magnitude of changes forced by increasing greenhouse gas concentrations relative to internal variability. Results come from a large initial-condition ensemble of regional climate model simulations generated by weather@home, a citizen science computing platform, in which the western United States climate was simulated for the recent past (1985-2014) and future (2030-2059) using a 25-km horizontal resolution regional climate model (HadRM3P) nested in a global atmospheric model (HadAM3P). We quantify grid-point-level signal-to-noise not just in the temperature and precipitation responses, but also in the energy and moisture flux terms related to those responses, to provide important insights regarding uncertainty in climate change projections at local and regional scales. These results will aid modelers in determining appropriate ensemble sizes for different climate variables and help users of climate model output with interpreting climate model projections.
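    A grid-point signal-to-noise diagnostic of the kind described can be sketched as follows (a toy illustration with made-up per-member climatologies, not weather@home output):

```python
import statistics

# Sketch: forced signal = ensemble-mean change between two climatologies;
# noise = spread of that change across initial-condition ensemble members.

def signal_to_noise(past, future):
    """`past` and `future` are per-member climatologies (e.g. 30-year
    means of a variable at one grid point), one entry per member."""
    changes = [f - p for p, f in zip(past, future)]
    signal = statistics.mean(changes)        # forced response estimate
    noise = statistics.stdev(changes)        # internal variability estimate
    return signal / noise

# Members agreeing on ~+1 warming with little scatter -> strong signal:
assert signal_to_noise([10.0, 10.1, 9.9], [11.0, 11.2, 10.9]) > 5
```

    For precipitation the member-to-member scatter is typically much larger relative to the forced change, which is why its signal-to-noise at a grid point can be low even when the temperature signal is clear.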

  7. Surfing on the edge: chaos versus near-integrability in the system of Jovian planets

    NASA Astrophysics Data System (ADS)

    Hayes, Wayne B.

    2008-05-01

    We demonstrate that the system of Sun and Jovian planets, integrated for 200 Myr as an isolated five-body system using many sets of initial conditions, all within the uncertainty bounds of their currently known positions, can display both chaos and near-integrability. The conclusion is consistent across four different integrators, including several comparisons against integrations utilizing quadruple precision. We demonstrate that the Wisdom-Holman symplectic map using simple symplectic correctors, as implemented in MERCURY 6.2, gives a reliable characterization of the existence of chaos for a particular initial condition only with time steps less than about 10 d, corresponding to about 400 steps per orbit. We also integrate the canonical DE405 initial condition out to 5 Gyr, and show that it has a Lyapunov time of 200-400 Myr, opening the remote possibility of accurate prediction of the Jovian planetary positions for 5 Gyr.
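    The Lyapunov time quoted above is the e-folding time of small perturbations to the initial condition. A toy stand-in (the chaotic logistic map, not an N-body integration) shows how the exponent is estimated by averaging the local stretching rate along an orbit:

```python
import math

# Toy illustration: Lyapunov exponent of the logistic map x -> 4x(1-x),
# whose derivative is 4 - 8x. The Lyapunov time is 1 / exponent; for the
# planetary system the same idea is applied to neighboring trajectories.

def lyapunov_exponent(x0, n=100000):
    """Average log stretching rate |f'(x)| = |4 - 8x| along an orbit."""
    x, total = x0, 0.0
    for _ in range(n):
        total += math.log(abs(4.0 - 8.0 * x))
        x = 4.0 * x * (1.0 - x)
    return total / n  # nats per iteration

lam = lyapunov_exponent(0.2)
# Known analytic result for this map: lambda = ln 2 per iteration
assert abs(lam - math.log(2)) < 0.05
```

    A Lyapunov time of 200-400 Myr means initial-condition errors grow by a factor of e every 200-400 Myr, which is what makes 5 Gyr predictions only remotely, rather than routinely, possible.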

  8. Adaptive estimation of nonlinear parameters of a nonholonomic spherical robot using a modified fuzzy-based speed gradient algorithm

    NASA Astrophysics Data System (ADS)

    Roozegar, Mehdi; Mahjoob, Mohammad J.; Ayati, Moosa

    2017-05-01

    This paper deals with adaptive estimation of the unknown parameters and states of a pendulum-driven spherical robot (PDSR), which is a nonlinear-in-parameters (NLP) chaotic system with parametric uncertainties. First, the mathematical model of the robot is deduced by applying the Newton-Euler methodology for a system of rigid bodies. Then, based on the speed gradient (SG) algorithm, the states and unknown parameters of the robot are estimated online for different step-length gains and initial conditions. The estimated parameters are updated adaptively according to the error between estimated and true state values. Since the errors of the estimated states and parameters, as well as the convergence rates, depend significantly on the value of the step-length gain, this gain should be chosen optimally. Hence, a heuristic fuzzy logic controller is employed to adjust the gain adaptively. Simulation results indicate that the proposed approach is highly encouraging for identification of this NLP chaotic system even when the initial conditions change and the uncertainties increase; it is therefore suitable for implementation on a real robot.

  9. Adaptive change in corporate control practices.

    PubMed

    Alexander, J A

    1991-03-01

    Multidivisional organizations are not concerned with what structure to adopt but with how they should exercise control within the divisional form to achieve economic efficiencies. Using an information-processing framework, I examined control arrangements between the headquarters and operating divisions of such organizations and how managers adapted control practices to accommodate increasing environmental uncertainty. Also considered were the moderating effects of contextual attributes on such adaptive behavior. Analyses of panel data from 97 multihospital systems suggested that organizations generally practice selective decentralization under conditions of increasing uncertainty but that organizational age, dispersion, and initial control arrangements significantly moderate the direction and magnitude of such changes.

  10. Six Degree-of-Freedom Entry Dispersion Analysis for the METEOR Recovery Module

    NASA Technical Reports Server (NTRS)

    Desai, Prasun N.; Braun, Robert D.; Powell, Richard W.; Engelund, Walter C.; Tartabini, Paul V.

    1996-01-01

    The present study performs a six degree-of-freedom entry dispersion analysis for the Multiple Experiment Transporter to Earth Orbit and Return (METEOR) mission. METEOR offered the capability of flying a recoverable science package in a microgravity environment. However, since the Recovery Module has no active control system, an accurate determination of the splashdown position is difficult because no opportunity exists to remove any errors. Hence, uncertainties in the initial conditions prior to deorbit burn initiation, during deorbit burn and exo-atmospheric coast phases, and during atmospheric flight impact the splashdown location. This investigation was undertaken to quantify the impact of the various exo-atmospheric and atmospheric uncertainties. Additionally, a Monte-Carlo analysis was performed to statistically assess the splashdown dispersion footprint caused by the multiple mission uncertainties. The Monte-Carlo analysis showed that a 3-sigma splashdown dispersion footprint with axes of 43.3 nm (long), -33.5 nm (short), and 10.0 nm (crossrange) can be constructed. A 58% probability exists that the Recovery Module will overshoot the nominal splashdown site.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, H; Chen, Z; Nath, R

    Purpose: kV fluoroscopic imaging combined with MV treatment beam imaging has been investigated for intrafractional motion monitoring and correction. It is, however, subject to additional kV imaging dose to normal tissue. To balance tracking accuracy and imaging dose, we previously proposed an adaptive imaging strategy to dynamically decide future imaging type and moments based on motion tracking uncertainty. kV imaging may be used continuously for maximal accuracy, or only when the position uncertainty (probability of exceeding a threshold) is high, if a preset imaging dose limit is considered. In this work, we propose more accurate methods to estimate tracking uncertainty through analyzing acquired data in real-time. Methods: We simulated the motion tracking process based on a previously developed imaging framework (MV + initial seconds of kV imaging) using real-time breathing data from 42 patients. Motion tracking errors for each time point were collected together with the time point’s corresponding features, such as tumor motion speed and 2D tracking error of previous time points. We tested three methods for error uncertainty estimation based on the features: conditional probability distribution, logistic regression modeling, and support vector machine (SVM) classification to detect errors exceeding a threshold. Results: For the conditional probability distribution, polynomial regressions on three features (previous tracking error, prediction quality, and cosine of the angle between the trajectory and the treatment beam) showed strong correlation with the variation (uncertainty) of the mean 3D tracking error and its standard deviation: R-square = 0.94 and 0.90, respectively. The logistic regression and SVM classification successfully identified about 95% of tracking errors exceeding the 2.5 mm threshold. 
    Conclusion: The proposed methods can reliably estimate the motion tracking uncertainty in real-time, which can be used to guide adaptive additional imaging to confirm the tumor is within the margin or initialize motion compensation if it is out of the margin.
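    A logistic-regression classifier of the kind tested can be sketched in a few lines (synthetic features and a hand-made exceedance rule stand in for the patient data; this is not the authors' trained model):

```python
import math, random

# Sketch: logistic regression trained by plain gradient descent to flag
# time points whose tracking error likely exceeds a threshold, using two
# illustrative features (previous error, tumor motion speed).

def train_logistic(X, y, lr=0.1, epochs=500):
    w = [0.0] * (len(X[0]) + 1)               # feature weights + bias
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[-1] + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))    # predicted exceedance prob.
            g = p - yi                        # gradient of the log-loss
            for j, xj in enumerate(xi):
                w[j] -= lr * g * xj
            w[-1] -= lr * g
    return w

def predict(w, xi):
    z = w[-1] + sum(wj * xj for wj, xj in zip(w, xi))
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
# Synthetic rule: error exceeds the threshold when prev_error + speed > 2
X = [[random.uniform(0, 2), random.uniform(0, 2)] for _ in range(200)]
y = [1 if x0 + x1 > 2 else 0 for x0, x1 in X]
w = train_logistic(X, y)
acc = sum((predict(w, xi) > 0.5) == bool(yi) for xi, yi in zip(X, y)) / len(X)
assert acc > 0.9
```

    The model outputs a probability of threshold exceedance per time point, which is the quantity the adaptive strategy uses to decide when an extra kV image is worth its dose.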

  12. Personal Conflict Impairs Performance on an Unrelated Self-Control Task: Lingering Costs of Uncertainty and Conflict.

    PubMed

    Alquist, Jessica L; Baumeister, Roy F; McGregor, Ian; Core, Tammy J; Benjamin, Ilil; Tice, Dianne M

    2018-01-01

    People have the ability to make important choices in their lives, but deliberating about these choices can have costs. The present study was designed to test the hypothesis that writing about conflicted personal goals and values (conflict condition) would impair self-control on an unrelated subsequent task as compared to writing about clear personal goals and values (clarity condition). Personal conflict activates the behavioral inhibition system (BIS; Hirsh, Mar, & Peterson, 2012), which may make it harder for participants to successfully execute self-control. In this large (N = 337), pre-registered study, participants in the conflict condition performed worse on anagrams than participants in the clarity condition, and the effect of condition on anagram performance was mediated by a subjective uncertainty measure of BIS activation. This suggests that BIS activation leads to poor self-control. Moreover, given that conflict is inherent in the exercise of self-control, results point to BIS activation as a mechanism for why initial acts of self-control impair self-control on subsequent, unrelated tasks.

  13. Governing Laws of Complex System Predictability under Co-evolving Uncertainty Sources: Theory and Nonlinear Geophysical Applications

    NASA Astrophysics Data System (ADS)

    Perdigão, R. A. P.

    2017-12-01

    Predictability assessments are traditionally made on a case-by-case basis, often by running the particular model of interest with randomly perturbed initial/boundary conditions and parameters, producing computationally expensive ensembles. These approaches provide a lumped statistical view of uncertainty evolution, without eliciting the fundamental processes and interactions at play in the uncertainty dynamics. In order to address these limitations, we introduce a systematic dynamical framework for predictability assessment and forecasting, analytically deriving governing equations of predictability in terms of the fundamental architecture of dynamical systems, independent of any particular problem under consideration. The framework further relates multiple uncertainty sources along with their coevolutionary interplay, enabling a comprehensive and explicit treatment of uncertainty dynamics over time without requiring the actual model to be run. In doing so, computational resources are freed and a quick and effective a priori systematic evaluation is made of the evolution of predictability and its challenges, including aspects of the model architecture and intervening variables that may require optimization before initiating any model runs. It further brings out universal features of the error dynamics that are elusive to any case-specific treatment, ultimately shedding fundamental light on the challenging issue of predictability. The formulated approach, framed with broad mathematical-physics generality in mind, is then implemented in dynamic models of nonlinear geophysical systems with various degrees of complexity, in order to evaluate their limitations and provide informed assistance on how to optimize their design and improve their predictability in fundamental dynamical terms.
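    Under standard tangent-linear assumptions, governing equations of this kind take a generic form (a textbook result given here for orientation, not necessarily the authors' formulation):

```latex
% Linearized error dynamics about a trajectory x(t) of \dot{x} = f(x):
\dot{\delta x} = J(x(t))\, \delta x,
\qquad J = \partial f / \partial x ,
% and the corresponding evolution of the error covariance P, with Q the
% covariance of model and forcing noise:
\dot{P} = J P + P J^{\mathsf{T}} + Q .
```

    Equations of this type can be integrated directly for the uncertainty itself, which is what allows predictability to be assessed without running perturbed-ensemble experiments of the full model.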

  14. Practical implementation of a particle filter data assimilation approach to estimate initial hydrologic conditions and initialize medium-range streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Clark, E.; Wood, A.; Nijssen, B.; Newman, A. J.; Mendoza, P. A.

    2016-12-01

    The System for Hydrometeorological Applications, Research and Prediction (SHARP), developed at the National Center for Atmospheric Research (NCAR), University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation, is a fully automated ensemble prediction system for short-term to seasonal applications. It incorporates uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 plausible temperature and precipitation time series through the Sacramento/Snow-17 model. The forcing ensemble explicitly accounts for measurement and interpolation uncertainties in the development of gridded meteorological forcing time series. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. To select the IHCs that are most consistent with the observations, we employ a particle filter (PF) that weights IHC ensemble members based on observations of streamflow and SWE. These particles are then used to initialize ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS), generating a streamflow forecast ensemble. We test this method in two basins in the Pacific Northwest that are important for water resources management: 1) the Green River upstream of Howard Hanson Dam, and 2) the South Fork Flathead River upstream of Hungry Horse Dam. The first of these is characterized by mixed snow and rain, while the second is snow-dominated. The PF-based forecasts are compared to forecasts based on 1) a single IHC (corresponding to median streamflow) paired with the full GEFS ensemble, and 2) the full IHC ensemble, without filtering, paired with the full GEFS ensemble.
In addition to assessing improvements in the spread of IHCs, we perform a hindcast experiment to evaluate the utility of PF-based data assimilation on streamflow forecasts at 1- to 7-day lead times.
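    The particle-filter update at the core of this approach can be sketched as follows (states, flows, and the Gaussian likelihood are illustrative, not SHARP's actual configuration):

```python
import math, random

# Sketch of a particle-filter update: each particle is a candidate
# initial hydrologic condition; its weight comes from how well the
# streamflow it implies matches the observation.

def pf_weights(simulated, obs, sigma):
    """Normalized Gaussian-likelihood weights for each particle."""
    w = [math.exp(-0.5 * ((s - obs) / sigma) ** 2) for s in simulated]
    total = sum(w)
    return [wi / total for wi in w]

random.seed(1)
states = [5.0, 10.0, 50.0]   # candidate SWE states (illustrative, mm)
flows = [4.0, 11.0, 45.0]    # streamflow each state would simulate
weights = pf_weights(flows, obs=10.0, sigma=2.0)
assert max(range(len(weights)), key=lambda i: weights[i]) == 1

# Multinomial resampling duplicates likely states and drops unlikely ones:
posterior = random.choices(states, weights=weights, k=100)
assert posterior.count(10.0) > 50
```

    The resampled particles then serve as the initial conditions for the GEFS-driven streamflow forecast ensemble, concentrating the forecast on states consistent with observed streamflow and SWE.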

  15. Uncertainty and dispersion in air parcel trajectories near the tropical tropopause

    NASA Astrophysics Data System (ADS)

    Bergman, John; Jensen, Eric; Pfister, Leonhard; Bui, Thoapaul

    2016-04-01

    The Tropical Tropopause Layer (TTL) is important as the gateway to the stratosphere for chemical constituents produced at the Earth's surface. As such, understanding the processes that transport air through the upper tropical troposphere is important for a number of current scientific issues such as the impact of stratospheric water vapor on the global radiative budget and the depletion of ozone by both anthropogenically- and naturally-produced halocarbons. Compared to the lower troposphere, transport in the TTL is relatively unaffected by turbulent motion. Consequently, Lagrangian particle models are thought to provide reasonable estimates of parcel pathways through the TTL. However, there are complications that make trajectory analyses difficult to interpret; uncertainty in the wind data used to drive these calculations and trajectory dispersion being among the most important. These issues are examined using ensembles of backward air parcel trajectories that are initially tightly grouped near the tropical tropopause, using three approaches: a Monte Carlo ensemble, in which different members use identical resolved wind fluctuations but different realizations of stochastic, multi-fractal simulations of unresolved winds; perturbed initial-location ensembles, in which members use identical resolved wind fields but initial locations are displaced 2° in latitude and longitude; and a multi-model ensemble that uses identical initial conditions but different resolved wind fields and/or trajectory formulations. Comparisons among the approaches distinguish, to some degree, physical dispersion from that due to data uncertainty, and the impact of unresolved wind fluctuations from that of resolved variability.
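    The Monte Carlo approach can be illustrated with a one-dimensional toy advection model (the resolved wind and noise amplitude are made up), showing how stochastic unresolved winds disperse an initially tight parcel ensemble:

```python
import random, statistics

# Toy version of the Monte Carlo ensemble described above: every parcel
# feels the same resolved wind plus an independent stochastic perturbation
# standing in for unresolved winds; dispersion is the growth of the
# ensemble's positional spread over time.

def advect(x0, resolved_u, noise_sigma, steps, dt, rng):
    x = x0
    track = [x]
    for _ in range(steps):
        u = resolved_u + rng.gauss(0.0, noise_sigma)  # unresolved gust
        x += u * dt
        track.append(x)
    return track

rng = random.Random(42)
tracks = [advect(0.0, resolved_u=10.0, noise_sigma=1.0,
                 steps=100, dt=1.0, rng=rng) for _ in range(200)]
spread = [statistics.stdev(t[i] for t in tracks) for i in (1, 100)]
# Random-walk dispersion: ensemble spread grows with time (~sqrt(t) here)
assert spread[1] > spread[0]
```

    Comparing this purely stochastic spread with that of perturbed-initial-location or multi-model ensembles is what lets the study separate dispersion due to unresolved winds from dispersion due to data uncertainty.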

  16. Cloudy Windows: What GCM Ensembles, Reanalyses and Observations Tell Us About Uncertainty in Greenland's Future Climate and Surface Melting

    NASA Astrophysics Data System (ADS)

    Reusch, D. B.

    2016-12-01

    Any analysis that wants to use a GCM-based scenario of future climate benefits from knowing how much uncertainty the GCM's inherent variability adds to the development of climate change predictions. This is especially relevant in the polar regions due to the potential of global impacts (e.g., sea level rise) from local (ice sheet) climate changes such as more frequent/intense surface melting. High-resolution, regional-scale models using GCMs for boundary/initial conditions in future scenarios inherit a measure of GCM-derived, externally driven uncertainty. We investigate these uncertainties for the Greenland ice sheet using the 30-member CESM1.0-CAM5-BGC Large Ensemble (CESMLE) for recent (1981-2000) and future (2081-2100, RCP 8.5) decades. Recent simulations are skill-tested against the ERA-Interim reanalysis and AWS observations, with results informing future scenarios. We focus on key variables influencing surface melting through decadal climatologies, nonlinear analysis of variability with self-organizing maps (SOMs), regional-scale modeling (Polar WRF), and simple melt models. Relative to the ensemble average, spatially averaged climatological July temperature anomalies over a Greenland ice-sheet/ocean domain are mostly between +/- 0.2 °C. The spatial average hides larger local anomalies of up to +/- 2 °C. The ensemble average itself is 2 °C cooler than ERA-Interim. SOMs extend our diagnostics by providing a concise, objective summary of model variability as a set of generalized patterns. For CESMLE, the SOM patterns summarize the variability of multiple realizations of climate. Changes in pattern frequency by ensemble member show the influence of initial conditions. For example, basic statistical analysis of pattern frequency yields interquartile ranges of 2-4% for individual patterns across the ensemble. 
In climate terms, this tells us about climate state variability through the range of the ensemble, a potentially significant source of melt-prediction uncertainty. SOMs can also capture the different trajectories of climate due to intramodel variability over time. Polar WRF provides higher resolution regional modeling with improved, polar-centric model physics. Simple melt models allow us to characterize impacts of the upstream uncertainties on estimates of surface melting.

  17. Biogenic isoprene emissions driven by regional weather predictions using different initialization methods: case studies during the SEAC4RS and DISCOVER-AQ airborne campaigns

    NASA Astrophysics Data System (ADS)

    Huang, Min; Carmichael, Gregory R.; Crawford, James H.; Wisthaler, Armin; Zhan, Xiwu; Hain, Christopher R.; Lee, Pius; Guenther, Alex B.

    2017-08-01

    Land and atmospheric initial conditions of the Weather Research and Forecasting (WRF) model are often interpolated from the output of a different model. We perform case studies during NASA's SEAC4RS and DISCOVER-AQ Houston airborne campaigns, demonstrating that using land initial conditions directly downscaled from a coarser-resolution dataset led to significant positive biases in the coupled NASA-Unified WRF (NUWRF, version 7) surface and near-surface air temperature and planetary boundary layer height (PBLH) around the Missouri Ozarks and Houston, Texas, as well as poorly partitioned latent and sensible heat fluxes. Replacing the land initial conditions with the output of a long-term offline Land Information System (LIS) simulation can effectively reduce the positive biases in NUWRF surface air temperature by ˜ 2 °C. We also show that the LIS land initialization can modify surface air temperature errors almost 10 times as effectively as applying a different atmospheric initialization method. The LIS-NUWRF-based isoprene emission calculations by the Model of Emissions of Gases and Aerosols from Nature (MEGAN, version 2.1) are at least 20 % lower than those computed using the NUWRF run initialized with the coarser-resolution data, and are closer to aircraft-observation-derived emissions. Higher-resolution MEGAN calculations are prone to amplified discrepancies with aircraft-observation-derived emissions on small scales, possibly a result of limitations of MEGAN's parameterization and uncertainty in its inputs on small scales, as well as of representation error and the neglect of horizontal transport in deriving emissions from aircraft data. This study emphasizes the importance of proper land initialization to coupled atmospheric weather modeling and the follow-on emission modeling. We anticipate that it is also critical to accurately representing other processes included in air quality modeling and chemical data assimilation.
    Having more confidence in the weather inputs is also beneficial for determining and quantifying the other sources of uncertainty (e.g., parameterization, other input data) in the models that they drive.

  18. Calibration of decadal ensemble predictions

    NASA Astrophysics Data System (ADS)

    Pasternack, Alexander; Rust, Henning W.; Bhend, Jonas; Liniger, Mark; Grieger, Jens; Müller, Wolfgang; Ulbrich, Uwe

    2017-04-01

    Decadal climate predictions are of great socio-economic interest because of the corresponding planning horizons of several political and economic decisions. Because of the uncertainties of weather and climate forecasts (e.g. initial condition uncertainty), they are issued in a probabilistic way. One issue frequently observed for probabilistic forecasts is that they tend not to be reliable, i.e. the forecast probabilities are not consistent with the relative frequency of the associated observed events. Thus, these kinds of forecasts need to be re-calibrated. While re-calibration methods for seasonal time scales are available and frequently applied, these methods still have to be adapted to decadal time scales and their characteristic problems, such as the climate trend and lead-time-dependent bias. We therefore propose a method to re-calibrate decadal ensemble predictions that takes the above-mentioned characteristics into account. Finally, this method is applied to and validated on decadal forecasts from the MiKlip system (Germany's initiative for decadal prediction).
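
    A minimal sketch of the kind of lead-time-dependent bias removal such a re-calibration must handle. The hindcast dimensions and the linear drift below are invented for illustration; the MiKlip method itself is more elaborate, also treating the climate trend and the ensemble spread:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hindcast archive: 20 start years x 10 lead years x 15 members.
n_starts, n_leads, n_members = 20, 10, 15
obs = rng.normal(14.0, 0.5, size=(n_starts, n_leads))   # "observed" values
drift = 0.08 * np.arange(n_leads)                       # lead-time dependent bias
fcst = (obs[..., None] + drift[None, :, None]
        + rng.normal(0, 0.3, size=(n_starts, n_leads, n_members)))

# Lead-time dependent mean bias, estimated over start dates and members.
bias = fcst.mean(axis=(0, 2)) - obs.mean(axis=0)        # shape (n_leads,)

# Re-calibrated ensemble: subtract the lead-time dependent bias.
fcst_cal = fcst - bias[None, :, None]
```

    After the correction, the ensemble-mean error no longer grows with lead time, which is the minimal reliability requirement before any spread adjustment.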

  19. What is needed to make low-density lipoprotein transport in human aorta computational models suitable to explore links to atherosclerosis? Impact of initial and inflow boundary conditions.

    PubMed

    De Nisco, Giuseppe; Zhang, Peng; Calò, Karol; Liu, Xiao; Ponzini, Raffaele; Bignardi, Cristina; Rizzo, Giovanna; Deng, Xiaoyan; Gallo, Diego; Morbiducci, Umberto

    2018-02-08

    Personalized computational hemodynamics (CH) is a promising tool to clarify/predict the link between low-density lipoprotein (LDL) transport in the aorta, disturbed shear and atherogenesis. However, CH uses simplifying assumptions that represent sources of uncertainty. In particular, modelling blood-side to wall LDL transfer is challenged by the cumbersomeness of the protocols needed to obtain reliable LDL concentration profile estimations. This paucity of data is limiting the establishment of rigorous CH protocols able to balance the trade-offs between the variety of in vivo data to be acquired and the accuracy required by biological/clinical applications. In this study, we analyze the impact of LDL concentration initialization (initial conditions, ICs) and inflow boundary conditions (BCs) on CH models of LDL blood-to-wall transfer in the aorta. Technically, in an image-based model of a human aorta, two different inflow BCs are generated by imposing either subject-specific 3D PC-MRI-measured or idealized (flat) inflow velocity profiles. For each simulated BC, four different ICs for LDL concentration are applied, imposing as IC either the LDL distribution resulting from steady-state simulations with average conditions, or constant LDL concentration values. Based on the CH results, we conclude that: (1) the imposition of realistic 3D velocity profiles as inflow BC reduces the uncertainty affecting the representation of LDL transfer; (2) different LDL concentration ICs lead to markedly different patterns of LDL transfer. Given that it is not possible to verify in vivo the proper LDL concentration initialization to be applied, we suggest carefully setting and unambiguously declaring the imposed BCs and LDL concentration IC when modelling LDL transfer in the aorta, in order to obtain reproducible and ultimately comparable results among different laboratories. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. A Comparison of Perturbed Initial Conditions and Multiphysics Ensembles in a Severe Weather Episode in Spain

    NASA Technical Reports Server (NTRS)

    Tapiador, Francisco; Tao, Wei-Kuo; Angelis, Carlos F.; Martinez, Miguel A.; Marcos, Cecilia; Rodriguez, Antonio; Hou, Arthur; Shi, Jainn-Jong

    2012-01-01

    Ensembles of numerical model forecasts are of interest to operational early-warning forecasters because the spread of the ensemble provides an indication of the uncertainty of the alerts, and the ensemble mean is deemed to outperform the forecasts of the individual models. This paper explores two ensembles for a severe weather episode in Spain, aiming to ascertain the relative usefulness of each. One ensemble uses sensible choices of physical parameterizations (precipitation microphysics, land surface physics, and cumulus physics), while the other follows a perturbed initial conditions approach. The results show that, depending on the parameterizations, large differences can be expected in terms of storm location, spatial structure of the precipitation field, and rain intensity. It is also found that the spread of the perturbed initial conditions ensemble is smaller than the dispersion due to physical parameterizations. This confirms that in severe weather situations operational forecasts should address moist-physics deficiencies, in addition to optimizing initial conditions, to realize the full benefits of the ensemble approach. The results also provide insights into differences in simulations arising from ensembles of weather models using several combinations of different physical parameterizations.
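
    The spread comparison can be illustrated with synthetic precipitation fields. The member counts and noise levels below are invented to mimic the reported finding (larger multiphysics dispersion), not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy 24-h accumulated precipitation fields (mm) on a small grid, for a
# perturbed-initial-conditions ensemble and a multiphysics ensemble.
ny, nx = 40, 40
base = rng.gamma(2.0, 5.0, size=(ny, nx))
pic = base[None] + rng.normal(0, 1.0, size=(8, ny, nx))  # perturbed ICs
mp = base[None] + rng.normal(0, 3.0, size=(8, ny, nx))   # multiphysics

# Domain-mean ensemble spread: std across members, averaged over grid points.
spread_pic = pic.std(axis=0, ddof=1).mean()
spread_mp = mp.std(axis=0, ddof=1).mean()
print(spread_pic, spread_mp)
```

    With these assumed noise levels the multiphysics spread dominates, which is the situation the paper reports for this episode.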

  1. Uncertainty quantification for discrimination of nuclear events as violations of the comprehensive nuclear-test-ban treaty.

    PubMed

    Sloan, Jamison; Sun, Yunwei; Carrigan, Charles

    2016-05-01

    Enforcement of the Comprehensive Nuclear Test Ban Treaty (CTBT) will involve monitoring for radiologic indicators of underground nuclear explosions (UNEs). A UNE produces a variety of radioisotopes which then decay through connected radionuclide chains. A particular species of interest is xenon, namely the four isotopes (131m)Xe, (133m)Xe, (133)Xe, and (135)Xe. Due to their half-lives, some of these isotopes can persist in the subsurface for more than 100 days. This convenient timescale, combined with modern detection capabilities, makes the xenon family a desirable candidate for UNE detection. Ratios of these isotopes as a function of time have been studied in the past for distinguishing nuclear explosions from civilian nuclear applications. However, the initial yields from UNEs have been treated as fixed values; in reality, these independent yields are uncertain to a large degree. This study quantifies the uncertainty in xenon ratios that results from these uncertain initial conditions, to better bound the values that xenon ratios can assume. We have successfully used a combination of analytical and sampling-based statistical methods to reliably bound xenon isotopic ratios. We have also conducted a sensitivity analysis and found that xenon isotopic ratios are primarily sensitive to only a few of the many uncertain initial conditions. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
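
    A toy Monte Carlo propagation of uncertain initial yields into an isotopic ratio, in the spirit of the sampling-based approach. The half-lives are physical constants; the nominal yields, the factor-of-2 lognormal spread, and the neglect of in-growth from iodine precursors are all simplifying assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Half-lives in days (physical constants).
half_life = {"Xe-131m": 11.93, "Xe-133": 5.24}
lam = {k: np.log(2) / v for k, v in half_life.items()}

# Hypothetical uncertain independent yields (relative atom numbers):
# nominal values with a factor-of-2 lognormal spread stand in for the
# poorly known UNE initial conditions.
n = 100_000
N0 = {
    "Xe-131m": rng.lognormal(np.log(1.0), np.log(2.0), n),
    "Xe-133": rng.lognormal(np.log(5.0), np.log(2.0), n),
}

t = 30.0  # days after the event
ratio = (N0["Xe-133"] * np.exp(-lam["Xe-133"] * t)
         / (N0["Xe-131m"] * np.exp(-lam["Xe-131m"] * t)))

lo, hi = np.percentile(ratio, [2.5, 97.5])
print(f"95% bounds on Xe-133/Xe-131m at t = 30 d: [{lo:.3f}, {hi:.3f}]")
```

    Even this simplified sampling shows how a fixed-yield ratio becomes an interval once yield uncertainty is admitted; the paper additionally treats the coupled decay chains analytically.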

  2. Robust Landing Using Time-to-Collision Measurement with Actuator Saturation

    NASA Technical Reports Server (NTRS)

    Kuwata, Yoshiaki; Matthies, Larry

    2009-01-01

    This paper considers a landing problem for an MAV that uses only a monocular camera for guidance. Although this sensor cannot measure the absolute distance to the target, the time-to-collision with the target can be obtained using optical flow algorithms. Existing work has applied simple proportional feedback control to simple dynamics and demonstrated its potential. However, due to the singularity in the time-to-collision measurement near the target, this feedback could require an infinite control action. This paper extends the approach to nonlinear dynamics. In particular, we explicitly consider the saturation of the actuator and include the effect of aerodynamic drag. It is shown that convergence to the target is guaranteed from a set of initial conditions, and the boundaries of this set in the state space are obtained numerically. The paper then introduces parametric uncertainties in the vehicle model and in the time-to-collision measurements. Using an argument similar to the nominal case, robust convergence to the target is proven, but the region of attraction is shown to shrink due to the uncertainties. Numerical simulation validates these theoretical results.

  3. Accuracy of available methods for quantifying the heat power generation of nanoparticles for magnetic hyperthermia.

    PubMed

    Andreu, Irene; Natividad, Eva

    2013-12-01

    In magnetic hyperthermia, characterising the specific functionality of magnetic nanoparticle arrangements is essential to plan therapies by simulating maximum achievable temperatures. This functionality, i.e. the heat power released upon application of an alternating magnetic field, is quantified by means of the specific absorption rate (SAR), also referred to as specific loss power (SLP). Many research groups are currently involved in the SAR/SLP determination of newly synthesised materials by several methods, either magnetic or calorimetric, some of which are affected by important and unquantifiable uncertainties that may turn measurements into rough estimates. This paper reviews all these methods, discussing in particular sources of uncertainty, as well as their possible minimisation. In general, magnetic methods, although accurate, do not operate in the conditions of magnetic hyperthermia. Calorimetric methods do, but the easiest to implement, the initial-slope method in isoperibol conditions, suffers from inaccuracies arising from mismatches between thermal models, experimental set-ups and measuring conditions, while the most accurate, the pulse-heating method in adiabatic conditions, requires more complex set-ups.
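
    The initial-slope method mentioned above estimates SAR = (C/m_np) (dT/dt) evaluated as t -> 0, fitting the early part of the heating curve where losses to the surroundings are still small. A sketch with an invented exponential heating curve; all numerical values are illustrative, not from the paper:

```python
import numpy as np

# Initial-slope estimate of SAR in isoperibol conditions:
#   SAR = (C / m_np) * (dT/dt)|_{t -> 0}
c_p = 4.18          # specific heat of the suspension, J/(g K) (water-like)
m_sample = 1.0      # suspension mass, g
m_np = 0.005        # nanoparticle mass in the sample, g

t = np.linspace(0, 30, 61)                    # s
tau, dT_max = 120.0, 6.0                      # synthetic heating-curve parameters
T = 25.0 + dT_max * (1 - np.exp(-t / tau))    # deg C

# Linear fit restricted to the first few seconds, where losses are small.
early = t <= 5.0
slope = np.polyfit(t[early], T[early], 1)[0]  # K/s

sar = c_p * m_sample / m_np * slope           # W per gram of nanoparticles
print(f"SAR ~ {sar:.1f} W/g")
```

    The choice of fitting window is exactly the kind of unquantified decision the review flags: a window that is too long underestimates the true initial slope once losses set in.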

  4. Monitoring and Research of the Colorado River Ecosystem: When Is Enough Enough?

    NASA Astrophysics Data System (ADS)

    Schmidt, J. C.

    2014-12-01

    The Glen Canyon Dam Adaptive Management Program (GCDAMP) is a well-funded (~$10 million/yr) river rehabilitation program with long-term monitoring and research focused on 400 km of the Colorado River in Glen, Marble, and Grand Canyons downstream from Lake Powell. More than 15 years of substantive science concerning hydrology, hydraulics, sediment transport, geomorphology, aquatic and fish ecology, riparian ecology, and socio-economics has yielded significant insights that guide experimental river management initiatives, such as a new protocol to annually release sediment-triggered controlled floods, administratively called the High Flow Experimental Protocol (HFEP). Implementation of the HFEP requires near-real-time monitoring of sediment delivery from key sand-producing tributaries, of sand transport in and the calculated sand mass balance of segments of the Colorado River, and of the defined uncertainty of those processes and conditions (see: http://www.gcmrc.gov/). The HFEP aims to rebuild sandbars within the active channel, but many stakeholders remain focused on other aquatic ecosystem, riparian ecosystem, archaeological resource, or cultural values that are linked in complex ways to active channel conditions. Tension exists within the GCDAMP about how funding is allocated for innovative data collection, analysis, and publication strategies that allow implementation of the HFEP, and to also measure derivative resource conditions about which some stakeholders have concern. Monitoring and research initiatives that attempt to incorporate traditional cultural values also have high uncertainty when resource condition is linked with the simple implementation paradigm of the HFEP.
Thus, the GCDAMP is faced with the complex challenge of allocating sufficient resources to monitor active channel processes and characteristics, resolve remaining scientific uncertainties, and develop new strategies for incorporating science insights into engineering and policy decisions, while also monitoring terrestrial resources supported by stakeholders but only indirectly linked with dam operations. The challenge of balancing these scientific and adaptive management objectives is substantial.

  5. The critical role of uncertainty in projections of hydrological extremes

    NASA Astrophysics Data System (ADS)

    Meresa, Hadush K.; Romanowicz, Renata J.

    2017-08-01

    This paper aims to quantify the uncertainty in projections of future hydrological extremes in the Biala Tarnowska River at the Koszyce gauging station, south Poland. The approach followed is based on several climate projections obtained from the EURO-CORDEX initiative, raw and bias-corrected realizations of catchment precipitation, and flow simulations derived using multiple hydrological model parameter sets. The projections cover the 21st century. Three sources of uncertainty are considered: the first related to the climate projection ensemble spread, the second related to the uncertainty in hydrological model parameters, and the third related to the error in fitting theoretical distribution models to annual extreme flow series. The uncertainty of projected extreme indices related to hydrological model parameters was conditioned on flow observations from the reference period using the generalized likelihood uncertainty estimation (GLUE) approach, with separate criteria for high- and low-flow extremes. Extreme (low and high) flow quantiles were estimated using the generalized extreme value (GEV) distribution at different return periods and were based on two different lengths of the flow time series. A sensitivity analysis based on the analysis of variance (ANOVA) shows that the uncertainty introduced by the hydrological model parameters can be larger than the climate model variability and the distribution fit uncertainty for the low-flow extremes, whilst for the high-flow extremes higher uncertainty is observed from climate models than from hydrological parameter and distribution fit uncertainties. This implies that ignoring any one of the three uncertainty sources may pose a great risk to adaptation to future hydrological extremes and to water resource planning and management.
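
    Estimating a T-year return level from a fitted GEV amounts to evaluating the (1 - 1/T) quantile of the annual-maximum distribution. A sketch with synthetic annual maxima; the distribution parameters are invented, standing in for one hydrological-model realization:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(3)

# Synthetic series of 60 annual maximum flows (m^3/s).
ann_max = genextreme.rvs(c=-0.1, loc=100.0, scale=30.0, size=60,
                         random_state=rng)

# Fit the GEV; the T-year return level is the (1 - 1/T) quantile.
c, loc, scale = genextreme.fit(ann_max)
levels = {T: genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
          for T in (10, 50, 100)}
for T, q in levels.items():
    print(f"{T:>3}-year return level: {q:.1f} m^3/s")
```

    Repeating this fit over each behavioural parameter set (GLUE) and each climate realization is what produces the per-source uncertainty ranges that the ANOVA then decomposes.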

  6. Development of a Prototype Model-Form Uncertainty Knowledge Base

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form refers to the fact that, among the choices to be made during a design analysis, there are different forms of the analysis process, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structural analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations of the KB and possible workarounds are explained.

  7. Quantifying Uncertainty in Model Predictions for the Pliocene (Plio-QUMP): Initial results

    USGS Publications Warehouse

    Pope, J.O.; Collins, M.; Haywood, A.M.; Dowsett, H.J.; Hunter, S.J.; Lunt, D.J.; Pickering, S.J.; Pound, M.J.

    2011-01-01

    Examination of the mid-Pliocene Warm Period (mPWP; ~3.3 to 3.0 Ma BP) provides an excellent opportunity to test the ability of climate models to reproduce warm climate states, thereby assessing our confidence in model predictions. To do this it is necessary to relate the uncertainty in model simulations of mPWP climate to uncertainties in projections of future climate change. The uncertainties introduced by the model can be estimated through the use of a Perturbed Physics Ensemble (PPE). Building on the UK Met Office Quantifying Uncertainty in Model Predictions (QUMP) Project, this paper presents the results from an initial investigation using the end members of a PPE in a fully coupled atmosphere-ocean model (HadCM3) running with appropriate mPWP boundary conditions. Prior work has shown that the unperturbed version of HadCM3 may underestimate mPWP sea surface temperatures at higher latitudes. Initial results indicate that neither the low-sensitivity nor the high-sensitivity simulations produce unequivocally improved mPWP climatology relative to the standard. Whilst the high-sensitivity simulation was able to reconcile up to 6 °C of the data/model mismatch in sea surface temperatures in the high latitudes of the Northern Hemisphere (relative to the standard simulation), it did not produce a better prediction of global vegetation than the standard simulation. Overall the low-sensitivity simulation was degraded compared to the standard and high-sensitivity simulations in all aspects of the data/model comparison. The results have shown that a PPE has the potential to explore weaknesses in mPWP modelling simulations which have been identified by geological proxies, but that a 'best fit' simulation will more likely come from a full ensemble in which simulations that combine the strengths of the two end-member simulations shown here are included. © 2011 Elsevier B.V.

  8. Design and results of the ice sheet model initialisation experiments initMIP-Greenland: an ISMIP6 intercomparison

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin; Beckley, Matthew; Abe-Ouchi, Ayako; Aschwanden, Andy; Calov, Reinhard; Gagliardini, Olivier; Gillet-Chaulet, Fabien; Golledge, Nicholas R.; Gregory, Jonathan; Greve, Ralf; Humbert, Angelika; Huybrechts, Philippe; Kennedy, Joseph H.; Larour, Eric; Lipscomb, William H.; Le clec'h, Sébastien; Lee, Victoria; Morlighem, Mathieu; Pattyn, Frank; Payne, Antony J.; Rodehacke, Christian; Rückamp, Martin; Saito, Fuyuki; Schlegel, Nicole; Seroussi, Helene; Shepherd, Andrew; Sun, Sainan; van de Wal, Roderik; Ziemen, Florian A.

    2018-04-01

    Earlier large-scale Greenland ice sheet sea-level projections (e.g. those run during the ice2sea and SeaRISE initiatives) have shown that ice sheet initial conditions have a large effect on the projections and give rise to important uncertainties. The goal of this initMIP-Greenland intercomparison exercise is to compare, evaluate, and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties in modelled mass changes. initMIP-Greenland is the first in a series of ice sheet model intercomparison activities within ISMIP6 (the Ice Sheet Model Intercomparison Project for CMIP6), which is the primary activity within the Coupled Model Intercomparison Project Phase 6 (CMIP6) focusing on the ice sheets. Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two idealised forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (forward run without additional forcing) and in response to a large perturbation (prescribed surface mass balance anomaly); they should not be interpreted as sea-level projections. We present and discuss results that highlight the diversity of data sets, boundary conditions, and initialisation techniques used in the community to generate initial states of the Greenland ice sheet. We find good agreement across the ensemble for the dynamic response to surface mass balance changes in areas where the simulated ice sheets overlap, but differences arise from the initial size of the ice sheet. The model drift in the control experiment is reduced for models that participated in earlier intercomparison exercises.

  11. Conditional uncertainty principle

    NASA Astrophysics Data System (ADS)

    Gour, Gilad; Grudka, Andrzej; Horodecki, Michał; Kłobus, Waldemar; Łodyga, Justyna; Narasimhachar, Varun

    2018-04-01

    We develop a general operational framework that formalizes the concept of conditional uncertainty in a measure-independent fashion. Our formalism is built upon a mathematical relation which we call conditional majorization. We define conditional majorization and, for the case of classical memory, we provide its thorough characterization in terms of monotones, i.e., functions that preserve the partial order under conditional majorization. We demonstrate the application of this framework by deriving two types of memory-assisted uncertainty relations, (1) a monotone-based conditional uncertainty relation and (2) a universal measure-independent conditional uncertainty relation, both of which set a lower bound on the minimal uncertainty that Bob has about Alice's pair of incompatible measurements, conditioned on arbitrary measurement that Bob makes on his own system. We next compare the obtained relations with their existing entropic counterparts and find that they are at least independent.
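
    For reference, conditional majorization generalizes the standard majorization preorder, which for probability vectors x, y in R^n (with the down-arrow denoting components sorted in decreasing order) reads:

```latex
% x is majorized by y, written x \prec y:
\sum_{i=1}^{k} x_i^{\downarrow} \;\le\; \sum_{i=1}^{k} y_i^{\downarrow}
\quad (k = 1, \dots, n-1),
\qquad
\sum_{i=1}^{n} x_i \;=\; \sum_{i=1}^{n} y_i .
% A monotone is any Schur-convex function f: x \prec y \implies f(x) \le f(y).
```

    This is the textbook (unconditional) definition, not the paper's conditional generalization; the monotones mentioned above play the analogous role for the conditional relation, preserving the partial order under conditional majorization.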

  12. The devil that we know: lead (Pb) replacement policies under conditions of scientific uncertainty

    NASA Technical Reports Server (NTRS)

    Ogunseitan, Dele; Schoenung, Julie; Saphores, Jean-Daniel; Shapiro, Andrew; Bhuie, Amrit; Kang, Hai-Yong; Nixon, Hilary; Stein, Antionette

    2003-01-01

    Engineering and economic considerations are typical driving forces behind the selection of specific chemicals used in the manufacture of consumer products. Only recently has post-consumer environmental impact become part of the major considerations during the initial phases of product design. Therefore, reactive, rather than proactive strategies have dominated the consideration of environmental and health issues in product design.

  13. Forensic Uncertainty Quantification of Explosive Dispersal of Particles

    NASA Astrophysics Data System (ADS)

    Hughes, Kyle; Park, Chanyoung; Haftka, Raphael; Kim, Nam-Ho

    2017-06-01

    In addition to the numerical challenges of simulating the explosive dispersal of particles, validation of the simulation is often plagued by poor knowledge of the experimental conditions. The level of experimental detail required for validation is beyond what is usually included in the literature. This presentation proposes the use of forensic uncertainty quantification (UQ) to investigate validation-quality experiments and discover possible sources of uncertainty that may have been missed in the initial design of experiments or under-reported. The authors' experience has been that valuable insights may be gained by approaching validation experiments as one would a crime scene investigation: one examines all the data and documentation provided by the validation experimentalists, corroborates evidence, and quantifies large sources of uncertainty a posteriori with empirical measurements. In addition, it is proposed that forensic UQ may benefit from an independent investigator, to help remove possible implicit biases and to increase the likelihood of discovering unrecognized uncertainty. Forensic UQ concepts will be discussed and then applied to a set of validation experiments performed at Eglin Air Force Base. This work was supported in part by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program.

  14. Translation of Land Surface Model Accuracy and Uncertainty into Coupled Land-Atmosphere Prediction

    NASA Technical Reports Server (NTRS)

    Santanello, Joseph A.; Kumar, Sujay; Peters-Lidard, Christa D.; Harrison, Kenneth W.; Zhou, Shuija

    2012-01-01

    Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface heat and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during the summers of extreme dry (2006) and wet (2007) land surface conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through the use of a new optimization and uncertainty estimation module in NASA's Land Information System (LIS-OPT/UE), whereby parameter sets are calibrated in the Noah land surface model and classified according to a land cover and soil type mapping of the observation sites to the full model domain. The impact of the calibrated parameters on (a) the spinup of the land surface used as initial conditions, and (b) the heat and moisture states and fluxes of the coupled WRF simulations is then assessed in terms of ambient weather and land-atmosphere coupling, along with measures of uncertainty propagation into the forecasts. In addition, the sensitivity of this approach to the period of calibration (dry, wet, average) is investigated. Finally, tradeoffs of computational tractability and scientific validity, and the potential for combining this approach with satellite remote sensing data, are also discussed.

  15. Direct Imaging of a Cold Jovian Exoplanet in Orbit around the Sun-Like Star GJ 504

    NASA Technical Reports Server (NTRS)

    Kuzuhara, M.; Tamura, M.; Kudo, T.; Janson, M.; Kandori, R.; Brandt, T. D.; Thalmann, C.; Spiegel, D.; Biller, B.; Carson, J.; et al.

    2013-01-01

    Several exoplanets have recently been imaged at wide separations of >10 AU from their parent stars. These span a limited range of ages (<50 Myr) and atmospheric properties, with temperatures of 800-1800 K and very red colors (J - H > 0.5 mag), implying thick cloud covers. Furthermore, substantial model uncertainties exist at these young ages due to the unknown initial conditions at formation, which can lead to an order of magnitude of uncertainty in the modeled planet mass. Here, we report the direct imaging discovery of a Jovian exoplanet around the Sun-like star GJ 504, detected as part of the SEEDS survey. The system is older than all other known directly-imaged planets; as a result, its estimated mass remains in the planetary regime independent of uncertainties related to choices of initial conditions in the exoplanet modeling. Using the most common exoplanet cooling model, and given the system age of 160(+350/-60) Myr, GJ 504 b has an estimated mass of 4(+4.5/-1.0) Jupiter masses, among the lowest of directly imaged planets. Its projected separation of 43.5 AU exceeds the typical outer boundary of approximately 30 AU predicted for the core accretion mechanism. GJ 504 b is also significantly cooler (510(+30/-20) K) and has a bluer color (J - H = -0.23 mag) than previously imaged exoplanets, suggesting a largely cloud-free atmosphere accessible to spectroscopic characterization. Thus, it has the potential of providing novel insights into the origins of giant planets, as well as their atmospheric properties.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuzuhara, M.; Tamura, M.; Kandori, R.

    Several exoplanets have recently been imaged at wide separations of >10 AU from their parent stars. These span a limited range of ages (<50 Myr) and atmospheric properties, with temperatures of 800-1800 K and very red colors (J - H > 0.5 mag), implying thick cloud covers. Furthermore, substantial model uncertainties exist at these young ages due to the unknown initial conditions at formation, which can lead to an order of magnitude of uncertainty in the modeled planet mass. Here, we report the direct-imaging discovery of a Jovian exoplanet around the Sun-like star GJ 504, detected as part of the SEEDS survey. The system is older than all other known directly imaged planets; as a result, its estimated mass remains in the planetary regime independent of uncertainties related to choices of initial conditions in the exoplanet modeling. Using the most common exoplanet cooling model, and given the system age of 160(+350/-60) Myr, GJ 504b has an estimated mass of 4(+4.5/-1.0) Jupiter masses, among the lowest of directly imaged planets. Its projected separation of 43.5 AU exceeds the typical outer boundary of approximately 30 AU predicted for the core accretion mechanism. GJ 504b is also significantly cooler (510(+30/-20) K) and has a bluer color (J - H = -0.23 mag) than previously imaged exoplanets, suggesting a largely cloud-free atmosphere accessible to spectroscopic characterization. Thus, it has the potential of providing novel insights into the origins of giant planets as well as their atmospheric properties.

  17. Possible sources of forecast errors generated by the global/regional assimilation and prediction system for landfalling tropical cyclones. Part I: Initial uncertainties

    NASA Astrophysics Data System (ADS)

    Zhou, Feifan; Yamaguchi, Munehiko; Qin, Xiaohao

    2016-07-01

    This paper investigates the possible sources of errors associated with tropical cyclone (TC) tracks forecasted using the Global/Regional Assimilation and Prediction System (GRAPES). The GRAPES forecasts were made for 16 landfalling TCs in the western North Pacific basin during the 2008 and 2009 seasons, with a forecast length of 72 hours, using the default initial conditions ("initials", hereafter), which are from the NCEP-FNL dataset, as well as ECMWF initials. The forecasts are compared with ECMWF forecasts. The results show that for most TCs, the GRAPES forecasts are improved when using the ECMWF initials rather than the default initials. Compared with the ECMWF initials, the default initials produce lower-intensity TCs and a lower-intensity subtropical high, but a higher-intensity South Asia high and monsoon trough, as well as a higher temperature but lower specific humidity at the TC center. Replacing the geopotential height and wind fields in and around the TC center at the initial time with the ECMWF initials was found to be the most efficient way to improve the forecasts. In addition, TCs that showed the greatest improvement in forecast accuracy usually had the largest initial uncertainties in TC intensity and were usually in the intensifying phase. The results demonstrate the importance of the initial intensity for TC track forecasts made using GRAPES, and indicate that the model is better at describing the intensifying phase than the decaying phase of TCs. Finally, the limited extent of the improvement indicates that model error may be the main cause of poor GRAPES forecasts of landfalling TCs. Thus, further examination of the model errors is required.
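The abstract does not spell out the error metric; track forecast error is conventionally the great-circle distance between the forecast TC center and the observed best-track position. A minimal haversine sketch (function name and coordinates are illustrative, not from the paper):

```python
import math

def track_error_km(lat_f, lon_f, lat_o, lon_o, radius_km=6371.0):
    """Great-circle (haversine) distance between a forecast TC center
    and the observed best-track position, in kilometers."""
    phi1, phi2 = math.radians(lat_f), math.radians(lat_o)
    dphi = math.radians(lat_o - lat_f)
    dlmb = math.radians(lon_o - lon_f)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * radius_km * math.asin(math.sqrt(a))

# One degree of latitude is roughly 111 km:
print(round(track_error_km(20.0, 130.0, 21.0, 130.0), 1))  # → 111.2
```

Averaging such distances over all cases at each lead time gives the mean track error curves typically used to compare initials.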

  18. Implementation ambiguity: The fifth element long lost in uncertainty budgets for land biogeochemical modeling

    NASA Astrophysics Data System (ADS)

    Tang, J.; Riley, W. J.

    2015-12-01

    Previous studies have identified four major sources of predictive uncertainty in modeling land biogeochemical (BGC) processes: (1) imperfect initial conditions (e.g., assumption of preindustrial equilibrium); (2) imperfect boundary conditions (e.g., climate forcing data); (3) parameterization (type I equifinality); and (4) model structure (type II equifinality). As if that were not enough to cause substantial sleep loss in modelers, we propose here a fifth element of uncertainty: implementation ambiguity, which occurs when the model's mathematical description is translated into computational code. We demonstrate implementation ambiguity using the example of nitrogen down-regulation, a necessary process in modeling carbon-climate feedbacks. We show that, depending on common land BGC model interpretations of the governing equations for mineral nitrogen, there are three different implementations of nitrogen down-regulation. We coded these three implementations in the ACME land model (ALM), and explored how they lead to different preindustrial and contemporary land biogeochemical states and fluxes. We also show how this implementation ambiguity can lead to different carbon-climate feedback estimates across the RCP scenarios. We conclude by suggesting how to avoid such implementation ambiguity in ESM BGC models.
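The three ALM implementations themselves are not reproduced here; the following toy (all function names and numbers are invented) illustrates the mechanism the abstract describes: two codings of the same governing constraint, that total uptake cannot exceed the mineral nitrogen pool, produce different biogeochemical states.

```python
def proportional(n_pool, plant_demand, microbe_demand):
    """Down-regulate all demands by a common factor when N is limiting."""
    total = plant_demand + microbe_demand
    f = min(1.0, n_pool / total)
    return plant_demand * f, microbe_demand * f

def plant_first(n_pool, plant_demand, microbe_demand):
    """Satisfy plant demand first; microbes get whatever remains."""
    plant = min(plant_demand, n_pool)
    microbe = min(microbe_demand, n_pool - plant)
    return plant, microbe

# Same governing equation ("uptake cannot exceed the mineral N pool"),
# two implementations, two different carbon-nitrogen states:
pp = proportional(1.0, 0.8, 0.8)
pf = plant_first(1.0, 0.8, 0.8)
print([round(v, 3) for v in pp])  # [0.5, 0.5]
print([round(v, 3) for v in pf])  # [0.8, 0.2]
```

Integrated over centuries of spinup, such per-time-step differences compound into the divergent states and feedback estimates the abstract reports.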

  19. Verification of an ensemble prediction system for storm surge forecast in the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Mel, Riccardo; Lionello, Piero

    2014-12-01

    In the Adriatic Sea, storm surges present a significant threat to Venice and to the flat coastal areas of the northern coast of the basin. Sea level forecast is of paramount importance for the management of daily activities and for operating the movable barriers that are presently being built for the protection of the city. In this paper, an EPS (ensemble prediction system) for operational forecasting of storm surge in the northern Adriatic Sea is presented and applied to a 3-month-long period (October-December 2010). The sea level EPS is based on the HYPSE (hydrostatic Padua Sea elevation) model, which is a standard single-layer nonlinear shallow water model, whose forcings (mean sea level pressure and surface wind fields) are provided by the ensemble members of the ECMWF (European Center for Medium-Range Weather Forecasts) EPS. Results are verified against observations at five tide gauges located along the Croatian and Italian coasts of the Adriatic Sea. Forecast uncertainty increases with the predicted value of the storm surge and with the forecast lead time. The EMF (ensemble mean forecast) provided by the EPS has a rms (root mean square) error lower than the DF (deterministic forecast), especially for short (up to 3 days) lead times. Uncertainty for short lead times of the forecast and for small storm surges is mainly caused by uncertainty of the initial condition of the hydrodynamical model. Uncertainty for large lead times and large storm surges is mainly caused by uncertainty in the meteorological forcings. The EPS spread increases with the rms error of the forecast. For large lead times the EPS spread and the forecast error substantially coincide. However, the EPS spread in this study, which does not account for uncertainty in the initial condition, underestimates the error during the early part of the forecast and for small storm surge values. On the contrary, it overestimates the rms error for large surge values. 
The PF (probability forecast) of the EPS has a clear skill in predicting the actual probability distribution of sea level, and it outperforms simple "dressed" PF methods. A probability estimate based on the single DF is shown to be inadequate. However, a PF obtained with a prescribed Gaussian distribution and centered on the DF value performs very similarly to the EPS-based PF.
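The two verification quantities the abstract compares, the rms error of the ensemble mean forecast (EMF) and the ensemble spread, can be sketched as follows on synthetic data (not HYPSE output; all values illustrative):

```python
import numpy as np

def emf_rmse(ens, obs):
    """RMS error of the ensemble-mean forecast.
    ens: (n_members, n_times), obs: (n_times,)."""
    return float(np.sqrt(np.mean((ens.mean(axis=0) - obs) ** 2)))

def ensemble_spread(ens):
    """RMS of member deviations about the ensemble mean."""
    return float(np.sqrt(np.mean(ens.var(axis=0, ddof=1))))

rng = np.random.default_rng(0)
obs = np.sin(np.linspace(0, 6, 200))               # "observed" surge proxy
ens = obs + rng.normal(0.0, 0.1, size=(50, 200))   # 50 perturbed members

# Averaging uncorrelated member errors reduces the EMF error below
# the spread of any single member:
print(emf_rmse(ens, obs) < ensemble_spread(ens))   # → True
```

In a well-calibrated system the spread should match the rms error of the ensemble mean; the abstract's finding of under-dispersion at short lead times is exactly a spread smaller than that error.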

  20. AMYGDALA MICROCIRCUITS CONTROLLING LEARNED FEAR

    PubMed Central

    Duvarci, Sevil; Pare, Denis

    2014-01-01

    We review recent work on the role of intrinsic amygdala networks in the regulation of classically conditioned defensive behaviors, commonly known as conditioned fear. These new developments highlight how conditioned fear depends on far more complex networks than initially envisioned. Indeed, multiple parallel inhibitory and excitatory circuits are differentially recruited during the expression versus extinction of conditioned fear. Moreover, shifts between expression and extinction circuits involve coordinated interactions with different regions of the medial prefrontal cortex. However, key areas of uncertainty remain, particularly with respect to the connectivity of the different cell types. Filling these gaps in our knowledge is important because much evidence indicates that human anxiety disorders result from an abnormal regulation of the networks supporting fear learning. PMID:24908482

  1. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  2. Fears, Uncertainties, and Hopes: Patient-Initiated Actions and Doctors’ Responses During Oncology Interviews*

    PubMed Central

    Beach, Wayne A.; Dozier, David M.

    2015-01-01

    New cancer patients frequently raise concerns about fears, uncertainties, and hopes during oncology interviews. This study sought to understand when and how patients raise their concerns, how doctors responded to these patient-initiated actions, and implications for communication satisfaction. A sub-sample of video-recorded and transcribed encounters was investigated, involving 44 new patients and 14 oncologists. Patients completed pre-post self-report measures about fears, uncertainties, and hopes as well as post-evaluations of interview satisfaction. Conversation Analysis (CA) was employed to initially identify pairs of patient-initiated and doctor-responsive actions. A coding scheme was subsequently developed, and two independent coding teams, each comprising two coders, reliably identified patient-initiated and doctor-responsive social actions. Interactional findings reveal that new cancer patients initiate actions much more frequently than previous research had identified, that concerns are usually raised indirectly, and with minimal emotion. Doctors tend to respond to these concerns immediately, but with even less affect, and rarely partner with patients. From pre-post results it was determined that the higher patients’ reported fears, the higher their post-visit fears and the lower their satisfaction. Patients with high uncertainty were highly proactive (e.g., asked more questions), yet reported even greater uncertainties following encounters. Hopeful patients also exited interviews with high hopes. Overall, new patients were very satisfied: Oncology interviews significantly decreased patients’ fears and uncertainties, while increasing hopes. Discussion raises key issues for improving communication and managing quality cancer care. PMID:26134261

  3. Predictability of horizontal water vapor transport relative to precipitation: Enhancing situational awareness for forecasting western U.S. extreme precipitation and flooding

    USGS Publications Warehouse

    Lavers, David A.; Waliser, Duane E.; Ralph, F. Martin; Dettinger, Michael

    2016-01-01

    The western United States is vulnerable to socioeconomic disruption due to extreme winter precipitation and floods. Traditionally, forecasts of precipitation and river discharge provide the basis for preparations. Herein we show that earlier event awareness may be possible through use of horizontal water vapor transport (integrated vapor transport (IVT)) forecasts. Applying the potential predictability concept to the National Centers for Environmental Prediction global ensemble reforecasts, across 31 winters, IVT is found to be more predictable than precipitation. IVT ensemble forecasts with the smallest spreads (least forecast uncertainty) are associated with initiation states with anomalously high geopotential heights south of Alaska, a setup conducive for anticyclonic conditions and weak IVT into the western United States. IVT ensemble forecasts with the greatest spreads (most forecast uncertainty) have initiation states with anomalously low geopotential heights south of Alaska and correspond to atmospheric rivers. The greater IVT predictability could provide warnings of impending storminess with additional lead times for hydrometeorological applications.

  4. Improved forecasts of winter weather extremes over midlatitudes with extra Arctic observations

    NASA Astrophysics Data System (ADS)

    Sato, Kazutoshi; Inoue, Jun; Yamazaki, Akira; Kim, Joo-Hong; Maturilli, Marion; Dethloff, Klaus; Hudson, Stephen R.; Granskog, Mats A.

    2017-02-01

    Recent cold winter extremes over Eurasia and North America have been considered a consequence of a warming Arctic. More accurate weather forecasts are required to reduce the human and socioeconomic damages associated with severe winters. However, the sparse observing network over the Arctic introduces errors into the initial conditions of weather prediction models, which can degrade the accuracy of predictions at midlatitudes. Here we show that additional Arctic radiosonde observations from the Norwegian young sea ICE expedition (N-ICE2015) drifting ice camps and existing land stations during winter improved forecast skill and reduced uncertainties of weather extremes at midlatitudes of the Northern Hemisphere. For two winter storms over East Asia and North America in February 2015, ensemble forecast experiments were performed with initial conditions taken from an ensemble atmospheric reanalysis in which the observation data were assimilated. The observations reduced errors in initial conditions in the upper troposphere over the Arctic region, yielding more precise prediction of the locations and strengths of upper troughs and surface synoptic disturbances. Errors and uncertainties in predicted upper troughs at midlatitudes were carried southward with upper-level high-potential-vorticity (PV) air intruding from the observed Arctic region, because the PV contained a "signal" of the additional Arctic observations as it moved along an isentropic surface. This suggests that a coordinated, sustainable Arctic observing network would be effective not only for regional weather services but also for reducing weather risks in locations distant from the Arctic.

  5. Intrusive Method for Uncertainty Quantification in a Multiphase Flow Solver

    NASA Astrophysics Data System (ADS)

    Turnquist, Brian; Owkes, Mark

    2016-11-01

    Uncertainty quantification (UQ) is a necessary, interesting, and often neglected aspect of fluid flow simulations. To determine the significance of uncertain initial and boundary conditions, a multiphase flow solver is being created which extends a single-phase, intrusive, polynomial chaos scheme to multiphase flows. Reliably estimating the impact of input uncertainty on design criteria can help identify and minimize unwanted variability in critical areas, and has the potential to help advance knowledge of atomizing jets, jet engines, pharmaceuticals, and food processing. Use of an intrusive polynomial chaos method has been shown to significantly reduce computational cost relative to non-intrusive methods such as Monte Carlo sampling and stochastic collocation. This method requires transforming the model equations into a weak form through substitution of stochastic (random) variables. Ultimately, the model deploys a stochastic Navier-Stokes equation, a stochastic conservative level set approach including reinitialization, as well as stochastic normals and curvature. By implementing these approaches together in one framework, basic problems may be investigated which shed light on model expansion, uncertainty theory, and fluid flow in general. NSF Grant Number 1511325.
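As a hedged illustration of the intrusive (Galerkin) polynomial chaos idea on something far simpler than the stochastic Navier-Stokes system, consider scalar decay du/dt = -k u with an uncertain rate k = k0 + k1·ξ, ξ ~ N(0,1), truncated to two probabilists' Hermite modes (all values illustrative; not from the solver described above):

```python
import numpy as np

# Expand u(t, xi) = u0(t)*He0(xi) + u1(t)*He1(xi) and Galerkin-project
# du/dt = -(k0 + k1*xi)*u onto each mode. With <He0^2> = <He1^2> = 1,
# <He1 He1 He0> = 1, and <He1^3> = 0, the stochastic ODE becomes the
# coupled *deterministic* system integrated below (forward Euler).
k0, k1 = 1.0, 0.1
dt, nsteps = 1e-4, 10000                 # integrate to t = 1
u = np.array([1.0, 0.0])                 # u0 (mean mode), u1 (first mode)
for _ in range(nsteps):
    du0 = -(k0 * u[0] + k1 * u[1])
    du1 = -(k1 * u[0] + k0 * u[1])
    u = u + dt * np.array([du0, du1])

mean_pc = u[0]                           # PC mean = coefficient of He0
std_pc = abs(u[1])                       # first-order estimate of std
mean_exact = np.exp(-k0 * 1.0 + 0.5 * k1**2)  # E[exp(-k*t)] at t = 1
print(round(mean_pc, 4), round(mean_exact, 4))  # agree to ~1e-5
```

One solve of the coupled system replaces many Monte Carlo model runs; this cost advantage is the abstract's motivation for the intrusive route.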

  6. Idealized Experiments for Optimizing Model Parameters Using a 4D-Variational Method in an Intermediate Coupled Model of ENSO

    NASA Astrophysics Data System (ADS)

    Gao, Chuan; Zhang, Rong-Hua; Wu, Xinrong; Sun, Jichang

    2018-04-01

    Large biases exist in real-time ENSO prediction, which can be attributed to uncertainties in initial conditions and model parameters. Previously, a 4D variational (4D-Var) data assimilation system was developed for an intermediate coupled model (ICM) and used to improve ENSO modeling through optimized initial conditions. In this paper, this system is further applied to optimize model parameters. In the ICM used, one important process for ENSO is related to the anomalous temperature of subsurface water entrained into the mixed layer (Te), which is empirically and explicitly related to sea level (SL) variation. The strength of the thermocline effect on SST (referred to simply as "the thermocline effect") is represented by an introduced parameter, αTe. A numerical procedure is developed to optimize this model parameter through the 4D-Var assimilation of SST data in a twin experiment context with an idealized setting. Experiments having their initial condition optimized only, and having their initial condition plus this additional model parameter optimized, are compared. It is shown that ENSO evolution can be more effectively recovered by including the additional optimization of this parameter in ENSO modeling. The demonstrated feasibility of optimizing model parameters and initial conditions together through the 4D-Var method provides a modeling platform for ENSO studies. Further applications of the 4D-Var data assimilation system implemented in the ICM are also discussed.
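The ICM's 4D-Var machinery is not reproduced here; a toy analogue (the model, parameter names, and data below are all invented) shows the core idea of jointly optimizing an initial condition and a model parameter by minimizing a misfit cost over a forecast window:

```python
import numpy as np
from scipy.optimize import minimize

def forward(x0, alpha, nsteps=40):
    """Toy 'model': a forced linear recurrence whose damping is alpha."""
    x, traj = x0, []
    for t in range(nsteps):
        x = alpha * x + 0.1 * np.sin(0.3 * t)
        traj.append(x)
    return np.array(traj)

true_x0, true_alpha = 1.0, 0.9
obs = forward(true_x0, true_alpha)          # synthetic "observations"

def cost(v):                                # 4D-Var-style quadratic misfit
    return float(np.sum((forward(v[0], v[1]) - obs) ** 2))

res = minimize(cost, x0=[0.5, 0.7], method="Nelder-Mead",
               options={"xatol": 1e-10, "fatol": 1e-12, "maxiter": 2000})
x0_est, alpha_est = res.x
print(round(x0_est, 2), round(alpha_est, 2))
```

In a twin experiment like the one described above, recovering both unknowns from the synthetic data demonstrates that the parameter is identifiable; real 4D-Var replaces the derivative-free search with an adjoint-computed gradient.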

  7. Using a 4D-Variational Method to Optimize Model Parameters in an Intermediate Coupled Model of ENSO

    NASA Astrophysics Data System (ADS)

    Gao, C.; Zhang, R. H.

    2017-12-01

    Large biases exist in real-time ENSO prediction, attributable to uncertainties in initial conditions and model parameters. Previously, a four-dimensional variational (4D-Var) data assimilation system was developed for an intermediate coupled model (ICM) and used to improve ENSO modeling through optimized initial conditions. In this paper, this system is further applied to optimize model parameters. In the ICM used, one important process for ENSO is related to the anomalous temperature of subsurface water entrained into the mixed layer (Te), which is empirically and explicitly related to sea level (SL) variation, written as Te = αTe × FTe(SL). The introduced parameter, αTe, represents the strength of the thermocline effect on sea surface temperature (SST; referred to as the thermocline effect). A numerical procedure is developed to optimize this model parameter through the 4D-Var assimilation of SST data in a twin experiment context with an idealized setting. Experiments having only their initial conditions optimized, and having their initial conditions plus this additional model parameter optimized, are compared. It is shown that ENSO evolution can be more effectively recovered by including the additional optimization of this parameter in ENSO modeling. The demonstrated feasibility of optimizing model parameters and initial conditions together through the 4D-Var method provides a modeling platform for ENSO studies. Further applications of the 4D-Var data assimilation system implemented in the ICM are also discussed.

  8. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies difference in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. 
Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  9. Uncertainty quantification methodologies development for stress corrosion cracking of canister welds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dingreville, Remi Philippe Michel; Bryan, Charles R.

    2016-09-30

    This letter report presents a probabilistic performance assessment model to evaluate the probability of canister failure (through-wall penetration) by SCC. The model first assesses whether environmental conditions for SCC – the presence of an aqueous film – are present at canister weld locations (where tensile stresses are likely to occur) on the canister surface. Geometry-specific storage system thermal models and weather data sets representative of U.S. spent nuclear fuel (SNF) storage sites are implemented to evaluate location-specific canister surface temperature and relative humidity (RH). As the canister cools and aqueous conditions become possible, the occurrence of corrosion is evaluated. Corrosion is modeled as a two-step process: first, pitting is initiated, and the extent and depth of pitting is a function of the chloride surface load and the environmental conditions (temperature and RH). Second, as corrosion penetration increases, the pit eventually transitions to a SCC crack, with crack initiation becoming more likely with increasing pit depth. Once pits convert to cracks, a crack growth model is implemented. The SCC growth model includes rate dependencies on both temperature and crack tip stress intensity factor, and crack growth only occurs in time steps when aqueous conditions are predicted. The model suggests that SCC is likely to occur over potential SNF interim storage intervals; however, this result is based on many modeling assumptions. Sensitivity analyses provide information on the model assumptions and parameter values that have the greatest impact on predicted storage canister performance, and provide guidance for further research to reduce uncertainties.
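The report's calibrated models are not reproduced here; a deliberately simplified Monte Carlo sketch of the two-step pit-then-crack structure (every distribution, rate, and threshold below is hypothetical, chosen only to illustrate the workflow) shows how such a probabilistic assessment yields a through-wall probability:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                      # canister realizations
horizon = 100.0                  # storage interval, years (hypothetical)
wall = 15.0                      # wall thickness, mm (hypothetical)

# Step 1: pit initiation time and pit growth (hypothetical distributions);
# the pit-to-crack transition is taken at a nominal 1 mm pit depth here.
t_init = rng.lognormal(mean=3.0, sigma=0.5, size=n)      # years
pit_rate = rng.lognormal(mean=-2.0, sigma=0.3, size=n)   # mm/yr
t_transition = t_init + 1.0 / pit_rate

# Step 2: SCC crack growth after the transition (hypothetical rate);
# no growth for realizations whose transition falls beyond the horizon.
crack_rate = rng.lognormal(mean=-1.0, sigma=0.4, size=n)  # mm/yr
depth = np.clip(horizon - t_transition, 0.0, None) * crack_rate

p_fail = float(np.mean(depth >= wall))   # through-wall within horizon
print(0.0 < p_fail < 1.0)                # → True
```

Sensitivity analysis in this framing amounts to recomputing p_fail while perturbing each input distribution in turn, which is how the report identifies the assumptions that matter most.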

  10. Time-evolution of quantum systems via a complex nonlinear Riccati equation. I. Conservative systems with time-independent Hamiltonian

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cruz, Hans, E-mail: hans@ciencias.unam.mx; Schuch, Dieter; Castaños, Octavio, E-mail: ocasta@nucleares.unam.mx

    2015-09-15

    The sensitivity of the evolution of quantum uncertainties to the choice of the initial conditions is shown via a complex nonlinear Riccati equation leading to a reformulation of quantum dynamics. This sensitivity is demonstrated for systems with exact analytic solutions with the form of Gaussian wave packets. In particular, one-dimensional conservative systems with at most quadratic Hamiltonians are studied.
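For readers unfamiliar with the formalism, the generic shape of the construction (sketched here from the standard Gaussian-wave-packet ansatz; the paper's notation may differ) is that inserting a Gaussian packet into the Schrödinger equation yields a complex nonlinear Riccati equation for a width-related variable, which a logarithmic substitution linearizes:

```latex
% Complex, width-related variable \mathcal{C}(t) obeys a nonlinear Riccati equation,
\dot{\mathcal{C}}(t) + \mathcal{C}^{2}(t) + \omega^{2}(t) = 0,
% which the substitution \mathcal{C} = \dot{\alpha}/\alpha linearizes to
\ddot{\alpha}(t) + \omega^{2}(t)\,\alpha(t) = 0 .
```

Because the Riccati equation is nonlinear, different choices of the initial value C(0) propagate into qualitatively different uncertainty evolutions, which is the sensitivity the abstract emphasizes.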

  11. Using Uncertainty Quantification to Guide Development and Improvements of a Regional-Scale Model of the Coastal Lowlands Aquifer System Spanning Texas, Louisiana, Mississippi, Alabama and Florida

    NASA Astrophysics Data System (ADS)

    Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.

    2017-12-01

    Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of a UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess, in a quantitative framework, the effect that model package and property improvements have on the ability to represent past system states, as well as on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and to guide next steps in model development prior to rigorous history matching with the PEST++ parameter estimation code.
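FOSM, mentioned above, propagates a prior parameter covariance through model sensitivities to obtain a predictive variance. A minimal numpy sketch with invented numbers (not CLAS values):

```python
import numpy as np

# First-Order Second-Moment (FOSM): for a scalar prediction s = f(p),
# linearize about the current parameter estimate so that
#   var(s) = J C J^T,
# where J is the sensitivity (Jacobian) row vector ds/dp and C is the
# prior parameter covariance. All numbers below are hypothetical.
J = np.array([[0.8, -0.3, 1.5]])     # d(prediction)/d(parameter)
C = np.diag([0.25, 0.04, 0.09])      # prior parameter variances

var_pred = float(J @ C @ J.T)
# 0.8^2*0.25 + (-0.3)^2*0.04 + 1.5^2*0.09 = 0.3661
print(round(var_pred, 4))            # → 0.3661
```

Data worth follows the same algebra: conditioning C on a candidate observation shrinks it, and the resulting drop in var_pred measures how much that observation is worth for the prediction.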

  12. Quantum speed limit time in a magnetic resonance

    NASA Astrophysics Data System (ADS)

    Ivanchenko, E. A.

    2017-12-01

    A visualization of the dynamics of a qudit spin vector in a time-dependent magnetic field is realized by mapping the solution for the spin vector onto a three-dimensional spherical curve (vector hodograph). The obtained results clearly display the quantum interference of precessional and nutational effects on the spin vector at magnetic resonance. For any spin, lower bounds on the quantum speed limit (QSL) time are found. It is shown that the lower bound decreases when using multilevel spin systems. Under certain conditions, the nonzero minimal time necessary to reach a state orthogonal to the initial one is attained at spin S = 2. Estimates of the products of two and of three standard deviations of the spin components are presented. We discuss the dynamics of the mutual uncertainty, conditional uncertainty, and conditional variance in terms of spin standard deviations. The study can find practical applications in magnetic resonance, 3D visualization of computational data, and the design of optimized information-processing devices for quantum computation and communication.
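The abstract's spin-specific bounds are not reproduced here; for context, the two standard lower bounds from which QSL analyses usually start, for evolution from an initial state to an orthogonal one under a time-independent Hamiltonian, are the Mandelstam-Tamm and Margolus-Levitin bounds:

```latex
% Time to reach an orthogonal state; E_0 denotes the ground-state energy.
\tau_{\perp} \;\ge\; \max\!\left\{ \frac{\pi\hbar}{2\,\Delta E},\;
                                   \frac{\pi\hbar}{2\,\bigl(\langle E\rangle - E_{0}\bigr)} \right\}.
```

Multilevel (qudit) systems can support larger energy spreads ΔE at fixed resources, which is consistent with the abstract's observation that the lower bound decreases for multilevel spins.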

  13. Uncertainty relation in Schwarzschild spacetime

    NASA Astrophysics Data System (ADS)

    Feng, Jun; Zhang, Yao-Zhong; Gould, Mark D.; Fan, Heng

    2015-04-01

    We explore the entropic uncertainty relation in the curved background outside a Schwarzschild black hole, and find that Hawking radiation introduces a nontrivial modification of the uncertainty bound for a particular observer, which could therefore be witnessed experimentally through a properly designed uncertainty game. We first investigate an uncertainty game between a free-falling observer and his static partner, who holds a quantum memory initially entangled with the quantum system to be measured. Due to the information loss from Hawking decoherence, we find an inevitable increase of the uncertainty on the outcome of measurements in the view of the static observer, which depends on the mass of the black hole, the distance of the observer from the event horizon, and the mode frequency of the quantum memory. To illustrate the generality of this paradigm, we relate the entropic uncertainty bound to other uncertainty probes, e.g., time-energy uncertainty. In an alternative game between two static players, we show that quantum information of a qubit can be transferred to the quantum memory through a bath of fluctuating quantum fields outside the black hole. For a particular choice of initial state, we show that Hawking decoherence cannot counteract the entanglement generated by the dynamical evolution of the system, which triggers an effectively reduced uncertainty bound that violates the intrinsic limit -log2 c. Numerical estimation for a proper choice of initial state shows that our result is comparable with possible real experiments. Finally, a discussion of the black hole firewall paradox in the context of the entropic uncertainty relation is given.
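The intrinsic limit -log2 c mentioned above comes from the memory-assisted entropic uncertainty relation (stated here in the standard Berta et al. form for context; the abstract's curved-spacetime modifications build on it):

```latex
% Memory-assisted entropic uncertainty relation:
S(Q\,|\,B) + S(R\,|\,B) \;\ge\; \log_{2}\frac{1}{c} + S(A\,|\,B),
\qquad
c = \max_{i,j}\,\bigl|\langle \psi_{i} | \phi_{j} \rangle\bigr|^{2},
```

where Q and R are the measured observables with eigenbases {|ψᵢ⟩} and {|φⱼ⟩}, B is the quantum memory, and c their maximal overlap. A negative conditional entropy S(A|B), i.e. entanglement with the memory, is what allows the effective bound to drop below -log2 c.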

  14. Climate change impact on North Sea wave conditions: a consistent analysis of ten projections

    NASA Astrophysics Data System (ADS)

    Grabemann, Iris; Groll, Nikolaus; Möller, Jens; Weisse, Ralf

    2015-02-01

    Long-term changes in the mean and extreme wind-wave conditions, as they may occur in the course of anthropogenic climate change, can influence and endanger human coastal and offshore activities. A set of ten wave climate projections derived from time-slice and transient simulations of future conditions is analyzed to estimate the possible impact of anthropogenic climate change on mean and extreme wave conditions in the North Sea. This set includes different combinations of IPCC SRES emission scenarios (A2, B2, A1B, and B1), global and regional models, and initial states. A consistent approach is used to provide a more robust assessment of expected changes and uncertainties. While the spatial patterns and the magnitude of the climate change signals vary, some robust features emerge across the ten projections: mean and severe wave heights tend to increase in the eastern parts of the North Sea towards the end of the twenty-first century in nine or ten of the projections, but the magnitude of the increase in extreme waves varies on the order of decimeters between these projections. For the western parts of the North Sea, more than half of the projections suggest a decrease in mean and extreme wave heights. Comparing the different sources of uncertainty due to models, scenarios, and initial conditions, it can be inferred that the influence of the emission scenario on the climate change signal seems to be the least important. Furthermore, the transient projections show strong multi-decadal fluctuations, and changes towards the end of the twenty-first century might partly be associated with internal variability rather than with systematic changes.

  15. The application of Global Sensitivity Analysis to quantify the dominant input factors for hydraulic model simulations

    NASA Astrophysics Data System (ADS)

    Savage, James; Pianosi, Francesca; Bates, Paul; Freer, Jim; Wagener, Thorsten

    2015-04-01

    Predicting flood inundation extents using hydraulic models is subject to a number of critical uncertainties. For a specific event, these uncertainties are known to have a large influence on model outputs and on any subsequent analyses made by risk managers. Hydraulic modellers often approach such problems by applying uncertainty analysis techniques such as the Generalised Likelihood Uncertainty Estimation (GLUE) methodology. However, these methods do not allow one to attribute which source of uncertainty has the most influence on the various model outputs that inform flood risk decision making. Another issue facing modellers is the amount of computational resource available for modelling flood inundations that are 'fit for purpose' for the modelling objectives. A balance therefore needs to be struck between computation time, realism and spatial resolution, and effectively characterising the uncertainty spread of predictions (for example from boundary conditions and model parameterisations). However, it is not fully understood how much of an impact each factor has on model performance, for example how much influence changing the spatial resolution of a model has on inundation predictions in comparison to other uncertainties inherent in the modelling process. Furthermore, when resampling fine-scale topographic data in the form of a Digital Elevation Model (DEM) to coarser resolutions, there are a number of possible coarser DEMs that can be produced. The choice of DEM used to represent the surface elevations in the model could also influence model performance. In this study we model a flood event using the hydraulic model LISFLOOD-FP and apply Sobol' Sensitivity Analysis to estimate which input factors, among the uncertainty in model boundary conditions, uncertain model parameters, the spatial resolution of the DEM and the choice of resampled DEM, have the most influence on a range of model outputs.
These outputs include whole domain maximum inundation indicators and flood wave travel time in addition to temporally and spatially variable indicators. This enables us to assess whether the sensitivity of the model to various input factors is stationary in both time and space. Furthermore, competing models are assessed against observations of water depths from a historical flood event. Consequently we are able to determine which of the input factors has the most influence on model performance. Initial findings suggest the sensitivity of the model to different input factors varies depending on the type of model output assessed and at what stage during the flood hydrograph the model output is assessed. We have also found that initial decisions regarding the characterisation of the input factors, for example defining the upper and lower bounds of the parameter sample space, can be significant in influencing the implied sensitivities.
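The first-order Sobol' index quantifies the fraction of output variance attributable to each input factor alone. A minimal numpy sketch of the Saltelli "pick-freeze" estimator, using a toy additive function as a hypothetical stand-in for the hydraulic model (all names and coefficients are illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the hydraulic model: an additive function of
# three "input factors" (think roughness, inflow error, DEM error).
def model(x):
    return 4.0 * x[:, 0] + 2.0 * x[:, 1] + 1.0 * x[:, 2]

N, k = 100_000, 3
A = rng.uniform(0.0, 1.0, (N, k))
B = rng.uniform(0.0, 1.0, (N, k))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

# Saltelli "pick-freeze" estimator of the first-order Sobol' indices S_i
S = np.empty(k)
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # replace column i of A with B's column i
    S[i] = np.mean(fB * (model(ABi) - fA)) / var

# For this additive model the exact indices are a_i^2 / sum(a_j^2),
# i.e. approximately [16/21, 4/21, 1/21]
```

Ranking the estimated indices then identifies which factor dominates each model output, which is the attribution step GLUE alone cannot provide.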

  16. What is the relative role of initial hydrological conditions and meteorological forcing to the seasonal hydrological forecasting skill? Analysis along Europe's hydro-climatic gradient

    NASA Astrophysics Data System (ADS)

    Pechlivanidis, Ilias; Crochemore, Louise

    2017-04-01

    Recent advances in understanding and forecasting of climate have led to skilful seasonal meteorological predictions, which can consequently increase the confidence of hydrological prognoses. The majority of seasonal impact modelling has been conducted at only one or a limited number of basins, limiting the potential to understand large systems. Nevertheless, there is a need to develop operational seasonal forecasting services at the pan-European scale capable of addressing end-user needs. The skill of such forecasting services is subject to a number of sources of uncertainty, i.e. model structure, parameters, and forcing input. Here, we complement the "deep" knowledge from basin-based modelling by investigating the relative contributions of initial hydrological conditions (IHCs) and meteorological forcing (MF) to the skill of a seasonal pan-European hydrological forecasting system. We use the Ensemble Streamflow Prediction (ESP) and reverse ESP (revESP) procedures as proxies for the hydrological forecasting uncertainty due to MF and IHC uncertainties, respectively. We further calculate the critical lead time (CLT), a proxy for the river memory, after which the importance of MF surpasses that of the IHCs. We analyze these results in the context of the prevailing hydro-climatic conditions for about 35000 European basins. Both model state initialisation (levels in surface waters, i.e. reservoirs, lakes and wetlands, soil moisture, snow depth) and the provision of climatology are based on forcing input derived from the WFDEI product for the period 1981-2010. The analysis shows that the contribution of IHCs and MF to the hydrological forecasting skill varies considerably with location, season and lead time. This allows clustering of basins in which hydrological forecasting skill may be improved by better estimation of IHCs, e.g. via data assimilation of in-situ and/or satellite observations, whereas in other basins skill improvement depends on better MF.
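The ESP/revESP idea can be sketched with a toy linear-reservoir model (all coefficients and distributions below are assumptions, not the paper's pan-European setup): ESP keeps the initial state and draws the forcing from climatology; revESP does the reverse; the critical lead time is where the two error curves cross.

```python
import numpy as np

rng = np.random.default_rng(1)
a = 0.9            # storage recession coefficient (assumed)
leads = 40         # forecast horizon, e.g. in days

def run(S0, P):
    """Toy linear reservoir: S_{t+1} = a*S_t + P_t, discharge Q_t = (1 - a)*S_t."""
    Q = np.empty(len(P))
    S = S0
    for t, p in enumerate(P):
        S = a * S + p
        Q[t] = (1 - a) * S
    return Q

true_S0 = 100.0
true_P = rng.gamma(2.0, 1.0, leads)            # the forcing that "really" occurs
truth = run(true_S0, true_P)

n_ens = 200
hist_P = rng.gamma(2.0, 1.0, (n_ens, leads))   # climatological forcing traces
hist_S0 = rng.normal(100.0, 30.0, n_ens)       # climatological initial states

esp = np.array([run(true_S0, P) for P in hist_P])       # ESP: known IHC, uncertain MF
revesp = np.array([run(S0, true_P) for S0 in hist_S0])  # revESP: uncertain IHC, known MF

rmse_esp = np.sqrt(np.mean((esp - truth) ** 2, axis=0))
rmse_rev = np.sqrt(np.mean((revesp - truth) ** 2, axis=0))

# Critical lead time: first lead at which MF uncertainty (ESP error)
# overtakes IHC uncertainty (revESP error)
clt = int(np.argmax(rmse_esp > rmse_rev))
```

In this toy setup the revESP error decays with the reservoir's memory while the ESP error grows as forcing errors accumulate, so a longer-memory basin (larger a) yields a longer CLT, mirroring the basin clustering described above.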

  17. Geostatistical applications in ground-water modeling in south-central Kansas

    USGS Publications Warehouse

    Ma, T.-S.; Sophocleous, M.; Yu, Y.-S.

    1999-01-01

    This paper emphasizes the supportive role of geostatistics in applying ground-water models. Field data of 1994 ground-water level, bedrock, and saltwater-freshwater interface elevations in south-central Kansas were collected and analyzed using the geostatistical approach. Ordinary kriging was adopted to estimate initial conditions for ground-water levels and topography of the Permian bedrock at the nodes of a finite difference grid used in a three-dimensional numerical model. Cokriging was used to estimate initial conditions for the saltwater-freshwater interface. An assessment of uncertainties in the estimated data is presented. The kriged and cokriged estimation variances were analyzed to evaluate the adequacy of data employed in the modeling. Although water levels and bedrock elevations are well described by spherical semivariogram models, additional data are required for better cokriging estimation of the interface data. The geostatistically analyzed data were employed in a numerical model of the Siefkes site in the project area. Results indicate that the computed chloride concentrations and ground-water drawdowns reproduced the observed data satisfactorily.
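A minimal numpy sketch of ordinary kriging with a spherical semivariogram, the combination used above for the water-level and bedrock surfaces (the variogram parameters and "well" data are placeholders, not the fitted Kansas models):

```python
import numpy as np

def spherical(h, nugget=0.0, sill=1.0, rang=50.0):
    """Spherical semivariogram model (parameter values are placeholders)."""
    h = np.asarray(h, dtype=float)
    g = nugget + (sill - nugget) * (1.5 * h / rang - 0.5 * (h / rang) ** 3)
    return np.where(h >= rang, sill, np.where(h == 0.0, 0.0, g))

def ordinary_krige(xy, z, x0):
    """Ordinary kriging estimate and kriging variance at a single location x0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.empty((n + 1, n + 1))
    A[:n, :n] = spherical(d)
    A[n, :] = 1.0
    A[:, n] = 1.0
    A[n, n] = 0.0                      # Lagrange-multiplier row/column
    b = np.append(spherical(np.linalg.norm(xy - x0, axis=1)), 1.0)
    w = np.linalg.solve(A, b)
    return w[:n] @ z, w @ b            # estimate, kriging variance

# Four synthetic "wells" on a 10 x 10 square (made-up data, not the Kansas set)
xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
z = np.array([1.0, 2.0, 3.0, 4.0])
est0, kv0 = ordinary_krige(xy, z, np.array([0.0, 0.0]))   # at a well: exact, zero variance
estc, kvc = ordinary_krige(xy, z, np.array([5.0, 5.0]))   # center: weighted average
```

The kriging variance returned alongside each estimate is precisely the per-node uncertainty measure the abstract uses to judge the adequacy of the data.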

  18. The neural systems for perceptual updating.

    PubMed

    Stöttinger, Elisabeth; Aichhorn, Markus; Anderson, Britt; Danckert, James

    2018-04-01

    In a constantly changing environment we must adapt to both abrupt and gradual changes to incoming information. Previously, we demonstrated that a distributed network (including the anterior insula and anterior cingulate cortex) was active when participants updated their initial representations (e.g., it's a cat) in a gradually morphing picture task (e.g., now it's a rabbit; Stöttinger et al., 2015). To shed light on whether these activations reflect the proactive decisions to update or perceptual uncertainty, we introduced two additional conditions. By presenting picture morphs twice we controlled for uncertainty in perceptual decision making. Inducing an abrupt shift in a third condition allowed us to differentiate between a proactive decision in uncertainty-driven updating and a reactive decision in surprise-based updating. We replicated our earlier result, showing the robustness of the effect. In addition, we found activation in the anterior insula (bilaterally) and the mid frontal area/ACC in all three conditions, indicative of the importance of these areas in updating of all kinds. When participants were naïve as to the identity of the second object, we found higher activations in the mid-cingulate cortex and cuneus - areas typically associated with task difficulty - in addition to higher activations in the right TPJ, most likely reflecting the shift to a new perspective. Activations associated with the proactive decision to update to a new interpretation were found in a network including the dorsal ACC, known to be involved in exploration and the endogenous decision to switch to a new interpretation. These findings suggest a general network commonly engaged in all types of perceptual decision making, supported by additional networks associated with perceptual uncertainty or updating provoked by either proactive or reactive decision making. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Uncertainty in User-contributed Weather Data

    NASA Astrophysics Data System (ADS)

    Bell, S.; Cornford, D.; Bastin, L.; Molyneux, M.

    2012-04-01

    Websites such as Weather Underground and the Met Office's recently launched Weather Observations Website encourage members of the public not only to record meteorological observations for personal use but also to upload them to a free online community to be shared and compared with data from hundreds of other weather stations in the UK alone. With such a concentration of freely available surface observations, the question is whether it would be beneficial to incorporate these data into existing data assimilation schemes for constructing the initial conditions in Numerical Weather Prediction models. This question ultimately relates to how closely the amateur data represent reality, and how to quantify this uncertainty such that it may be accounted for when using the data. We will highlight factors that can lead to increased uncertainty. For instance, as amateur data often come with limited metadata, it is difficult to assess whether an amateur station conforms to the strict guidelines and quality procedures that professional sites follow. These guidelines relate to factors such as siting, exposure and calibration, and in many cases it is practically impossible for amateur sites to conform to them because of a tendency for amateur sites to be located in enclosed, urbanised areas. We will present exploratory research comparing amateur data from the Weather Observations Website and Weather Underground against the Met Office's meteorological monitoring system, which will be taken to represent the 'truth'. We particularly aim to identify bias in the amateur data and residual variances, which will help to quantify our degree of uncertainty. The research will focus on three case periods, each with different synoptic conditions (clear skies, overcast, a frontal progression), and on observations of surface air temperature, precipitation, and humidity.
Future plans for the project will also be introduced, such as further investigation into which factors lead to increased uncertainty, highlighting the importance of quantifying and accounting for their effects. Factors may include the degree of urbanisation around the site as well as factors that vary temporally, such as the prevailing synoptic conditions. We will also describe plans to take a Bayesian approach to assessing uncertainty and how this can be incorporated into data assimilation schemes.
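The bias and residual-variance comparison described above reduces to simple statistics of the amateur-minus-reference differences; a sketch with synthetic series standing in for the two data sources (the offset and noise level are assumptions, not the study's findings):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins: a reference (professional) series taken as 'truth'
# and an amateur series with an assumed warm bias and extra noise.
truth = 10.0 + 5.0 * np.sin(np.linspace(0.0, 4.0 * np.pi, 240))   # hourly temps (deg C)
amateur = truth + 0.8 + rng.normal(0.0, 0.6, truth.size)

residuals = amateur - truth
bias = residuals.mean()             # systematic offset (e.g. siting/exposure effects)
resid_var = residuals.var(ddof=1)   # residual variance about that bias
# bias and resid_var parameterize a simple observation-error model that a
# data assimilation scheme could apply to the amateur observations
```

Stratifying these two statistics by synoptic condition (clear, overcast, frontal) is then exactly the three-case-period analysis planned above.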

  20. An 'Observational Large Ensemble' to compare observed and modeled temperature trend uncertainty due to internal variability.

    NASA Astrophysics Data System (ADS)

    Poppick, A. N.; McKinnon, K. A.; Dunn-Sigouin, E.; Deser, C.

    2017-12-01

    Initial-condition climate model ensembles suggest that regional temperature trends can be highly variable on decadal timescales due to internal climate variability. Accounting for trend uncertainty due to internal variability is therefore necessary to contextualize recent observed temperature changes. However, while the variability of trends in a climate model ensemble can be evaluated directly (as the spread across ensemble members), the internal variability simulated by a climate model may be inconsistent with observations. Observation-based methods for assessing the role of internal variability in trend uncertainty are therefore required. Here, we use a statistical resampling approach to assess trend uncertainty due to internal variability in historical 50-year (1966-2015) winter near-surface air temperature trends over North America. We compare this estimate of trend uncertainty to the simulated trend variability in the NCAR CESM1 Large Ensemble (LENS), finding that uncertainty in wintertime temperature trends over North America due to internal variability is largely overestimated by CESM1, on average by 32%. Our observation-based resampling approach is combined with the forced signal from LENS to produce an 'Observational Large Ensemble' (OLENS). The members of OLENS indicate a range of spatially coherent fields of temperature trends resulting from different sequences of internal variability consistent with observations. The smaller trend variability in OLENS suggests that uncertainty in the historical climate change signal in observations due to internal variability is less than suggested by LENS.
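One observation-based resampling scheme of the kind described is a moving-block bootstrap of detrended residuals (a common choice for autocorrelated series, not necessarily the authors' exact method; all series and coefficients below are synthetic):

```python
import numpy as np

rng = np.random.default_rng(3)
years = 50
t = np.arange(years)

# Synthetic winter-temperature series: a forced trend plus AR(1) internal
# variability (coefficients are assumptions, not the paper's values)
true_trend = 0.03                      # deg C per year
noise = np.empty(years)
noise[0] = rng.normal(0.0, 0.5)
for i in range(1, years):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0.0, 0.5)
obs = true_trend * t + noise

# Detrend, then resample residuals in blocks to preserve autocorrelation
slope, intercept = np.polyfit(t, obs, 1)
resid = obs - (slope * t + intercept)
block = 5
n_blocks = years // block
trends = []
for _ in range(2000):
    starts = rng.integers(0, years - block + 1, n_blocks)
    boot = np.concatenate([resid[s:s + block] for s in starts])
    trends.append(np.polyfit(t, slope * t + boot, 1)[0])
trend_sd = float(np.std(trends))   # trend spread due to internal variability
```

Comparing this observation-derived trend spread with the spread across LENS members is the core of the overestimation diagnosis reported above.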

  1. Comments on "Drill-string horizontal dynamics with uncertainty on the frictional force" by T.G. Ritto, M.R. Escalante, Rubens Sampaio, M.B. Rosales [J. Sound Vib. 332 (2013) 145-153]

    NASA Astrophysics Data System (ADS)

    Li, Zifeng

    2016-12-01

    This paper analyzes the mechanical and mathematical models in "Ritto et al. (2013) [1]". The findings are: (1) the mechanical model is incorrect; (2) the mathematical model is incomplete; (3) the differential equation is incorrect; (4) the finite element equation is not discretized from the corresponding mathematical model and is likewise incorrect. A mathematical model of dynamics should include the differential equations, the boundary conditions and the initial conditions.

  2. Impact of Calibrated Land Surface Model Parameters on the Accuracy and Uncertainty of Land-Atmosphere Coupling in WRF Simulations

    NASA Technical Reports Server (NTRS)

    Santanello, Joseph A., Jr.; Kumar, Sujay V.; Peters-Lidard, Christa D.; Harrison, Ken; Zhou, Shujia

    2012-01-01

    Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface temperature and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during the summers of extreme dry (2006) and wet (2007) land surface conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through the use of a new optimization and uncertainty estimation module in NASA's Land Information System (LIS-OPT/UE), whereby parameter sets are calibrated in the Noah land surface model and classified according to a land cover and soil type mapping of the observation sites to the full model domain. The impact of calibrated parameters on the a) spinup of the land surface used as initial conditions, and b) heat and moisture states and fluxes of the coupled WRF simulations are then assessed in terms of ambient weather and land-atmosphere coupling along with measures of uncertainty propagation into the forecasts. In addition, the sensitivity of this approach to the period of calibration (dry, wet, average) is investigated. Finally, tradeoffs of computational tractability and scientific validity, and the potential for combining this approach with satellite remote sensing data are also discussed.

  3. Uncertainties in the estimation of specific absorption rate during radiofrequency alternating magnetic field induced non-adiabatic heating of ferrofluids

    NASA Astrophysics Data System (ADS)

    Lahiri, B. B.; Ranoo, Surojit; Philip, John

    2017-11-01

    Magnetic fluid hyperthermia (MFH) is becoming a viable cancer treatment methodology in which the alternating-magnetic-field-induced heating of a magnetic fluid is used to ablate cancerous cells or make them more susceptible to conventional treatments. The heating efficiency in MFH is quantified in terms of the specific absorption rate (SAR), defined as the heating power generated per unit mass. In the majority of experimental studies, SAR is evaluated from temperature rise curves obtained under non-adiabatic experimental conditions, which are prone to various thermodynamic uncertainties. A proper understanding of the experimental uncertainties and their remedies is a prerequisite for obtaining accurate and reproducible SAR values. Here, we study the thermodynamic uncertainties associated with peripheral heating, delayed heating, heat loss from the sample and spatial variation in the temperature profile within the sample. Using first-order approximations, an adiabatic reconstruction protocol for the measured temperature rise curves is developed for SAR estimation, and it is found to be in good agreement with the computationally intensive slope-corrected method. Our experimental findings clearly show that the peripheral and delayed heating are due to radiative heat transfer from the heating coils and the slower response time of the sensor, respectively. Our results suggest that the peripheral heating is linearly proportional to the sample area-to-volume ratio and the coil temperature. It is also observed that peripheral heating decreases in the presence of a non-magnetic insulating shield. The delayed heating is found to contribute up to ~25% uncertainty in SAR values. As the SAR values are very sensitive to the initial-slope determination method, explicitly stating the range used for the linear regression analysis is necessary for reproducible results.
The effect of the sample volume-to-area ratio on the linear heat loss rate is systematically studied and the results are compared using a lumped-system thermal model. The various uncertainties involved in SAR estimation are categorized as material uncertainties, thermodynamic uncertainties and parametric uncertainties. The adiabatic reconstruction is found to decrease the uncertainties in SAR measurement by approximately a factor of three. Additionally, a set of experimental guidelines for accurate SAR estimation using the adiabatic reconstruction protocol is recommended. These results warrant a universal experimental and data analysis protocol for SAR measurements during field-induced heating of magnetic fluids under non-adiabatic conditions.
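Under a lumped first-order loss model, the adiabatic reconstruction amounts to adding the loss term back to the measured heating rate; a sketch with assumed parameter values (not the paper's materials or calibration):

```python
import numpy as np

# Synthetic non-adiabatic heating curve under a lumped first-order loss model:
# C dT/dt = P - L (T - T0)  =>  T(t) = T0 + (P/L) (1 - exp(-L t / C))
# All parameter values below are assumptions for illustration.
C, P, L, T0 = 4.0, 0.8, 0.02, 25.0   # heat capacity (J/K), power (W), loss (W/K), deg C
t = np.linspace(0.0, 60.0, 601)
T = T0 + (P / L) * (1.0 - np.exp(-L * t / C))

# Adiabatic reconstruction (first order): add the loss term back
dTdt = np.gradient(T, t)
corrected = dTdt + (L / C) * (T - T0)           # recovers P/C at every instant

m_magnetic = 5.0e-5                              # kg of magnetic content (assumed)
win = t <= 5.0                                   # stated linear-regression window
slope = np.polyfit(t[win], T[win], 1)[0]
sar_slope = C * slope / m_magnetic               # naive initial-slope estimate
sar_adiabatic = C * corrected.mean() / m_magnetic   # reconstructed estimate
# the true SAR here is P / m_magnetic = 16000 W/kg
```

The naive initial-slope estimate is biased low by the heat loss already occurring inside the regression window, which is why the abstract insists the window be stated explicitly.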

  4. Decoherence effect on quantum-memory-assisted entropic uncertainty relations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-01-01

    The uncertainty principle provides a bound on the precision with which measurements of any two incompatible observables can be predicted, and thereby plays a nontrivial role in quantum precision measurement. In this work, we examine the dynamical features of the quantum-memory-assisted entropic uncertainty relations (EUR) for a pair of incompatible measurements in an open system characterized by local generalized amplitude damping (GAD) noise. We derive the dynamical evolution of the entropic uncertainty with respect to measurements affected by canonical GAD noise when particle A is initially entangled with quantum memory B. Specifically, we examine the dynamics of the EUR in three realistic scenarios: particle A is affected by the environmental (GAD) noise while particle B, serving as the quantum memory, is noise-free; particle B is affected by the external noise while particle A is not; and both particles suffer from the noise. By analytical methods, it turns out that the uncertainty is not fully determined by the evolution of the quantum correlations of the composite system consisting of A and B, but rather by the minimal conditional entropy of the measured subsystem. Furthermore, we present a possible physical interpretation of the behavior of the uncertainty evolution in terms of the mixedness of the observed system; we argue that the uncertainty is strongly correlated with the system's mixedness. We also put forward a simple and effective strategy to reduce the measurement uncertainty using quantum partially collapsed measurements. Our explorations may therefore offer insight into the dynamics of the entropic uncertainty relation in a realistic system and be of importance to quantum precision measurement in quantum information processing.
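For reference, the canonical Kraus representation of the GAD channel (a standard textbook form; the numerical values below are arbitrary) can be checked for trace preservation directly:

```python
import numpy as np

def gad_kraus(gamma, p):
    """Kraus operators of the generalized amplitude damping (GAD) channel.
    gamma is the damping strength; p weights decay toward |0> versus |1>."""
    return [
        np.sqrt(p) * np.array([[1.0, 0.0], [0.0, np.sqrt(1.0 - gamma)]]),
        np.sqrt(p) * np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]]),
        np.sqrt(1.0 - p) * np.array([[np.sqrt(1.0 - gamma), 0.0], [0.0, 1.0]]),
        np.sqrt(1.0 - p) * np.array([[0.0, 0.0], [np.sqrt(gamma), 0.0]]),
    ]

K = gad_kraus(0.3, 0.7)                          # arbitrary illustrative values
completeness = sum(k.conj().T @ k for k in K)    # must equal the identity

rho = np.array([[0.5, 0.5], [0.5, 0.5]])         # |+><+| input state
rho_out = sum(k @ rho @ k.conj().T for k in K)
# the coherence rho_out[0, 1] is damped by sqrt(1 - gamma) relative to 0.5
```

Applying this channel to one or both qubits of an entangled pair, and tracking the conditional entropy of the measured subsystem, reproduces the three noise scenarios described above.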

  5. Alternative configurations of Quantile Regression for estimating predictive uncertainty in water level forecasts for the Upper Severn River: a comparison

    NASA Astrophysics Data System (ADS)

    Lopez, Patricia; Verkade, Jan; Weerts, Albrecht; Solomatine, Dimitri

    2014-05-01

    Hydrological forecasting is subject to many sources of uncertainty, including those originating in the initial state, boundary conditions, model structure and model parameters. Although uncertainty can be reduced, it can never be fully eliminated. Statistical post-processing techniques constitute an often-used approach to estimating hydrological predictive uncertainty, in which a model of the forecast error is built from a historical record of past forecasts and observations. The present study focuses on the use of the Quantile Regression (QR) technique as a hydrological post-processor. It estimates the predictive distribution of water levels using deterministic water level forecasts as predictors. This work aims to thoroughly verify uncertainty estimates from the implementation of QR that was applied in an operational setting in the UK National Flood Forecasting System, and to inter-compare forecast quality and skill across differing configurations of QR. These configurations are (i) 'classical' QR, (ii) QR constrained by the requirement that quantiles do not cross, (iii) QR derived on time series that have been transformed into the Normal domain (Normal Quantile Transformation - NQT), and (iv) a piecewise linear derivation of QR models. The QR configurations are applied to fourteen hydrological stations on the Upper Severn River with different catchment characteristics. Results of each QR configuration are conditionally verified for progressively higher flood levels, in terms of commonly used verification metrics and skill scores. These include the Brier score (BS), the continuous ranked probability score (CRPS) and corresponding skill scores, as well as the Relative Operating Characteristic score (ROCS). Reliability diagrams are also presented and analysed. The results indicate that none of the four Quantile Regression configurations clearly outperforms the others.
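'Classical' QR fits, for each probability level tau, a linear model minimizing the pinball (quantile) loss on forecast-observation pairs. A numpy-only sketch using iteratively reweighted least squares as the solver (an approximation of the usual linear-programming fit; the data are synthetic, not Severn records):

```python
import numpy as np

rng = np.random.default_rng(4)

def fit_quantile(X, y, tau, iters=100, delta=1e-4):
    """Linear quantile regression via iteratively reweighted least squares,
    a simple stand-in for the usual linear-programming solver."""
    Xb = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(Xb, y, rcond=None)[0]
    for _ in range(iters):
        r = y - Xb @ beta
        # weights turn the squared loss into an approximate pinball loss
        w = np.where(r > 0.0, tau, 1.0 - tau) / np.maximum(np.abs(r), delta)
        beta = np.linalg.solve(Xb.T @ (w[:, None] * Xb), Xb.T @ (w * y))
    return beta

# Synthetic forecast/observation pairs: obs = forecast + N(0, 1) error
fcst = rng.uniform(0.0, 10.0, 4000)
obs = fcst + rng.normal(0.0, 1.0, 4000)

b50 = fit_quantile(fcst, obs, 0.5)   # near [0, 1]
b90 = fit_quantile(fcst, obs, 0.9)   # intercept near the N(0, 1) 0.9-quantile
```

Fitting a family of tau values yields the full predictive distribution; configuration (ii) above additionally constrains the fitted quantile lines so they cannot cross.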

  6. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.
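Parameter-value distributions of this kind are typically propagated with stratified Monte Carlo sampling; a minimal Latin hypercube sketch (the distributions below are purely illustrative, not the actual WIPP parameter set):

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, dists):
    """Draw one sample per stratum for each parameter.
    dists maps a parameter name to its inverse CDF (quantile function)."""
    sample = {}
    for name, ppf in dists.items():
        # one uniform draw per stratum [k/n, (k+1)/n), in shuffled order
        u = (rng.permutation(n) + rng.uniform(0.0, 1.0, n)) / n
        sample[name] = ppf(u)
    return sample

# Illustrative distributions only - NOT the assessed WIPP values
dists = {
    "anhydrite_log10_perm": lambda u: -21.0 + 3.0 * u,                 # U[-21, -18]
    "initial_liquid_sat": lambda u: 0.6 * u,                           # U[0, 0.6]
    "gas_gen_rate": lambda u: np.exp(np.log(1e-9) + u * np.log(1e3)),  # log-uniform
}
sample = latin_hypercube(100, dists)
```

Each parameter's 100 values cover every probability stratum exactly once, which is what lets a modest number of model runs support the sensitivity rankings quoted above.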

  7. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. 
Monte Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data. Performance of the statistical model is illustrated through comparisons of generated realizations with the 'true' numerical simulations. Finally, we demonstrate how these realizations can be used to determine statistically optimal locations for further interrogation of the subsurface.

  8. Development and application of a standardized flow measurement uncertainty analysis framework to various low-head short-converging intake types across the United States federal hydropower fleet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Brennan T

    2015-01-01

    Turbine discharges at low-head short converging intakes are difficult to measure accurately. The proximity of the measurement section to the intake entrance admits large uncertainties related to asymmetry of the velocity profile, swirl, and turbulence. Existing turbine performance codes [10, 24] do not address this special case, and the published literature is largely silent on rigorous evaluation of uncertainties associated with this measurement context. The American Society of Mechanical Engineers (ASME) Committee investigated the use of Acoustic transit time (ATT), Acoustic scintillation (AS), and Current meter (CM) methods in a short converging intake at the Kootenay Canal Generating Station in 2009. Based on their findings, a standardized uncertainty analysis (UA) framework for the velocity-area method (specifically for CM measurements) is presented in this paper, given that CM is still the most fundamental and common type of measurement system. Typical sources of systematic and random error associated with CM measurements are investigated, and the major sources of uncertainty, namely those associated with turbulence and velocity fluctuations, the numerical velocity integration technique (bi-cubic spline), and the number and placement of current meters, are considered for evaluation. Since velocity measurements in a short converging intake are associated with complex nonlinear and time-varying uncertainties (e.g., Reynolds stress in fluid dynamics), simply applying the law of propagation of uncertainty is known to overestimate the measurement variance, while the Monte Carlo method does not. Therefore, a pseudo-Monte Carlo simulation method (the random flow generation technique [8]), initially developed for establishing upstream or initial conditions in Large-Eddy Simulation (LES) and Direct Numerical Simulation (DNS), is used to statistically determine the uncertainties associated with turbulence and velocity fluctuations.
This technique is then combined with a bi-cubic spline interpolation method which converts point velocities into a continuous velocity distribution over the measurement domain. Subsequently, the number and placement of current meters are simulated to investigate the accuracy of the estimated flow rates using the numerical velocity-area integration method outlined in ISO 3354 [12]. The authors consider the statistics of the flow rates generated through bi-cubic interpolation and sensor simulation to be combined uncertainties that already account for the effects of all three uncertainty sources. A preliminary analysis based on current meter data obtained through an upgrade acceptance test of a single unit located in a mainstem plant is presented.
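The workflow this abstract describes, perturbing point velocities with random turbulent fluctuations, integrating velocity over the section, and taking statistics of the resulting flow rates, can be sketched as follows. The grid of mean velocities, the section dimensions, and the 5% turbulence intensity are invented for illustration, and a 2-D trapezoidal rule stands in for the bi-cubic spline integration of ISO 3354:

```python
import random
import statistics

# Hypothetical 4x4 grid of mean point velocities (m/s) over a
# rectangular measurement section; all values are illustrative.
mean_v = [[1.10, 1.25, 1.24, 1.08],
          [1.15, 1.32, 1.30, 1.12],
          [1.14, 1.31, 1.29, 1.11],
          [1.05, 1.20, 1.19, 1.02]]
width, height = 8.0, 6.0   # section dimensions (m), assumed
turb_intensity = 0.05      # 5% rms turbulent fluctuation, assumed

def integrate(grid):
    """Velocity-area integration by a 2-D trapezoidal rule
    (a simple stand-in for the bi-cubic spline of ISO 3354)."""
    ny, nx = len(grid), len(grid[0])
    dy, dx = height / (ny - 1), width / (nx - 1)
    q = 0.0
    for j in range(ny - 1):
        for i in range(nx - 1):
            cell = (grid[j][i] + grid[j][i + 1] +
                    grid[j + 1][i] + grid[j + 1][i + 1]) / 4.0
            q += cell * dx * dy
    return q

# Pseudo-Monte Carlo: perturb each point velocity with a random
# fluctuation and collect flow-rate statistics over many draws.
random.seed(1)
flows = []
for _ in range(2000):
    sample = [[v * (1.0 + random.gauss(0.0, turb_intensity))
               for v in row] for row in mean_v]
    flows.append(integrate(sample))

q_mean = statistics.mean(flows)
q_std = statistics.stdev(flows)
print(f"Q = {q_mean:.2f} m^3/s, combined standard uncertainty "
      f"= {100 * q_std / q_mean:.2f} %")
```

Because the point fluctuations are partly averaged out by the spatial integration, the relative uncertainty of the integrated flow rate comes out well below the 5% point-velocity turbulence intensity.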

  9. Probabilistic projections of 21st century climate change over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.

    2013-12-01

We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as those found when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.

  10. Probabilistic projections of 21st century climate change over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang

    2013-12-01

    We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.

  11. Stochastic modelling of basal temperatures in divide regions of the Antarctic ice sheet over the last 1.5 million years

    NASA Astrophysics Data System (ADS)

    Van Liefferinge, Brice; Pattyn, Frank; Cavitte, Marie G. P.; Young, Duncan A.; Roberts, Jason L.

    2017-04-01

The quest for the oldest ice in Antarctica has recently been launched through an EU H2020 project (Beyond EPICA - Oldest Ice) and aims at identifying suitable areas for a potential future drilling. Retrieving an ice core of such age is essential to understand the relation between orbital changes and atmospheric composition during the mid-Pleistocene transition. However, sites with a potentially undisturbed record of 1.5-million-year-old ice in Antarctica are difficult to find and require slow-moving ice (preferably an ice divide) and basal conditions that are not disturbed by large topographic variations. Furthermore, the ice should be sufficiently thick, but cold basal conditions should still prevail, since basal melting would destroy the bottom layers. Therefore, ice-flow conditions and thermodynamic characteristics are crucial for identifying potential locations of undisturbed ice. Van Liefferinge and Pattyn (2013) identified suitable areas based on a pan-Antarctic simplified thermodynamic ice sheet model and demonstrated that uncertainty in geothermal conditions remains a major unknown. In order to refine these estimates, and provide uncertainties, we employ a full thermo-mechanically coupled higher-order ice sheet model (Pattyn, 2003; Pattyn et al., 2004). Initial conditions for the calculations are based on an inversion of basal slipperiness from observed surface topography (Pollard and DeConto, 2012; Pattyn, in prep.). Uncertainties in geothermal conditions are introduced using the convolution of two Gaussian probability density functions: (a) the reconstruction of Antarctic ice sheet geometry and ice thickness variability over the last 2 million years (Pollard and DeConto, 2009) and (b) the surface temperature reconstruction over the same period (Snyder et al., 2016). The standard deviation, skewness and kurtosis over the whole Antarctic ice sheet are analyzed to assess probable basal melt conditions.
Finally, we focus on model results in the divide area between Dome Concordia and Dome Fuji, and compare to newly acquired radar data in the region (OIA survey).

  12. Model Projections and their Uncertainties of Future Intensity Change of Typhoon Haiyan (2013)

    NASA Astrophysics Data System (ADS)

    Yoshino, J.; Toyoda, M.; Shinohara, K.; Kobayashi, T.

    2017-12-01

The IPCC Fifth Assessment Report indicated that the global mean maximum wind speed and precipitation of tropical cyclones (TCs) are likely to increase by the end of the 21st century. However, the specific characteristics of future changes are not yet well quantified, and there are high uncertainties in region-specific projections. Such uncertainties in future projections may be attributed to the uncertainties of general circulation models (GCMs) and global warming scenarios (GWSs). In order to quantify the uncertainties of future changes of TC intensity among 15 GCMs and 9 GWSs, a present climate experiment (PCE), future climate experiments (FCEs) and sensitivity experiments on TC intensity are carried out in this study using a high-resolution typhoon model, coupled with sea spray, dissipative heating and ocean mixed layer parameterizations and automatic TC-tracking moving nests. The initial and boundary conditions for the FCEs are produced by a pseudo-global-warming downscaling technique. Typhoon Haiyan (2013) is selected as the worst case of a TC under the present climate. Using the high-resolution typhoon model with a grid spacing of 3 km, the PCE reproduces a peak intensity (minimum central pressure) of about 897.1 hPa, which is in close agreement with the best-track data (895 hPa). Comparing the 9 FCEs driven by the 9 GWSs (with the GCM fixed to HadCM3) and the 15 FCEs driven by the 15 GCMs (with the GWS fixed to the 2090s of SRES A1B), the TC intensity is slightly weakened, by +7.9 hPa across the GCMs and +3.7 hPa across the GWSs. The standard deviations of the future changes of the peak intensity are 9.47 hPa for the GCMs and 5.89 hPa for the GWSs. Thus, the uncertainty of future changes of TC intensity among GCMs is approximately two times larger than that among GWSs.
Sensitivity experiments, in which each of the global warming differences derived from the GCMs is separately added to the initial and boundary conditions of the PCE, suggest that the future changes of sea surface temperature in the GCMs are responsible for intensifying Haiyan by -19.6 hPa, while the future changes of air temperature are responsible for weakening it by +45.5 hPa. Furthermore, the future changes of air temperature and wind speed in the GCMs especially reduce the reliability of the future projections, with standard deviations of 7.82 hPa and 9.04 hPa, respectively.

  13. Multiscale Informatics for Low-Temperature Propane Oxidation: Further Complexities in Studies of Complex Reactions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, Michael P.; Goldsmith, C. Franklin; Klippenstein, Stephen J.

    2015-07-16

We have developed a multi-scale approach (Burke, M. P.; Klippenstein, S. J.; Harding, L. B. Proc. Combust. Inst. 2013, 34, 547–555.) to kinetic model formulation that directly incorporates elementary kinetic theories as a means to provide reliable, physics-based extrapolation to unexplored conditions. Here, we extend and generalize the multi-scale modeling strategy to treat systems of considerable complexity, involving multi-well reactions, potentially missing reactions, non-statistical product branching ratios, and non-Boltzmann (i.e. non-thermal) reactant distributions. The methodology is demonstrated here for a subsystem of low-temperature propane oxidation, as a representative system for low-temperature fuel oxidation. A multi-scale model is assembled and informed by a wide variety of targets that include ab initio calculations of molecular properties, rate constant measurements of isolated reactions, and complex systems measurements. Active model parameters are chosen to accommodate both “parametric” and “structural” uncertainties. Theoretical parameters (e.g. barrier heights) are included as active model parameters to account for parametric uncertainties in the theoretical treatment; experimental parameters (e.g. initial temperatures) are included to account for parametric uncertainties in the physical models of the experiments. RMG software is used to assess potential structural uncertainties due to missing reactions. Additionally, branching ratios among product channels are included as active model parameters to account for structural uncertainties related to difficulties in modeling sequences of multiple chemically activated steps. The approach is demonstrated here for interpreting time-resolved measurements of OH, HO2, n-propyl, i-propyl, propene, oxetane, and methyloxirane from photolysis-initiated low-temperature oxidation of propane at pressures from 4 to 60 Torr and temperatures from 300 to 700 K.
In particular, the multi-scale informed model provides a consistent quantitative explanation of both ab initio calculations and time-resolved species measurements. The present results show that interpretations of OH measurements are significantly more complicated than previously thought: in addition to barrier heights for key transition states considered previously, OH profiles also depend on additional theoretical parameters for R + O2 reactions, secondary reactions, QOOH + O2 reactions, and treatment of non-Boltzmann reaction sequences. Extraction of physically rigorous information from those measurements may require more sophisticated treatment of all of those model aspects, as well as additional experimental data under more conditions, to discriminate among possible interpretations and ensure model reliability. Keywords: Optimization, Uncertainty quantification, Chemical mechanism, Low-Temperature Oxidation, Non-Boltzmann

  14. Cycle-expansion method for the Lyapunov exponent, susceptibility, and higher moments.

    PubMed

    Charbonneau, Patrick; Li, Yue Cathy; Pfister, Henry D; Yaida, Sho

    2017-09-01

    Lyapunov exponents characterize the chaotic nature of dynamical systems by quantifying the growth rate of uncertainty associated with the imperfect measurement of initial conditions. Finite-time estimates of the exponent, however, experience fluctuations due to both the initial condition and the stochastic nature of the dynamical path. The scale of these fluctuations is governed by the Lyapunov susceptibility, the finiteness of which typically provides a sufficient condition for the law of large numbers to apply. Here, we obtain a formally exact expression for this susceptibility in terms of the Ruelle dynamical ζ function for one-dimensional systems. We further show that, for systems governed by sequences of random matrices, the cycle expansion of the ζ function enables systematic computations of the Lyapunov susceptibility and of its higher-moment generalizations. The method is here applied to a class of dynamical models that maps to static disordered spin chains with interactions stretching over a varying distance and is tested against Monte Carlo simulations.
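The finite-time fluctuations this abstract describes can be illustrated numerically for a product of random matrices: estimate the exponent from the log-growth of a repeatedly renormalized vector, then look at the spread of that estimate across realizations. The 2x2 random-matrix ensemble below is an invented example, not the disordered-spin-chain model of the paper:

```python
import math
import random

def matvec(m, v):
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

def finite_time_lyapunov(t_steps, rng):
    """Finite-time Lyapunov exponent of a product of random 2x2
    matrices, from the growth rate of a renormalized vector."""
    v = (1.0, 0.0)
    log_growth = 0.0
    for _ in range(t_steps):
        # Illustrative random-matrix ensemble (entries are assumptions).
        m = [[1.0, rng.uniform(-1.0, 1.0)],
             [rng.uniform(-1.0, 1.0), 1.0]]
        v = matvec(m, v)
        norm = math.hypot(v[0], v[1])
        log_growth += math.log(norm)
        v = (v[0] / norm, v[1] / norm)  # renormalize to avoid overflow
    return log_growth / t_steps

rng = random.Random(0)
samples = [finite_time_lyapunov(500, rng) for _ in range(200)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / (len(samples) - 1)

# The variance of the finite-time estimate decays like chi / t, where
# chi plays the role of the Lyapunov susceptibility in the abstract.
print(f"lambda ~ {mean:.3f}, susceptibility estimate ~ {500 * var:.3f}")
```

The cycle-expansion method of the paper computes such quantities analytically from the dynamical zeta function; the brute-force sampling above is only the Monte Carlo comparison point.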

  15. Vadose zone transport field study: Detailed test plan for simulated leak tests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    AL Ward; GW Gee

    2000-06-23

The US Department of Energy (DOE) Groundwater/Vadose Zone Integration Project Science and Technology initiative was created in FY 1999 to reduce the uncertainty associated with vadose zone transport processes beneath waste sites at DOE's Hanford Site near Richland, Washington. This information is needed not only to evaluate the risks from transport, but also to support the adoption of measures for minimizing impacts to the groundwater and surrounding environment. The principal uncertainties in vadose zone transport are the current distribution of source contaminants and the natural heterogeneity of the soil in which the contaminants reside. Oversimplified conceptual models resulting from these uncertainties, together with limited use of hydrologic characterization and monitoring technologies, have hampered the understanding of contaminant migration through Hanford's vadose zone. Essential prerequisites for reducing vadose transport uncertainty include the development of accurate conceptual models and the development or adoption of monitoring techniques capable of delineating the current distributions of source contaminants and characterizing natural site heterogeneity. The Vadose Zone Transport Field Study (VZTFS) was conceived as part of the initiative to address the major uncertainties confronting vadose zone fate and transport predictions at the Hanford Site and to overcome the limitations of previous characterization attempts. Pacific Northwest National Laboratory (PNNL) is managing the VZTFS for DOE. The VZTFS will conduct field investigations that will improve the understanding of field-scale transport and lead to the development or identification of efficient and cost-effective characterization methods. Ideally, these methods will capture the extent of contaminant plumes using existing infrastructure (i.e., more than 1,300 steel-cased boreholes).
The objectives of the VZTFS are to conduct controlled transport experiments at well-instrumented field sites at Hanford to: identify mechanisms controlling transport processes in soils typical of the hydrogeologic conditions of Hanford's waste disposal sites; reduce uncertainty in conceptual models; develop a detailed and accurate database of hydraulic and transport parameters for validation of three-dimensional numerical models; and identify and evaluate advanced, cost-effective characterization methods with the potential to assess changing conditions in the vadose zone, particularly as surrogates of currently undetectable high-risk contaminants. This plan provides details for conducting field tests during FY 2000 to accomplish these objectives. Details of additional testing during FY 2001 and FY 2002 will be developed as part of the work planning process implemented by the Integration Project.

  16. Operational Risk Management is Ineffective at Addressing Nonlinear Problems

    DTIC Science & Technology

    2009-02-20

    brains are not linear: even though the sound of an oboe and the sound of a string section may be independent when they enter your ear, the emotional...impact of both sounds together may be very much greater than either one alone. (This is what keeps symphony orchestras in business .) Nor is the...involving people. “In nonlinear systems... chaos theory tells you that the slightest uncertainty in your knowledge of the initial conditions will often

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Nicholas; Burns, Joseph R.

The aftermath of the Tōhoku earthquake and the Fukushima accident has led to a global push to improve the safety of existing light water reactors. A key component of this initiative is the development of nuclear fuel and cladding materials with potentially enhanced accident tolerance, also known as accident-tolerant fuels (ATF). These materials are intended to improve core fuel and cladding integrity under beyond-design-basis accident conditions while maintaining or enhancing reactor performance and safety characteristics during normal operation. To complement research that has already been carried out to characterize ATF neutronics, the present study provides an initial investigation of the sensitivity and uncertainty of ATF system responses to nuclear cross section data. ATF concepts incorporate novel materials, including SiC and FeCrAl cladding and high-density uranium silicide composite fuels, in turn introducing new cross section sensitivities and uncertainties which may behave differently from traditional fuel and cladding materials. In this paper, we conducted sensitivity and uncertainty analysis using the TSUNAMI-2D sequence of SCALE with infinite lattice models of ATF assemblies. Of all the ATF materials considered, radiative capture in 56Fe in FeCrAl cladding is found to be the most significant contributor to eigenvalue uncertainty; this is by far the largest ATF-specific uncertainty found in these cases, exceeding even those of uranium. We found that while significant new sensitivities indeed arise, the general sensitivity behavior of ATF assemblies does not markedly differ from traditional UO2/zirconium-based fuel/cladding systems, especially with regard to uncertainties associated with uranium. We assessed the similarity of the IPEN/MB-01 reactor benchmark model to application models with FeCrAl cladding.
We used TSUNAMI-IP to calculate similarity indices between the application model and the IPEN/MB-01 reactor benchmark model. This benchmark was selected for its use of SS304 as a cladding and structural material, with significant 56Fe content. The similarity indices suggest that while many differences in reactor physics arise from differences in design, the sensitivity to and behavior of 56Fe absorption is comparable between the systems, indicating the potential for this benchmark to reduce uncertainties in 56Fe radiative capture cross sections.

  18. Assessing the inherent uncertainty of one-dimensional diffusions

    NASA Astrophysics Data System (ADS)

    Eliazar, Iddo; Cohen, Morrel H.

    2013-01-01

    In this paper we assess the inherent uncertainty of one-dimensional diffusion processes via a stochasticity classification which provides an à la Mandelbrot categorization into five states of uncertainty: infra-mild, mild, borderline, wild, and ultra-wild. Two settings are considered. (i) Stopped diffusions: the diffusion initiates from a high level and is stopped once it first reaches a low level; in this setting we analyze the inherent uncertainty of the diffusion's maximal exceedance above its initial high level. (ii) Stationary diffusions: the diffusion is in dynamical statistical equilibrium; in this setting we analyze the inherent uncertainty of the diffusion's equilibrium level. In both settings general closed-form analytic results are established, and their application is exemplified by stock prices in the stopped-diffusions setting, and by interest rates in the stationary-diffusions setting. These results provide a highly implementable decision-making tool for the classification of uncertainty in the context of one-dimensional diffusions.

  19. Probabilistic accounting of uncertainty in forecasts of species distributions under climate change

    USGS Publications Warehouse

    Wenger, Seth J.; Som, Nicholas A.; Dauwalter, Daniel C.; Isaak, Daniel J.; Neville, Helen M.; Luce, Charles H.; Dunham, Jason B.; Young, Michael K.; Fausch, Kurt D.; Rieman, Bruce E.

    2013-01-01

    Forecasts of species distributions under future climates are inherently uncertain, but there have been few attempts to describe this uncertainty comprehensively in a probabilistic manner. We developed a Monte Carlo approach that accounts for uncertainty within generalized linear regression models (parameter uncertainty and residual error), uncertainty among competing models (model uncertainty), and uncertainty in future climate conditions (climate uncertainty) to produce site-specific frequency distributions of occurrence probabilities across a species’ range. We illustrated the method by forecasting suitable habitat for bull trout (Salvelinus confluentus) in the Interior Columbia River Basin, USA, under recent and projected 2040s and 2080s climate conditions. The 95% interval of total suitable habitat under recent conditions was estimated at 30.1–42.5 thousand km; this was predicted to decline to 0.5–7.9 thousand km by the 2080s. Projections for the 2080s showed that the great majority of stream segments would be unsuitable with high certainty, regardless of the climate data set or bull trout model employed. The largest contributor to uncertainty in total suitable habitat was climate uncertainty, followed by parameter uncertainty and model uncertainty. Our approach makes it possible to calculate a full distribution of possible outcomes for a species, and permits ready graphical display of uncertainty for individual locations and of total habitat.
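The nested Monte Carlo scheme this abstract describes (draw a model, draw its parameters, draw a climate, then compute an occurrence probability) can be sketched with a toy logistic habitat model. The two competing models, their coefficients, standard errors, weights, and the candidate site temperatures are all invented for illustration, not taken from the bull trout analysis:

```python
import math
import random

random.seed(42)

# Hypothetical competing logistic models of habitat suitability as a
# function of summer temperature; coefficients, standard errors (se),
# and model weights are illustrative stand-ins.
models = [
    {"b0": 6.0, "b1": -0.5, "se": (0.8, 0.06), "weight": 0.6},
    {"b0": 5.0, "b1": -0.4, "se": (0.9, 0.05), "weight": 0.4},
]
climate_scenarios = [14.0, 15.5, 17.0]  # projected site temperatures (C)

def draw_probability():
    # Model uncertainty: pick a competing model by its weight.
    r, acc = random.random(), 0.0
    for m in models:
        acc += m["weight"]
        if r <= acc:
            break
    # Parameter uncertainty: perturb coefficients by their SEs.
    b0 = random.gauss(m["b0"], m["se"][0])
    b1 = random.gauss(m["b1"], m["se"][1])
    # Climate uncertainty: pick one projected climate at random.
    temp = random.choice(climate_scenarios)
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * temp)))

# Site-specific frequency distribution of occurrence probabilities.
draws = sorted(draw_probability() for _ in range(5000))
lo, hi = draws[len(draws) // 40], draws[-(len(draws) // 40)]
print(f"occurrence probability 95% interval: [{lo:.2f}, {hi:.2f}]")
```

Repeating this per stream segment and summing suitable lengths across draws yields the kind of range-wide interval (e.g. the 30.1–42.5 thousand km figure) reported in the abstract.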

  20. Tracing the source of numerical climate model uncertainties in precipitation simulations using a feature-oriented statistical model

    NASA Astrophysics Data System (ADS)

    Xu, Y.; Jones, A. D.; Rhoades, A.

    2017-12-01

Precipitation is a key component of the hydrologic cycle, and changing precipitation regimes contribute to more intense and frequent drought and flood events around the world. Numerical climate modeling is a powerful tool to study climatology and to predict future changes. Despite continuous improvement in numerical models, long-term precipitation prediction remains a challenge, especially at regional scales. To improve numerical simulations of precipitation, it is important to find out where the uncertainty in precipitation simulations comes from. There are two types of uncertainty in numerical model predictions. One is related to uncertainty in the input data, such as the model's boundary and initial conditions. These uncertainties would propagate to the final model outcomes even if the numerical model exactly replicated the true world. But a numerical model cannot exactly replicate the true world. Therefore, the other type of model uncertainty is related to errors in the model physics, such as the parameterization of sub-grid scale processes, i.e., given precise input conditions, how much error could be generated by the imprecise model. Here, we build two statistical models based on a neural network algorithm to predict long-term variation of precipitation over California: one uses "true world" information derived from observations, and the other uses "modeled world" information using model inputs and outputs from the North American Coordinated Regional Downscaling Experiment (NA-CORDEX). We derive multiple climate feature metrics as predictors for the statistical model to represent the impact of global climate on local hydrology, and include topography as a predictor to represent the local control. We first compare the predictors between the true world and the modeled world to determine the errors contained in the input data.
By perturbing the predictors in the statistical model, we estimate how much uncertainty in the model's final outcomes is accounted for by each predictor. By comparing the statistical models derived from true-world and modeled-world information, we assess the errors lying in the physics of the numerical models. This work provides unique insight into the performance of numerical climate models and can be used to guide improvement of precipitation prediction.
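The predictor-perturbation step described above can be sketched as a one-at-a-time sensitivity analysis: perturb each input of a trained statistical model while holding the others fixed, and attribute output variance to that predictor. The linear toy model and the three predictor names below are invented stand-ins for the paper's neural network and climate-feature metrics:

```python
import random

# Toy stand-in for a trained statistical precipitation model: a fixed
# linear function of three hypothetical climate-feature predictors.
def model(jet_lat, sst_index, elevation):
    return 2.0 * jet_lat - 1.5 * sst_index + 0.8 * elevation

base = {"jet_lat": 1.0, "sst_index": 0.5, "elevation": 2.0}

random.seed(3)
contributions = {}
for name in base:
    diffs = []
    for _ in range(1000):
        p = dict(base)
        p[name] += random.gauss(0.0, 0.1)  # perturb one predictor only
        diffs.append(model(**p) - model(**base))
    # Output variance attributable to this predictor's uncertainty.
    contributions[name] = sum(d * d for d in diffs) / len(diffs)

for name, var in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"{name}: output variance ~ {var:.4f}")
```

With equal perturbation scales, the ranking simply reflects each predictor's sensitivity coefficient; in the paper's setting the perturbation scales would instead be set by the input-data errors estimated from the true-world/modeled-world comparison.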

  1. The Pliocene Model Intercomparison Project - Phase 2

    NASA Astrophysics Data System (ADS)

    Haywood, Alan; Dowsett, Harry; Dolan, Aisling; Rowley, David; Abe-Ouchi, Ayako; Otto-Bliesner, Bette; Chandler, Mark; Hunter, Stephen; Lunt, Daniel; Pound, Matthew; Salzmann, Ulrich

    2016-04-01

    The Pliocene Model Intercomparison Project (PlioMIP) is a co-ordinated international climate modelling initiative to study and understand climate and environments of the Late Pliocene, and their potential relevance in the context of future climate change. PlioMIP examines the consistency of model predictions in simulating Pliocene climate, and their ability to reproduce climate signals preserved by geological climate archives. Here we provide a description of the aim and objectives of the next phase of the model intercomparison project (PlioMIP Phase 2), and we present the experimental design and boundary conditions that will be utilised for climate model experiments in Phase 2. Following on from PlioMIP Phase 1, Phase 2 will continue to be a mechanism for sampling structural uncertainty within climate models. However, Phase 1 demonstrated the requirement to better understand boundary condition uncertainties as well as uncertainty in the methodologies used for data-model comparison. Therefore, our strategy for Phase 2 is to utilise state-of-the-art boundary conditions that have emerged over the last 5 years. These include a new palaeogeographic reconstruction, detailing ocean bathymetry and land/ice surface topography. The ice surface topography is built upon the lessons learned from offline ice sheet modelling studies. Land surface cover has been enhanced by recent additions of Pliocene soils and lakes. Atmospheric reconstructions of palaeo-CO2 are emerging on orbital timescales and these are also incorporated into PlioMIP Phase 2. New records of surface and sea surface temperature change are being produced that will be more temporally consistent with the boundary conditions and forcings used within models. Finally we have designed a suite of prioritized experiments that tackle issues surrounding the basic understanding of the Pliocene and its relevance in the context of future climate change in a discrete way.

  2. An Objective Approach to Select Climate Scenarios when Projecting Species Distribution under Climate Change

    PubMed Central

    Casajus, Nicolas; Périé, Catherine; Logan, Travis; Lambert, Marie-Claude; de Blois, Sylvie; Berteaux, Dominique

    2016-01-01

    An impressive number of new climate change scenarios have recently become available to assess the ecological impacts of climate change. Among these impacts, shifts in species range analyzed with species distribution models are the most widely studied. Whereas it is widely recognized that the uncertainty in future climatic conditions must be taken into account in impact studies, many assessments of species range shifts still rely on just a few climate change scenarios, often selected arbitrarily. We describe a method to select objectively a subset of climate change scenarios among a large ensemble of available ones. Our k-means clustering approach reduces the number of climate change scenarios needed to project species distributions, while retaining the coverage of uncertainty in future climate conditions. We first show, for three biologically-relevant climatic variables, that a reduced number of six climate change scenarios generates average climatic conditions very close to those obtained from a set of 27 scenarios available before reduction. A case study on potential gains and losses of habitat by three northeastern American tree species shows that potential future species distributions projected from the selected six climate change scenarios are very similar to those obtained from the full set of 27, although with some spatial discrepancies at the edges of species distributions. In contrast, projections based on just a few climate models vary strongly according to the initial choice of climate models. We give clear guidance on how to reduce the number of climate change scenarios while retaining the central tendencies and coverage of uncertainty in future climatic conditions. This should be particularly useful during future climate change impact studies as more than twice as many climate models were reported in the fifth assessment report of IPCC compared to the previous one. PMID:27015274
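The scenario-reduction idea described above can be sketched with a plain k-means pass over the scenario ensemble, keeping the member closest to each centroid as the representative scenario. The 27 synthetic (temperature change, precipitation change) pairs and the ad hoc variable scaling below are assumptions for illustration, not the study's data:

```python
import random

random.seed(7)

# Hypothetical ensemble of 27 climate change scenarios, each summarized
# by a (temperature change in C, precipitation change in %) pair;
# values are randomly generated for illustration only.
scenarios = [(random.uniform(1.0, 5.0), random.uniform(-10.0, 20.0))
             for _ in range(27)]

def dist2(p, c):
    # Squared distance with precipitation scaled by 10 so that both
    # variables contribute comparably (an ad hoc normalization).
    return (p[0] - c[0]) ** 2 + ((p[1] - c[1]) / 10.0) ** 2

def kmeans_representatives(points, k, iters=50):
    """Plain k-means; returns the ensemble member closest to each
    final centroid as the cluster's representative scenario."""
    centroids = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[j].append(p)
        centroids = [(sum(p[0] for p in c) / len(c),
                      sum(p[1] for p in c) / len(c)) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return [min(points, key=lambda p: dist2(p, c)) for c in centroids]

representatives = kmeans_representatives(scenarios, k=6)

# The reduced subset should preserve the ensemble's central tendency.
full_mean = sum(p[0] for p in scenarios) / len(scenarios)
sub_mean = sum(p[0] for p in representatives) / len(representatives)
print(f"mean dT: full ensemble {full_mean:.2f} C, 6 scenarios {sub_mean:.2f} C")
```

Selecting cluster representatives rather than centroids keeps each retained scenario physically consistent, since it is an actual model run rather than an average of several.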

  3. An Objective Approach to Select Climate Scenarios when Projecting Species Distribution under Climate Change.

    PubMed

    Casajus, Nicolas; Périé, Catherine; Logan, Travis; Lambert, Marie-Claude; de Blois, Sylvie; Berteaux, Dominique

    2016-01-01

    An impressive number of new climate change scenarios have recently become available to assess the ecological impacts of climate change. Among these impacts, shifts in species range analyzed with species distribution models are the most widely studied. Whereas it is widely recognized that the uncertainty in future climatic conditions must be taken into account in impact studies, many assessments of species range shifts still rely on just a few climate change scenarios, often selected arbitrarily. We describe a method to select objectively a subset of climate change scenarios among a large ensemble of available ones. Our k-means clustering approach reduces the number of climate change scenarios needed to project species distributions, while retaining the coverage of uncertainty in future climate conditions. We first show, for three biologically-relevant climatic variables, that a reduced number of six climate change scenarios generates average climatic conditions very close to those obtained from a set of 27 scenarios available before reduction. A case study on potential gains and losses of habitat by three northeastern American tree species shows that potential future species distributions projected from the selected six climate change scenarios are very similar to those obtained from the full set of 27, although with some spatial discrepancies at the edges of species distributions. In contrast, projections based on just a few climate models vary strongly according to the initial choice of climate models. We give clear guidance on how to reduce the number of climate change scenarios while retaining the central tendencies and coverage of uncertainty in future climatic conditions. This should be particularly useful during future climate change impact studies as more than twice as many climate models were reported in the fifth assessment report of IPCC compared to the previous one.

  4. Impact of inherent meteorology uncertainty on air quality ...

    EPA Pesticide Factsheets

    It is well established that there are a number of different classifications and sources of uncertainty in environmental modeling systems. Air quality models rely on two key inputs, namely meteorology and emissions. When using air quality models for decision making, it is important to understand how uncertainties in these inputs affect the simulated concentrations. Ensembles are one method to explore how uncertainty in meteorology affects air pollution concentrations. Most studies explore this uncertainty by running different meteorological models, or the same model with different physics options, and in some cases combinations of different meteorological and air quality models. While these have been shown to be useful techniques in some cases, we present a technique that leverages the initial condition perturbations of a weather forecast ensemble, namely the Short-Range Ensemble Forecast system, to drive four-dimensional data assimilation in the Weather Research and Forecasting (WRF)-Community Multiscale Air Quality (CMAQ) model, with a key focus being the response of ozone chemistry and transport. Results confirm that a sizable spread in WRF solutions, including common weather variables of temperature, wind, boundary layer depth, clouds, and radiation, can cause a relatively large range of ozone-mixing ratios. Pollutant transport can be altered by hundreds of kilometers over several days. Ozone-mixing ratios of the ensemble can vary by as much as 10–20 ppb.

  5. Calibrating airborne measurements of airspeed, pressure and temperature using a Doppler laser air-motion sensor

    NASA Astrophysics Data System (ADS)

    Cooper, W. A.; Spuler, S. M.; Spowart, M.; Lenschow, D. H.; Friesen, R. B.

    2014-09-01

    A new laser air-motion sensor measures the true airspeed with a standard uncertainty of less than 0.1 m s-1 and so reduces uncertainty in the measured component of the relative wind along the longitudinal axis of the aircraft to about the same level. The calculated pressure expected from that airspeed at the inlet of a pitot tube then provides a basis for calibrating the measurements of dynamic and static pressure, reducing standard uncertainty in those measurements to less than 0.3 hPa and the precision applicable to steady flight conditions to about 0.1 hPa. These improved measurements of pressure, combined with high-resolution measurements of geometric altitude from the global positioning system, then indicate (via integrations of the hydrostatic equation during climbs and descents) that the offset and uncertainty in temperature measurement for one research aircraft are +0.3 ± 0.3 °C. For airspeed, pressure and temperature, these are significant reductions in uncertainty vs. those obtained from calibrations using standard techniques. Finally, it is shown that although the initial calibration of the measured static and dynamic pressures requires a measured temperature, once calibrated these measured pressures and the measurement of airspeed from the new laser air-motion sensor provide a measurement of temperature that does not depend on any other temperature sensor.
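
    The closing claim — that a calibrated airspeed plus the pressure pair determine temperature with no separate temperature sensor — follows from the standard subsonic compressible pitot relations. The sketch below illustrates that inversion; the constants and the round-trip values are illustrative, not the paper's calibration.

    ```python
    import math

    GAMMA = 1.4      # ratio of specific heats, dry air
    R_AIR = 287.05   # specific gas constant for dry air, J kg^-1 K^-1

    def mach_from_pressures(q_c, p_s):
        """Subsonic Mach number from impact pressure q_c and static pressure p_s."""
        return math.sqrt(5.0 * ((q_c / p_s + 1.0) ** (2.0 / 7.0) - 1.0))

    def temperature_from_tas(tas, q_c, p_s):
        """Air temperature (K) implied by true airspeed and the pressure pair.

        Inverts TAS = M * sqrt(GAMMA * R_AIR * T): once airspeed and the
        pressures are calibrated, T follows with no temperature sensor.
        """
        m = mach_from_pressures(q_c, p_s)
        return tas ** 2 / (GAMMA * R_AIR * m ** 2)

    # Round-trip check with self-consistent synthetic values (T = 250 K):
    T, p_s = 250.0, 550e2                        # K, Pa
    M = 0.631
    tas = M * math.sqrt(GAMMA * R_AIR * T)       # ~200 m/s
    q_c = p_s * ((1 + 0.2 * M ** 2) ** 3.5 - 1)  # consistent impact pressure
    print(round(temperature_from_tas(tas, q_c, p_s), 3))  # -> 250.0
    ```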

  6. Global assessment of water policy vulnerability under uncertainty in water scarcity projections

    NASA Astrophysics Data System (ADS)

    Greve, Peter; Kahil, Taher; Satoh, Yusuke; Burek, Peter; Fischer, Günther; Tramberend, Sylvia; Byers, Edward; Flörke, Martina; Eisner, Stephanie; Hanasaki, Naota; Langan, Simon; Wada, Yoshihide

    2017-04-01

    Water scarcity is a critical environmental issue worldwide, driven by the significant increase in water extractions during the last century. In the coming decades, climate change is projected to further exacerbate water scarcity conditions in many regions around the world. At present, one important question for policy debate is the identification of water policy interventions that could address the mounting water scarcity problems. Main interventions include investing in water storage infrastructure, water transfer canals, efficient irrigation systems, and desalination plants, among many others. These interventions involve long-term planning, long-lived investments, and some irreversibility in choices that can shape the development of countries for decades. Making decisions on these water infrastructures requires anticipating the long-term environmental conditions, needs, and constraints under which they will function. This brings large uncertainty into the decision-making process, for instance from demographic or economic projections. Today, climate change is adding another layer of uncertainty that makes decisions even more complex. In this study, we use a probabilistic approach to assess the uncertainty in global water scarcity projections following different socioeconomic pathways (SSPs) and climate scenarios (RCPs) within the first half of the 21st century. By utilizing an ensemble of 45 future water scarcity projections based on (i) three state-of-the-art global hydrological models (PCR-GLOBWB, H08, and WaterGAP), (ii) five climate models, and (iii) three water scenarios, we have assessed changes in water scarcity and the associated uncertainty distribution worldwide. The water scenarios used here were developed by IIASA's Water Futures and Solutions (WFaS) Initiative. 
The main objective of this study is to improve the contribution of hydro-climatic information to effective policymaking by identifying spatial and temporal policy vulnerabilities under large uncertainty about the future socio-economic and climatic changes and to guide policymakers in charting a more sustainable pathway and avoiding maladaptive development pathways. The results show that water scarcity is increasing in up to 83% of all land area under a high-emission scenario (RCP 6.0-SSP3). Importantly, the range of uncertainty in projected water scarcity is increasing; in some regions by several orders of magnitude (e.g. sub-Saharan Africa, eastern Europe, Central Asia). This is further illustrated by focusing on a set of large river basins that will be subject both to substantial changes in basin-wide water scarcity and to strong increases in the overall range of uncertainty (e.g. the Niger, Indus, Yangtze). These conditions pose a significant challenge for water management options in those vulnerable basins, complicating decisions on needed investments in water supply infrastructure and other system improvements, and leading to the degradation of valuable resources such as non-renewable groundwater resources and water-dependent ecosystems. The results of this study call for careful and deliberative design of water policy interventions under a wide range of socio-economic and climate conditions.

  7. Management of California Oak Woodlands: Uncertainties and Modeling

    Treesearch

    Jay E. Noel; Richard P. Thompson

    1995-01-01

    A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...

  8. Topics in General Relativity theory: Gravitational-wave measurements of black-hole parameters; gravitational collapse of a cylindrical body; and classical-particle evolution in the presence of closed, timelike curves

    NASA Astrophysics Data System (ADS)

    Echeverria, Fernando

    I study three different topics in general relativity. The first study investigates the accuracy with which the mass and angular momentum of a black hole can be determined by measurements of gravitational waves from the hole, using a gravitational-wave detector. The black hole is assumed to have been strongly perturbed and the detector measures the waves produced by its resulting vibration and ring-down. The uncertainties in the measured parameters arise from the noise present in the detector. It is found that the faster the hole rotates, the more accurate the measurements will be, with the uncertainty in the angular momentum decreasing rapidly with increasing rotation speed. The second study is an analysis of the gravitational collapse of an infinitely long, cylindrical dust shell, an idealization of more realistic, finite-length bodies. It is found that the collapse evolves into a naked singularity in finite time. Analytical expressions for the variables describing the collapse are found at late times, near the singularity. The collapse is also followed, with a numerical simulation, from the start until very close to the singularity. The singularity is found to be strong, in the sense that an observer riding on the shell will be infinitely stretched in one direction and infinitely compressed in another. The gravitational waves emitted from the collapse are also analyzed. The last study focuses on the consequences of the existence of closed timelike curves in a wormhole spacetime. One might expect that such curves might cause a system with apparently well-posed initial conditions to have no self-consistent evolution. We study the case of a classical particle with a hard-sphere potential, focusing attention on initial conditions for which the evolution, if followed naively, is self-inconsistent: the ball travels to the past through the wormhole, colliding with its younger self and preventing itself from entering the wormhole. 
We find, surprisingly, that for all such 'dangerous' initial conditions, there are an infinite number of self-consistent solutions. We also find that for many non-dangerous initial conditions, there exists an infinity of possible evolutions.

  9. Cued uncertainty modulates later recognition of emotional pictures: An ERP study.

    PubMed

    Lin, Huiyan; Xiang, Jing; Li, Saili; Liang, Jiafeng; Zhao, Dongmei; Yin, Desheng; Jin, Hua

    2017-06-01

    Previous studies have shown that uncertainty about the emotional content of an upcoming event modulates event-related potentials (ERPs) during the encoding of the event, and this modulation is affected by whether there are cues (i.e., cued uncertainty) or not (i.e., uncued uncertainty) prior to the encoding of the uncertain event. Recently, we showed that uncued uncertainty affected ERPs in later recognition of the emotional event. However, it is as yet unknown how the ERP effects of recognition are modulated by cued uncertainty. To address this issue, participants were asked to view emotional (negative and neutral) pictures that were presented after cues. The cues either indicated the emotional content of the pictures (the certain condition) or not (the cued uncertain condition). Subsequently, participants had to perform an unexpected old/new task in which old and novel pictures were shown without any cues. ERP data in the old/new task showed smaller P2 amplitudes for neutral pictures in the cued uncertain condition compared to the certain condition, but this uncertainty effect was not observed for negative pictures. Additionally, P3 amplitudes were generally enlarged for pictures in the cued uncertain condition. Taken together, the present findings indicate that cued uncertainty alters later recognition of emotional events in relation to feature processing and attention allocation. Copyright © 2017. Published by Elsevier B.V.

  10. Quantifying Information Gain from Dynamic Downscaling Experiments

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Peters-Lidard, C. D.

    2015-12-01

    Dynamic climate downscaling experiments are designed to produce information at higher spatial and temporal resolutions. Such additional information is generated from the low-resolution initial and boundary conditions via the predictive power of the physical laws. However, errors and uncertainties in the initial and boundary conditions can be propagated and even amplified in the downscaled simulations. Additionally, the limit of predictability in nonlinear dynamical systems will also damp the information gain, even if the initial and boundary conditions were error-free. Thus it is critical to quantitatively define and measure the amount of information gained from dynamic downscaling experiments, to better understand and appreciate their potential and limitations. We present a scheme to objectively measure the information gain from such experiments. The scheme is based on information theory, and we argue that if a downscaling experiment is to exhibit value, it has to produce more information than what can be simply inferred from information sources already available. These information sources include the initial and boundary conditions, the coarse-resolution model in which the higher-resolution models are embedded, and the same set of physical laws. These existing information sources define an "information threshold" as a function of the spatial and temporal resolution, and this threshold serves as a benchmark to quantify the information gain from the downscaling experiments, or any other approach. For a downscaling experiment to show any value, the information has to be above this threshold. A recent NASA-supported downscaling experiment is used as an example to illustrate the application of this scheme.

  11. Robust pre-specified time synchronization of chaotic systems by employing time-varying switching surfaces in the sliding mode control scheme

    NASA Astrophysics Data System (ADS)

    Khanzadeh, Alireza; Pourgholi, Mahdi

    2016-08-01

    In conventional chaos synchronization methods, the time at which two chaotic systems synchronize is usually unknown and depends on initial conditions. In this work, based on Lyapunov stability theory, a sliding mode controller with time-varying switching surfaces is proposed to achieve, for the first time, chaos synchronization at a pre-specified time. The proposed controller is able to synchronize chaotic systems precisely at any desired time. Moreover, by choosing the time-varying switching surfaces such that the reaching phase is eliminated, the synchronization becomes robust to uncertainties and exogenous disturbances. Simulation results are presented to show the effectiveness of the proposed method in stabilizing and synchronizing chaotic systems with complete robustness to uncertainty and disturbances exactly at a pre-specified time.

  12. Initial information prior to movement onset influences kinematics of upward arm pointing movements

    PubMed Central

    Pozzo, Thierry; White, Olivier

    2016-01-01

    To elaborate a motor plan and perform online control in the gravity field, the brain relies on priors and multisensory integration of information. In particular, afferent and efferent inputs related to the initial state are thought to convey sensorimotor information to plan the upcoming action. Yet it is still unclear to what extent these cues impact motor planning. Here we examined the role of initial information on the planning and execution of arm movements. Participants performed upward arm movements around the shoulder at three speeds and in two arm conditions. In the first condition, the arm was outstretched horizontally and required a significant muscular command to compensate for the gravitational shoulder torque before movement onset. In contrast, in the second condition the arm was passively maintained in the same position with a cushioned support and did not require any muscle contraction before movement execution. We quantified differences in motor performance by comparing shoulder velocity profiles. Previous studies showed that asymmetric velocity profiles reflect an optimal integration of the effects of gravity on upward movements. Consistent with this, we found decreased acceleration durations in both arm conditions. However, early differences in kinematic asymmetries and EMG patterns between the two conditions signaled a change of the motor plan. This different behavior carried on through trials when the arm was at rest before movement onset and may reveal a distinct motor strategy chosen in the context of uncertainty. Altogether, we suggest that the information available online must be complemented by accurate initial information. PMID:27486106

  13. Initial information prior to movement onset influences kinematics of upward arm pointing movements.

    PubMed

    Rousseau, Célia; Papaxanthis, Charalambos; Gaveau, Jérémie; Pozzo, Thierry; White, Olivier

    2016-10-01

    To elaborate a motor plan and perform online control in the gravity field, the brain relies on priors and multisensory integration of information. In particular, afferent and efferent inputs related to the initial state are thought to convey sensorimotor information to plan the upcoming action. Yet it is still unclear to what extent these cues impact motor planning. Here we examined the role of initial information on the planning and execution of arm movements. Participants performed upward arm movements around the shoulder at three speeds and in two arm conditions. In the first condition, the arm was outstretched horizontally and required a significant muscular command to compensate for the gravitational shoulder torque before movement onset. In contrast, in the second condition the arm was passively maintained in the same position with a cushioned support and did not require any muscle contraction before movement execution. We quantified differences in motor performance by comparing shoulder velocity profiles. Previous studies showed that asymmetric velocity profiles reflect an optimal integration of the effects of gravity on upward movements. Consistent with this, we found decreased acceleration durations in both arm conditions. However, early differences in kinematic asymmetries and EMG patterns between the two conditions signaled a change of the motor plan. This different behavior carried on through trials when the arm was at rest before movement onset and may reveal a distinct motor strategy chosen in the context of uncertainty. Altogether, we suggest that the information available online must be complemented by accurate initial information. Copyright © 2016 the American Physiological Society.

  14. Assessment of Uncertainty-Infused Scientific Argumentation

    ERIC Educational Resources Information Center

    Lee, Hee-Sun; Liu, Ou Lydia; Pallant, Amy; Roohr, Katrina Crotts; Pryputniewicz, Sarah; Buck, Zoë E.

    2014-01-01

    Though addressing sources of uncertainty is an important part of doing science, it has largely been neglected in assessing students' scientific argumentation. In this study, we initially defined a scientific argumentation construct in four structural elements consisting of claim, justification, uncertainty qualifier, and uncertainty…

  15. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  16. Assessing Degree of Susceptibility to Landslide Hazard

    NASA Astrophysics Data System (ADS)

    Sheridan, M. F.; Cordoba, G. A.; Delgado, H.; Stefanescu, R.

    2013-05-01

    The modeling of hazardous mass flows, both dry and water-saturated, is currently an area of active research, and several stable models have now emerged that have differing degrees of physical and mathematical fidelity. Models based on the early work of Savage and Hutter (1989) assume that very large dense granular flows can be modeled as incompressible continua governed by a Coulomb failure criterion. Based on this concept, Patra et al. (2005) developed a code for dry avalanches, which proposes a thin-layer mathematical model similar to the shallow-water equations. This concept was implemented in the widely used TITAN2D program, which integrates the shock-capturing Godunov solution methodology for the equation system. We propose a method, using the TITAN2D code, to assess the susceptibility of specific locations to landslides following heavy tephra fall. Successful application requires framing the range of several uncertainties in the selection of model input data: 1) initial conditions, such as the volume and location of origin of the landslide, 2) bed and internal friction parameters, and 3) digital elevation model (DEM) uncertainties. Among the possible ways of coping with these uncertainties, we chose to use Latin Hypercube Sampling (LHS). This statistical technique reduces a computationally intractable problem to such an extent that it is possible to apply it even with current personal computers. LHS requires that there is only one sample in each row and each column of the sampling matrix, where each (multi-dimensional) row corresponds to one uncertainty. LHS requires less than 10% of the sample runs needed by Monte Carlo approaches to achieve a stable solution. In our application, LHS output provides model sampling for 4 input parameters: initial random volumes, UTM location (x and y), and bed friction. We developed a simple Octave script to link the output of LHS with TITAN2D. 
In this way, TITAN2D can run several times with successively different initial conditions provided by the LHS routine. Finally, the set of results from TITAN2D is processed to obtain the distribution of maximum exceedance probability given that a landslide occurs at a place of interest. We apply this method to find sectors least prone to be affected by landslides in a region along the Panamerican Highway in the southern part of Colombia. The goal of such a study is to help decision makers improve their assessments regarding permissions for development along the highway.
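
    The LHS constraint described above (exactly one sample per stratum in every dimension) is easy to implement directly. The sketch below is in Python rather than Octave, and the parameter names and ranges are hypothetical stand-ins for the four TITAN2D inputs, not the study's actual values.

    ```python
    import numpy as np

    def latin_hypercube(n_samples, bounds, seed=0):
        """Latin hypercube sample: one sample per stratum in each dimension.

        bounds: dict of parameter name -> (low, high). Returns a dict of
        arrays, each of length n_samples.
        """
        rng = np.random.default_rng(seed)
        out = {}
        for name, (lo, hi) in bounds.items():
            # Stratify [0,1) into n_samples bins, jitter within each bin,
            # then shuffle so strata pair randomly across dimensions.
            u = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
            out[name] = lo + (hi - lo) * rng.permutation(u)
        return out

    # Hypothetical input ranges for the four landslide-model parameters.
    bounds = {
        "volume_m3":        (1e4, 1e6),   # initial landslide volume
        "utm_x":            (0.0, 5e3),   # source easting offset
        "utm_y":            (0.0, 5e3),   # source northing offset
        "bed_friction_deg": (10.0, 30.0),
    }
    sample = latin_hypercube(50, bounds)
    print({k: v.shape for k, v in sample.items()})
    ```

    Each of the 50 bins in every dimension receives exactly one sample, which is what lets LHS cover the input space with far fewer runs than plain Monte Carlo.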

  17. Uncertainty information in climate data records from Earth observation

    NASA Astrophysics Data System (ADS)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. 
The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the error distribution, and the propagation of the uncertainty to the geophysical variable in the CDR accounting for its error correlation properties. Uncertainty estimates can and should be validated as part of CDR validation when possible. These principles are quite general, but the approach to providing uncertainty information appropriate to different ECVs is varied, as confirmed by a brief review across different ECVs in the CCI. User requirements for uncertainty information can conflict with each other, and a variety of solutions and compromises are possible. The concept of an ensemble CDR as a simple means of communicating rigorous uncertainty information to users is discussed. Our review concludes by providing eight concrete recommendations for good practice in providing and communicating uncertainty in EO-based climate data records.

  18. The Initial Atmospheric Transport (IAT) Code: Description and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrow, Charles W.; Bartel, Timothy James

    The Initial Atmospheric Transport (IAT) computer code was developed at Sandia National Laboratories as part of their nuclear launch accident consequences analysis suite of computer codes. The purpose of IAT is to predict the initial puff/plume rise resulting from either a solid rocket propellant or liquid rocket fuel fire. The code generates initial conditions for subsequent atmospheric transport calculations. The IAT code has been compared to two data sets which are appropriate to the design space of space launch accident analyses. The primary model uncertainties are the entrainment coefficients for the extended Taylor model. The Titan 34D accident (1986) was used to calibrate these entrainment settings for a prototypic liquid propellant accident, while the recent Johns Hopkins University Applied Physics Laboratory (JHU/APL, or simply APL) large propellant block tests (2012) were used to calibrate the entrainment settings for prototypic solid propellant accidents. North American Meteorology (NAM) formatted weather data profiles are used by IAT to determine the local buoyancy force balance. The IAT comparisons for the APL solid propellant tests illustrate the sensitivity of the plume elevation to the weather profiles; that is, the weather profile is a dominant factor in determining the plume elevation. The IAT code performed remarkably well and is considered validated for neutral weather conditions.

  19. Uncertainty Evaluation of Residential Central Air-conditioning Test System

    NASA Astrophysics Data System (ADS)

    Li, Haoxue

    2018-04-01

    According to national standards, performance tests of air-conditioning units are required. However, test results can be influenced by the precision of the apparatus and by measurement errors. Therefore, an uncertainty evaluation of such tests should be conducted. In this paper, the uncertainties are calculated for the performance tests of a Xinfei 13.6 kW residential central air-conditioner. The evaluation shows that the test results are credible.

  20. Validation of aerosol optical depth uncertainties within the ESA Climate Change Initiative

    NASA Astrophysics Data System (ADS)

    Stebel, Kerstin; Povey, Adam; Popp, Thomas; Capelle, Virginie; Clarisse, Lieven; Heckel, Andreas; Kinne, Stefan; Klueser, Lars; Kolmonen, Pekka; de Leeuw, Gerrit; North, Peter R. J.; Pinnock, Simon; Sogacheva, Larisa; Thomas, Gareth; Vandenbussche, Sophie

    2017-04-01

    Uncertainty is a vital component of any climate data record as it provides the context with which to understand the quality of the data and compare it to other measurements. Therefore, pixel-level uncertainties are provided for all aerosol products that have been developed in the framework of the Aerosol_cci project within ESA's Climate Change Initiative (CCI). Validation of these estimated uncertainties is necessary to demonstrate that they provide a useful representation of the distribution of error. We propose a technique for the statistical validation of AOD (aerosol optical depth) uncertainty by comparison to high-quality ground-based observations and present results for ATSR (Along Track Scanning Radiometer) and IASI (Infrared Atmospheric Sounding Interferometer) data records. AOD at 0.55 µm and its uncertainty were calculated with three AOD retrieval algorithms using data from the ATSR instruments (ATSR-2 (1995-2002) and AATSR (2002-2012)). Pixel-level uncertainties were calculated through error propagation (ADV/ASV, ORAC algorithms) or parameterization of the error's dependence on the geophysical retrieval conditions (SU algorithm). Level 2 data are given as super-pixels of 10 km x 10 km. As validation data, we use direct-sun observations of AOD from the AERONET (AErosol RObotic NETwork) and MAN (Maritime Aerosol Network) sun-photometer networks, which are substantially more accurate than satellite retrievals. Neglecting the uncertainty in AERONET observations and possible issues with their ability to represent a satellite pixel area, the error in the retrieval can be approximated by the difference between the satellite and AERONET retrievals (herein referred to as "error"). To evaluate how well the pixel-level uncertainty represents the observed distribution of error, we look at the distribution of the ratio D between the "error" and the ATSR uncertainty. 
If uncertainties are well represented, D should be normally distributed and 68.3% of values should fall within the range [-1, +1]. A non-zero mean of D indicates the presence of residual systematic errors. If the fraction is smaller than 68%, uncertainties are underestimated; if it is larger, uncertainties are overestimated. For the three ATSR algorithms, we provide statistics and an evaluation at a global scale (separately for land and ocean/coastal regions), for high/low AOD regimes, and seasonal and regional statistics (e.g. Europe, N-Africa, East-Asia, N-America). We assess the long-term stability of the uncertainty estimates over the 17-year time series, and the consistency between ATSR-2 and AATSR results (during their period of overlap). Furthermore, we explore adapting the uncertainty validation concept to the IASI datasets. Ten-year data records (2007-2016) of dust AOD have been generated with four algorithms using IASI observations over the greater Sahara region [80°W - 120°E, 0°N - 40°N]. For validation, the coarse-mode AOD at 0.55 μm from the AERONET direct-sun spectral deconvolution algorithm (SDA) product may be used as a proxy for desert dust. The uncertainty validation results for IASI are still tentative, as the larger IASI pixel sizes and the conversion of the IASI AOD values from infrared to visible wavelengths for comparison to ground-based observations introduce large uncertainties.
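
    The D-ratio check described above is straightforward to compute. The demo below uses synthetic data whose errors are drawn with the reported standard uncertainty, so it is well calibrated by construction and recovers the 68.3% target; all variable names and values are illustrative, not the Aerosol_cci data.

    ```python
    import numpy as np

    def validate_uncertainty(satellite_aod, reference_aod, uncertainty):
        """Fraction of normalized errors D = error/uncertainty within [-1, +1].

        If reported uncertainties are well calibrated (and the reference
        error is negligible), D is ~ standard normal and the fraction
        should be close to 0.683; a non-zero mean of D flags residual bias.
        """
        d = (satellite_aod - reference_aod) / uncertainty
        return np.mean(np.abs(d) <= 1.0), d.mean()

    # Synthetic demo: retrieval errors drawn with the reported sigma.
    rng = np.random.default_rng(1)
    n = 100_000
    truth = rng.uniform(0.05, 0.5, n)      # stand-in for AERONET AOD
    sigma = rng.uniform(0.02, 0.08, n)     # reported pixel-level uncertainty
    sat = truth + rng.normal(0.0, sigma)   # satellite retrieval
    frac, bias = validate_uncertainty(sat, truth, sigma)
    print(f"within [-1,1]: {frac:.3f}, mean D: {bias:+.3f}")
    ```

    Underestimated uncertainties would push the fraction below 0.683; overestimated ones would push it above.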

  1. An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2014-01-01

    This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
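
    The second step described above — propagating the estimated state forward until failure to obtain a remaining-useful-life (RUL) distribution — might look like the following sketch. The linear degradation model, the threshold, and all numbers are assumptions for illustration, not the rover test-bed's actual model.

    ```python
    import numpy as np

    def predict_rul(particles, decay_rates, threshold, dt=1.0, max_steps=10_000):
        """Propagate each state particle forward until it crosses the failure
        threshold; the spread of crossing times is the RUL distribution.

        particles:   current health-state samples from the Bayesian tracker
        decay_rates: per-particle degradation-rate samples (model uncertainty)
        """
        rul = np.empty(len(particles))
        for i, (x, r) in enumerate(zip(particles, decay_rates)):
            t = 0.0
            while x > threshold and t < max_steps * dt:
                x -= r * dt   # simple linear degradation model (assumed)
                t += dt
            rul[i] = t
        return rul

    # Hypothetical posterior from the tracking step: health ~ 1.0, rate ~ 0.002.
    rng = np.random.default_rng(7)
    health = rng.normal(1.0, 0.02, 500)
    rates = rng.normal(2e-3, 2e-4, 500)
    rul = predict_rul(health, rates, threshold=0.2)
    lo, med, hi = np.percentile(rul, [5, 50, 95])
    print(f"RUL: median {med:.0f}, 90% interval [{lo:.0f}, {hi:.0f}]")
    ```

    Reporting RUL as a distribution rather than a point value is exactly what lets the Bayesian framing account for both state and model uncertainty.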

  2. Practical implementation of a particle filter data assimilation approach to estimate initial hydrologic conditions and initialize medium-range streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Clark, Elizabeth; Wood, Andy; Nijssen, Bart; Mendoza, Pablo; Newman, Andy; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    In an automated forecast system, hydrologic data assimilation (DA) performs the valuable function of correcting raw simulated watershed model states to better represent external observations, including measurements of streamflow, snow, soil moisture, and the like. Yet the incorporation of automated DA into operational forecasting systems has been a long-standing challenge due to the complexities of the hydrologic system, which include numerous lags between state and output variations. To help demonstrate that such methods can succeed in operational automated implementations, we present results from the real-time application of an ensemble particle filter (PF) for short-range (7 day lead) ensemble flow forecasts in western US river basins. We use the System for Hydromet Applications, Research and Prediction (SHARP), developed by the National Center for Atmospheric Research (NCAR) in collaboration with the University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. SHARP is a fully automated platform for short-term to seasonal hydrologic forecasting applications, incorporating uncertainty in initial hydrologic conditions (IHCs) and in hydrometeorological predictions through ensemble methods. In this implementation, IHC uncertainty is estimated by propagating an ensemble of 100 temperature and precipitation time series through conceptual and physically-oriented models. The resulting ensemble of derived IHCs exhibits a broad range of possible soil moisture and snow water equivalent (SWE) states. The PF selects and/or weights and resamples the IHCs that are most consistent with external streamflow observations, and uses the particles to initialize a streamflow forecast ensemble driven by ensemble precipitation and temperature forecasts downscaled from the Global Ensemble Forecast System (GEFS). 
We apply this method in real-time for several basins in the western US that are important for water resources management, and perform a hindcast experiment to evaluate the utility of PF-based data assimilation for streamflow forecast skill. This presentation describes findings, including a comparison of sequential and non-sequential particle weighting methods.
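The particle-filter analysis step described above can be sketched minimally, assuming a Gaussian streamflow likelihood and stratified resampling. The toy state-to-flow mapping and all numbers are illustrative, not the SHARP configuration.

```python
import math
import random

def pf_update(particles, q_sim, q_obs, sigma_obs):
    """One particle-filter analysis step for initial hydrologic conditions.

    Each particle is a candidate IHC state and q_sim[i] is the streamflow
    it simulates. Particles get Gaussian-likelihood weights against the
    observed flow and are resampled (stratified resampling), so the
    ensemble concentrates on states consistent with the observation.
    """
    w = [math.exp(-0.5 * ((q - q_obs) / sigma_obs) ** 2) for q in q_sim]
    total = sum(w)
    w = [wi / total for wi in w]
    n = len(particles)
    positions = [(random.random() + i) / n for i in range(n)]  # stratified
    resampled, j, cumw = [], 0, w[0]
    for p in positions:
        while p > cumw and j < n - 1:
            j += 1
            cumw += w[j]
        resampled.append(particles[j])
    return resampled

random.seed(2)
# Hypothetical soil-moisture states and a toy linear state-to-flow mapping.
ihcs = [random.uniform(0.1, 0.5) for _ in range(100)]
flows = [10.0 + 40.0 * s for s in ihcs]
posterior = pf_update(ihcs, flows, q_obs=22.0, sigma_obs=2.0)
mean_post = sum(posterior) / len(posterior)   # near (22 - 10) / 40 = 0.3
```

The resampled ensemble then initializes the forecast run; in a real system the mapping from state to flow is the watershed model itself.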

  3. Uncertainty for calculating transport on Titan: A probabilistic description of bimolecular diffusion parameters

    NASA Astrophysics Data System (ADS)

    Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.

    2015-11-01

Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of uncertainty in this parameter on models. Because temperature and pressure conditions are much lower than the laboratory conditions under which bimolecular diffusion parameters were measured, we apply a problem-agnostic Bayesian framework to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate with uncertainty due to bimolecular diffusion for the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.
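Assuming a Massman-style power law D = D0 (p0/p)(T/T0)^α, the propagation of parameter uncertainty to new (T, p) conditions can be sketched with a simple Monte Carlo. The parameter means and uncertainties below are placeholders, not the calibrated values from the QUESO analysis.

```python
import random

def diffusion_coeff(T, p, D0, alpha, T0=273.15, p0=101325.0):
    """Massman-style power law: D = D0 * (p0/p) * (T/T0)**alpha."""
    return D0 * (p0 / p) * (T / T0) ** alpha

def propagate_diffusion(T, p, n=20_000, seed=3):
    """Propagate Gaussian uncertainty in the calibrated parameters
    (D0, alpha) to the diffusion coefficient at conditions (T, p).
    Parameter means and spreads here are illustrative only."""
    rng = random.Random(seed)
    samples = [diffusion_coeff(T, p,
                               D0=rng.gauss(1.5e-5, 1.0e-6),  # m^2/s
                               alpha=rng.gauss(1.75, 0.05))
               for _ in range(n)]
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    return mean, sd

# Titan-upper-atmosphere-like conditions: cold and very low pressure.
mean_D, sd_D = propagate_diffusion(T=150.0, p=0.1)
```

Repeating this at several temperatures shows the correlation of the propagated uncertainty with T (via the α term), mirroring the finding in the abstract.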

  4. The modification of generalized uncertainty principle applied in the detection technique of femtosecond laser

    NASA Astrophysics Data System (ADS)

    Li, Ziyi

    2017-12-01

The generalized uncertainty principle (GUP), also known as the generalized uncertainty relationship, is the modified form of the classical Heisenberg uncertainty principle in special cases. When we apply quantum gravity theories such as string theory, the theoretical results suggest that there should be a “minimum length of observation”, which is about the Planck scale (10^-35 m). Taking this basic scale of existence into account, we need to fix a new common form of the Heisenberg uncertainty principle in the thermodynamic system and make effective corrections to statistical physics questions concerning the quantum density of states. In particular, at high temperatures and high energy levels, generalized uncertainty calculations have a disruptive impact on classical statistical physics, yet the present theory of the femtosecond laser still rests on the classical Heisenberg uncertainty principle. In order to improve the detection accuracy and temporal resolution of the femtosecond laser, we applied the modified form of the generalized uncertainty principle to the wavelength, energy and pulse time of the femtosecond laser in our work. We designed three typical systems, from micro to macro size, to assess the feasibility of our theoretical model and method, respectively in a chemical solution, a crystal lattice, and a nuclear fission reactor.

  5. Effects of relational uncertainty in heightening national identification and reactive approach motivation of Japanese.

    PubMed

    Terashima, Yuto; Takai, Jiro

    2017-03-23

This study investigated whether relational uncertainty poses uncertainty threat, which causes compensatory behaviours among Japanese. We hypothesised that Japanese, as collectivists, would perceive relational uncertainty as an uncertainty threat. In two experiments, we manipulated relational uncertainty, and confirmed that participants exhibited compensatory reactions to reduce the aversive feelings it produced. In Study 1, we conducted a direct comparison between relational uncertainty, independent self-uncertainty and control conditions. The results revealed that participants who were instructed to imagine events pertaining to relational uncertainty showed greater national identification, as compensation, than did participants in the control condition, whereas independent self-uncertainty did not provoke such effects. In Study 2, we again manipulated relational uncertainty; however, we also manipulated participants' individualism-collectivism cultural orientation through priming, and the analyses yielded a significant interaction effect between these variables. Relational uncertainty evoked reactive approach motivation, a cause of compensatory behaviours, among participants primed with collectivism, but not among those primed with individualism. It was concluded that the effect of uncertainty on compensatory behaviour is influenced by cultural priming, and that relational uncertainty is important to Japanese. © 2017 International Union of Psychological Science.

  6. Uncertainty during pain anticipation: the adaptive value of preparatory processes.

    PubMed

    Seidel, Eva-Maria; Pfabigan, Daniela M; Hahn, Andreas; Sladky, Ronald; Grahl, Arvina; Paul, Katharina; Kraus, Christoph; Küblböck, Martin; Kranz, Georg S; Hummer, Allan; Lanzenberger, Rupert; Windischberger, Christian; Lamm, Claus

    2015-02-01

    Anticipatory processes prepare the organism for upcoming experiences. The aim of this study was to investigate neural responses related to anticipation and processing of painful stimuli occurring with different levels of uncertainty. Twenty-five participants (13 females) took part in an electroencephalography and functional magnetic resonance imaging (fMRI) experiment at separate times. A visual cue announced the occurrence of an electrical painful or nonpainful stimulus, delivered with certainty or uncertainty (50% chance), at some point during the following 15 s. During the first 2 s of the anticipation phase, a strong effect of uncertainty was reflected in a pronounced frontal stimulus-preceding negativity (SPN) and increased fMRI activation in higher visual processing areas. In the last 2 s before stimulus delivery, we observed stimulus-specific preparatory processes indicated by a centroparietal SPN and posterior insula activation that was most pronounced for the certain pain condition. Uncertain anticipation was associated with attentional control processes. During stimulation, the results revealed that unexpected painful stimuli produced the strongest activation in the affective pain processing network and a more pronounced offset-P2. Our results reflect that during early anticipation uncertainty is strongly associated with affective mechanisms and seems to be a more salient event compared to certain anticipation. During the last 2 s before stimulation, attentional control mechanisms are initiated related to the increased salience of uncertainty. Furthermore, stimulus-specific preparatory mechanisms during certain anticipation also shaped the response to stimulation, underlining the adaptive value of stimulus-targeted preparatory activity which is less likely when facing an uncertain event. © 2014 Wiley Periodicals, Inc.

  7. Visualizing Uncertainty for Probabilistic Weather Forecasting based on Reforecast Analogs

    NASA Astrophysics Data System (ADS)

    Pelorosso, Leandro; Diehl, Alexandra; Matković, Krešimir; Delrieux, Claudio; Ruiz, Juan; Gröeller, M. Eduard; Bruckner, Stefan

    2016-04-01

Numerical weather forecasts are prone to uncertainty coming from inaccuracies in the initial and boundary conditions and lack of precision in numerical models. Ensembles of forecasts partially address these problems by considering several runs of the numerical model. Each forecast is generated with different initial and boundary conditions and different model configurations [GR05]. The ensembles can be expressed as probabilistic forecasts, which have proven to be very effective in decision-making processes [DE06]. The ensemble of forecasts represents only some of the possible future atmospheric states, usually underestimating the degree of uncertainty in the predictions [KAL03, PH06]. Hamill and Whitaker [HW06] introduced the "Reforecast Analog Regression" (RAR) technique to overcome the limitations of ensemble forecasting. This technique produces probabilistic predictions based on the analysis of historical forecasts and observations. Visual analytics provides tools for processing, visualizing, and exploring data to gain new insights and discover hidden information patterns in an interactive exchange between the user and the application [KMS08]. In this work, we introduce Albero, a visual analytics solution for probabilistic weather forecasting based on the RAR technique. Albero targets at least two different types of users: "forecasters", who are meteorologists working in operational weather forecasting, and "researchers", who work on the construction of numerical prediction models. Albero is an efficient tool for analyzing precipitation forecasts, allowing forecasters to make and communicate quick decisions. Our solution facilitates the analysis of a set of probabilistic forecasts, associated statistical data, observations and uncertainty. A dashboard with small-multiples of probabilistic forecasts allows the forecasters to analyze at a glance the distribution of probabilities as a function of time, space, and magnitude. 
It provides the user with a more accurate measure of forecast uncertainty that could result in better decision-making. It offers different levels of abstraction to help with the recalibration of the RAR method. It also has an inspection tool that displays the selected analogs, their observations and statistical data. It gives the users access to inner parts of the method, unveiling hidden information. References [GR05] GNEITING T., RAFTERY A. E.: Weather forecasting with ensemble methods. Science 310, 5746, 248-249, 2005. [KAL03] KALNAY E.: Atmospheric modeling, data assimilation and predictability. Cambridge University Press, 2003. [PH06] PALMER T., HAGEDORN R.: Predictability of weather and climate. Cambridge University Press, 2006. [HW06] HAMILL T. M., WHITAKER J. S.: Probabilistic quantitative precipitation forecasts based on reforecast analogs: Theory and application. Monthly Weather Review 134, 11, 3209-3229, 2006. [DE06] DEITRICK S., EDSALL R.: The influence of uncertainty visualization on decision making: An empirical evaluation. Springer, 2006. [KMS08] KEIM D. A., MANSMANN F., SCHNEIDEWIND J., THOMAS J., ZIEGLER H.: Visual analytics: Scope and challenges. Springer, 2008.
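The core of the reforecast-analog idea, matching the current forecast to its closest historical analogs and using their verifying observations, can be sketched as follows. A real RAR implementation matches multivariate forecast patterns over a region; this toy version uses a single scalar forecast, and the archive values are invented.

```python
def analog_probability(current_forecast, past_forecasts, past_observations,
                       threshold, k=5):
    """Reforecast-analog probabilistic forecast (simplified).

    Finds the k historical forecasts closest to the current forecast and
    estimates the event probability as the fraction of their paired
    observations meeting or exceeding the threshold.
    """
    ranked = sorted(zip(past_forecasts, past_observations),
                    key=lambda fo: abs(fo[0] - current_forecast))
    analogs = ranked[:k]
    return sum(1 for _, obs in analogs if obs >= threshold) / k

# Toy reforecast archive: forecast precipitation (mm) and verifying observation.
fcsts = [0.0, 1.0, 2.0, 5.0, 8.0, 10.0, 12.0, 20.0]
obs = [0.0, 0.0, 1.0, 6.0, 4.0, 12.0, 9.0, 25.0]
p = analog_probability(9.0, fcsts, obs, threshold=5.0, k=3)   # 2 of 3 analogs
```

Because the probability is built from observations rather than model spread alone, it avoids the under-dispersion problem of raw ensembles noted above.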

  8. Reduction of uncertainty for estimating runoff with the NRCS CN model by the adaptation to local climatic conditions

    NASA Astrophysics Data System (ADS)

    Durán-Barroso, Pablo; González, Javier; Valdés, Juan B.

    2016-04-01

Rainfall-runoff quantification is one of the most important tasks in both engineering and watershed management, as it allows watershed response to be identified, forecast and explained. For that purpose, the Natural Resources Conservation Service Curve Number method (NRCS CN) is the most widely recognized conceptual lumped model in the field of rainfall-runoff estimation. Nevertheless, there is still an ongoing discussion about the procedure used to determine the portion of rainfall retained in the watershed before runoff is generated, known as the initial abstraction. This quantity is computed as a ratio (λ) of the soil potential maximum retention S of the watershed. Initially, this ratio was assumed to be 0.2, but a value of 0.05 has since been proposed. However, existing procedures for converting NRCS CN model parameters obtained under different assumptions about λ do not adapt to the climatic conditions of each watershed. For this reason, we propose a new simple method for computing model parameters that is adapted to local conditions, taking into account regional patterns of climate. After testing this procedure against the existing ones in 34 different watersheds located in Ohio and Texas (United States), we conclude that the new methodology is a more accurate and efficient way to refit the initial abstraction ratio.
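The standard NRCS CN equations under discussion can be sketched directly. For contrast with the paper's locally adapted approach, the sketch includes a commonly cited fixed power-law conversion between λ = 0.2 and λ = 0.05 parameters (S_0.05 ≈ 0.819·S_0.2^1.15, S in mm); all input values are illustrative.

```python
def runoff_scs(P, CN, lam=0.2):
    """NRCS curve number runoff depth (all depths in mm).

    S  = 25400/CN - 254              potential maximum retention
    Ia = lam * S                     initial abstraction
    Q  = (P - Ia)^2 / (P - Ia + S)   if P > Ia, else 0
    """
    S = 25400.0 / CN - 254.0
    Ia = lam * S
    if P <= Ia:
        return 0.0
    return (P - Ia) ** 2 / (P - Ia + S)

def cn_for_lambda005(CN_02):
    """Convert a lambda = 0.2 curve number to lambda = 0.05 using the
    fixed fit S_0.05 = 0.819 * S_0.2**1.15 (S in mm). This conversion
    ignores local climate, which is the limitation the paper targets."""
    S02 = 25400.0 / CN_02 - 254.0
    S005 = 0.819 * S02 ** 1.15
    return 25400.0 / (S005 + 254.0)

q = runoff_scs(P=60.0, CN=75.0)   # ~14.5 mm of runoff
cn05 = cn_for_lambda005(75.0)     # lambda = 0.05 equivalent CN (~65)
```

A locally adapted method would replace the fixed 0.819/1.15 constants with region-dependent values fitted to the watershed's climate.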

  9. Anchorage Arrival Scheduling Under Off-Nominal Weather Conditions

    NASA Technical Reports Server (NTRS)

    Grabbe, Shon; Chan, William N.; Mukherjee, Avijit

    2012-01-01

Weather can cause flight diversions, passenger delays, additional fuel consumption and schedule disruptions at any high volume airport. The impacts are particularly acute at the Ted Stevens Anchorage International Airport in Anchorage, Alaska due to its importance as a major international portal. To minimize the impacts due to weather, a multi-stage scheduling process is employed that is iteratively executed, as updated aircraft demand and/or airport capacity data become available. The strategic scheduling algorithm assigns speed adjustments for flights that originate outside of Anchorage Center to achieve the proper demand and capacity balance. Similarly, an internal departure-scheduling algorithm assigns ground holds for pre-departure flights that originate from within Anchorage Center. Tactical flight controls in the form of airborne holding are employed to reactively account for system uncertainties. Real-world scenarios that were derived from the January 16, 2012 Anchorage visibility observations and the January 12, 2012 Anchorage arrival schedule were used to test the initial implementation of the scheduling algorithm in fast-time simulation experiments. Although over 90% of the flights in the scenarios arrived at Anchorage without requiring any delay, pre-departure scheduling was the dominant form of control for Anchorage arrivals. Additionally, tactical scheduling was used extensively in conjunction with the pre-departure scheduling to reactively compensate for uncertainties in the arrival demand. For long-haul flights, the strategic scheduling algorithm performed best when the scheduling horizon was greater than 1,000 nmi. With these long scheduling horizons, it was possible to absorb between 10 and 12 minutes of delay through speed control alone. 
Unfortunately, the use of tactical scheduling, which resulted in airborne holding, was found to increase as the strategic scheduling horizon increased because of the additional uncertainty in the arrival times of the aircraft. Findings from these initial experiments indicate that it is possible to schedule arrivals into Anchorage with minimal delays under low-visibility conditions with less disruption to high-cost, international flights.

  10. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    NASA Astrophysics Data System (ADS)

    Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads

    2016-05-01

    A key component in risk assessment of contaminated sites is in the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). 
The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information becomes available.
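The sequential belief update over competing CSMs reduces to repeated application of Bayes' rule. In this sketch the CSM labels follow the four interpretations described above, but every likelihood value is invented for illustration.

```python
def update_beliefs(prior, likelihoods):
    """One Bayesian update: posterior belief in each conceptual site
    model (CSM), given P(evidence | CSM), renormalized to sum to one."""
    post = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(post)
    return [p / total for p in post]

# Four CSMs: two source-zone interpretations x two geology interpretations.
csms = ["separate phase + fractured till", "separate phase + unfractured till",
        "dissolved only + fractured till", "dissolved only + unfractured till"]
belief = [0.25] * 4                         # uninformed prior
for likelihoods in ([0.8, 0.4, 0.6, 0.2],   # screening investigation
                    [0.9, 0.3, 0.5, 0.1],   # detailed investigation
                    [0.7, 0.5, 0.4, 0.3]):  # expert consultation
    belief = update_beliefs(belief, likelihoods)
best = csms[belief.index(max(belief))]
```

A full BBN adds structure (the likelihoods decompose over nodes for geology, source type, and data quality), but the stage-by-stage updating of belief is the same mechanism.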

  11. Trust and Transitions in Modes of Exchange

    ERIC Educational Resources Information Center

    Cheshire, Coye; Gerbasi, Alexandra; Cook, Karen S.

    2010-01-01

    In this study, we investigate the relationship between uncertainty and trust in exogenous shifts in modes of social exchange (i.e., those that are not initiated by the individuals in a given exchange system). We explore how transitions from a high uncertainty environment (reciprocal exchange) to lower-uncertainty environments (nonbinding or…

  12. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I.

    NASA Technical Reports Server (NTRS)

    Bautista, M. A.; Fivet, V.; Quinet, P.; Dunn, J.; Gull, T. R.; Kallman, T. R.; Mendoza, C.

    2013-01-01

We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II]. Key words: atomic data - atomic processes - line: formation - methods: data analysis - molecular data - molecular processes - techniques: spectroscopic
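A Monte Carlo counterpart to the paper's linear error-propagation equations can be sketched for the simplest case, a two-level atom in statistical equilibrium. The rate coefficients and fractional uncertainties below are order-of-magnitude placeholders, not actual O III or Fe II data.

```python
import random

def level_ratio(A, q_lu, q_ul, ne):
    """Two-level atom in statistical equilibrium:
    n_u / n_l = ne * q_lu / (A + ne * q_ul)."""
    return ne * q_lu / (A + ne * q_ul)

def propagate_atomic(ne=1.0e4, n=20_000, seed=4):
    """Propagate ~10% A-value and ~20% collision-rate uncertainties to
    the upper-level population ratio by Monte Carlo sampling."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        A = rng.gauss(1.0e-2, 1.0e-3)      # s^-1 (forbidden-line scale)
        q_lu = rng.gauss(1.0e-8, 2.0e-9)   # cm^3 s^-1
        q_ul = rng.gauss(2.0e-8, 4.0e-9)
        samples.append(level_ratio(A, q_lu, q_ul, ne))
    mean = sum(samples) / n
    sd = (sum((s - mean) ** 2 for s in samples) / (n - 1)) ** 0.5
    return mean, sd

mean_r, sd_r = propagate_atomic()
```

For many coupled levels the analytic linear-system formulation of the paper is far more efficient than sampling, which is precisely its selling point.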

  13. From local uncertainty to global predictions: Making predictions on fractal basins

    PubMed Central

    2018-01-01

In nonlinear systems, long-term dynamics is governed by the attractors present in phase space. The presence of a chaotic saddle gives rise to basins of attraction with fractal boundaries and sometimes even to Wada boundaries. These two phenomena make prediction of the future state of the system extremely difficult. However, we show here that it is possible to make statistical predictions even if we do not have any previous knowledge of the initial conditions or the time series of the system until it reaches its final state. In this work, we develop a general method to make statistical predictions in systems with fractal basins. In particular, we have applied this new method to the Duffing oscillator for a choice of parameters where the system possesses the Wada property. We have computed the statistical properties of the Duffing oscillator for different phase space resolutions to obtain information about the global dynamics of the system. The key idea is that the fraction of initial conditions that evolve towards each attractor is scale free, which we illustrate numerically. We have also shown numerically that having partial information about the initial conditions of the system does not in general improve the predictions in the Wada regions. PMID:29668687

  14. LCOE Baseline for OE Buoy

    DOE Data Explorer

    Previsic, Mirko; Karthikeyan, Anantha; Lewis, Tony; McCarthy, John

    2017-07-26

Capex numbers are in $/kW, Opex numbers in $/kW-yr. The cost estimates provided herein are based on concept design and basic engineering data and carry high levels of uncertainty. This reference economic scenario was developed for a very large device version of the OE Buoy technology, which is not presently on Ocean Energy's technology development pathway but will be considered in future business plan development. The DOE reference site condition is considered a low power-density site compared with many of the planned initial deployment locations for the OE Buoy. Many of the sites considered for the initial commercial deployment of the OE Buoy feature much higher wave power densities and shorter period waves. Both of these characteristics will improve the OE Buoy's commercial viability.

  15. Setting priorities for research on pollution reduction functions of agricultural buffers.

    PubMed

    Dosskey, Michael G

    2002-11-01

The success of buffer installation initiatives and programs to reduce nonpoint source pollution of streams on agricultural lands will depend on the ability of local planners to locate and design buffers for specific circumstances with substantial and predictable results. Current predictive capabilities are inadequate, and major sources of uncertainty remain. An assessment of these uncertainties cautions that there is greater risk of overestimating buffer impact than underestimating it. Priorities for future research are proposed that will lead more quickly to major advances in predictive capabilities. Highest priority is given to work on the surface runoff filtration function, which is almost universally important to the amount of pollution reduction expected from buffer installation and for which there remain major sources of uncertainty in predicting level of impact. Foremost uncertainties surround the extent and consequences of runoff flow concentration and pollutant accumulation. Other buffer functions, including filtration of groundwater nitrate and stabilization of channel erosion sources of sediments, may be important in some regions. However, uncertainty surrounds our ability to identify and quantify the extent of site conditions where buffer installation can substantially reduce stream pollution in these ways. Deficiencies in predictive models reflect gaps in experimental information as well as technology to account for spatial heterogeneity of pollutant sources, pathways, and buffer capabilities across watersheds. Since completion of a comprehensive watershed-scale buffer model is probably far off, immediate needs call for simpler techniques to gauge the probable impacts of buffer installation at local scales.

  16. Stimulus uncertainty enhances long-term potentiation-like plasticity in human motor cortex.

    PubMed

    Sale, Martin V; Nydam, Abbey S; Mattingley, Jason B

    2017-03-01

    Plasticity can be induced in human cortex using paired associative stimulation (PAS), which repeatedly and predictably pairs a peripheral electrical stimulus with transcranial magnetic stimulation (TMS) to the contralateral motor region. Many studies have reported small or inconsistent effects of PAS. Given that uncertain stimuli can promote learning, the predictable nature of the stimulation in conventional PAS paradigms might serve to attenuate plasticity induction. Here, we introduced stimulus uncertainty into the PAS paradigm to investigate if it can boost plasticity induction. Across two experimental sessions, participants (n = 28) received a modified PAS paradigm consisting of a random combination of 90 paired stimuli and 90 unpaired (TMS-only) stimuli. Prior to each of these stimuli, participants also received an auditory cue which either reliably predicted whether the upcoming stimulus was paired or unpaired (no uncertainty condition) or did not predict the upcoming stimulus (maximum uncertainty condition). Motor evoked potentials (MEPs) evoked from abductor pollicis brevis (APB) muscle quantified cortical excitability before and after PAS. MEP amplitude increased significantly 15 min following PAS in the maximum uncertainty condition. There was no reliable change in MEP amplitude in the no uncertainty condition, nor between post-PAS MEP amplitudes across the two conditions. These results suggest that stimulus uncertainty may provide a novel means to enhance plasticity induction with the PAS paradigm in human motor cortex. To provide further support to the notion that stimulus uncertainty and prediction error promote plasticity, future studies should further explore the time course of these changes, and investigate what aspects of stimulus uncertainty are critical in boosting plasticity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer; Clifton, Andrew; Bonin, Timothy

As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. 
The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict errors in lidar-measured wind speed. The results show how uncertainty varies over time and can be used to help select data with different levels of uncertainty for different applications, for example, low uncertainty data for power performance testing versus all data for plant performance monitoring.

  18. Condition trees as a mechanism for communicating the meaning of uncertainties

    NASA Astrophysics Data System (ADS)

    Beven, Keith

    2015-04-01

Uncertainty communication for environmental problems is fraught with difficulty for good epistemic reasons. The fact that most sources of uncertainty are subject to, and often dominated by, epistemic uncertainties means that the unthinking use of probability theory might actually be misleading and lead to false inference (even in some cases where the assumptions of a probabilistic error model might seem reasonably valid). This creates problems in communicating the meaning of probabilistic uncertainties of model predictions to potential users (there are many examples in hydrology, hydraulics, climate change and other domains). It is suggested that one way of being more explicit about the meaning of uncertainties is to associate each type of application with a condition tree of the assumptions that need to be made in producing an estimate of uncertainty. The condition tree then provides a basis for discussion and communication of assumptions about uncertainties with users. Agreement on assumptions (albeit generally at some institutional level) will provide some buy-in on the part of users, and a basis for commissioning future studies. Even in some relatively well-defined problems, such as mapping flood risk, such a condition tree can be rather extensive, but making each step in the tree explicit establishes an audit trail for future reference. This can help focus the exercise of agreeing on more realistic assumptions.
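The condition-tree mechanism can be represented very simply as a nested structure of assumptions, each recording who has agreed to it, flattened into the audit trail the abstract mentions. The flood-risk assumptions below are hypothetical examples, not from any specific study.

```python
from dataclasses import dataclass, field

@dataclass
class Condition:
    """One node in a condition tree: an assumption made in producing an
    uncertainty estimate, with optional sub-assumptions that refine it."""
    statement: str
    agreed_by: str = "unagreed"
    children: list = field(default_factory=list)

def audit_trail(node, depth=0):
    """Flatten the tree into an indented audit trail for the record."""
    lines = [f"{'  ' * depth}- {node.statement} [{node.agreed_by}]"]
    for child in node.children:
        lines.extend(audit_trail(child, depth + 1))
    return lines

# Hypothetical tree for a flood-risk mapping study.
tree = Condition("Flood risk mapped with a 1D hydraulic model", "agency", [
    Condition("Design hydrograph from flood frequency analysis", "agency", [
        Condition("Gauge record is stationary", "unagreed"),
    ]),
    Condition("Rating-curve error treated as Gaussian", "consultant"),
])
trail = audit_trail(tree)
```

Nodes still marked "unagreed" are exactly the discussion points the condition tree is meant to surface with users.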

  19. Assessing climate change impact by integrated hydrological modelling

    NASA Astrophysics Data System (ADS)

    Lajer Hojberg, Anker; Jørgen Henriksen, Hans; Olsen, Martin; van der Keur, Peter; Seaby, Lauren Paige; Troldborg, Lars; Sonnenborg, Torben; Refsgaard, Jens Christian

    2013-04-01

    Future climate may have a profound effect on the freshwater cycle, which water management must take into consideration for future planning. Future climate developments are nevertheless uncertain, adding to the challenge of managing an uncertain system. To support water managers at various levels in Denmark, the national water resources model (DK-model) (Højberg et al., 2012; Stisen et al., 2012) was used to propagate future climate into hydrological response while accounting for the main sources of uncertainty. The DK-model is a physically based, fully distributed model built on the MIKE SHE/MIKE11 model system, describing groundwater and surface water systems and the interaction between the two domains. The model covers the entire 43,000 km2 land area of Denmark, excluding only minor islands. Future climate from General Circulation Models (GCMs) was downscaled by Regional Climate Models (RCMs) using a distribution-based scaling method (Seaby et al., 2012). The same dataset was used to train all GCM-RCM combinations, which were found to represent the mean and variance on a seasonal basis equally well. Changes in hydrological response were computed by comparing the short-term development from the period 1990-2010 to 2021-2050, the time span relevant for water management. To account for uncertainty in future climate predictions, hydrological response from the DK-model using nine combinations of GCMs and RCMs was analysed for two catchments representing the various hydrogeological conditions in Denmark. Three GCM-RCM combinations displaying high, mean and low future impacts were then selected as representative climate models, for which climate impact studies were carried out for the entire country. Parameter uncertainty was addressed by sensitivity analysis and was generally found to be less important than the uncertainty spanned by the GCM-RCM combinations. 
Analysis of the simulations showed some unexpected results: climate models predicting the largest increase in net precipitation did not produce the largest increase in groundwater heads. This was found to result from different initial conditions (1990-2010) across the climate models. In some areas, a combination of a high initial groundwater head and an increase in precipitation towards 2021-2050 produced a groundwater head rise that reached the drainage or surface water system. This increases the exchange from the groundwater to the surface water system but limits the rise in groundwater heads. An alternative climate model with a lower initial head can thus predict a larger increase in groundwater head, even though its increase in precipitation is smaller. This illustrates an extra dimension in the uncertainty assessment, namely the climate models' capability of simulating the current climatic conditions in a way that reproduces the observed hydrological response. Højberg, AL, Troldborg, L, Stisen, S, et al. (2012) Stakeholder driven update and improvement of a national water resources model - http://www.sciencedirect.com/science/article/pii/S1364815212002423 Seaby, LP, Refsgaard, JC, Sonnenborg, TO, et al. (2012) Assessment of robustness and significance of climate change signals for an ensemble of distribution-based scaled climate projections (submitted) Journal of Hydrology Stisen, S, Højberg, AL, Troldborg, L, et al. (2012) On the importance of appropriate rain-gauge catch correction for hydrological modelling at mid to high latitudes - http://www.hydrol-earth-syst-sci.net/16/4157/2012/

  20. Managing Reform Efforts in Times of Uncertainty: Effects of Principal Support and Leadership on Teachers' Implementation Commitment to Common Core Reform Initiatives

    ERIC Educational Resources Information Center

    Smith, Lee W.

    2016-01-01

    The Common Core State Standards (CCSS) require a major shift in instructional practices among teachers. Such changes cause much uncertainty as teachers' roles and identities begin to change. Major school reform creates difficulty for school leaders who must develop teacher support and dedication to 'top-down' reform initiatives in their…

  1. Assessment of initial soil moisture conditions for event-based rainfall-runoff modelling

    NASA Astrophysics Data System (ADS)

    Tramblay, Yves; Bouvier, Christophe; Martin, Claude; Didon-Lescot, Jean-François; Todorovik, Dragana; Domergue, Jean-Marc

    2010-06-01

    Flash floods are among the most destructive natural hazards in the Mediterranean region. Rainfall-runoff models can be very useful for flash flood forecasting and prediction. Event-based models are popular for operational purposes, but there is a need to reduce the uncertainties related to the estimation of initial moisture conditions prior to a flood event. This paper compares several soil moisture indicators: local Time Domain Reflectometry (TDR) measurements of soil moisture, soil moisture modelled through the Interaction-Sol-Biosphère-Atmosphère (ISBA) component of the SIM model (Météo-France), antecedent precipitation, and base flow. A modelling approach based on the Soil Conservation Service-Curve Number method (SCS-CN) is used to simulate flood events in a small headwater catchment in the Cévennes region (France). The model involves two parameters: one for runoff production, S, and one for the routing component, K. The S parameter can be interpreted as the maximal water retention capacity and acts as the initial condition of the model, depending on the antecedent moisture conditions. The model was calibrated on a sample of 20 floods and yielded a median Nash-Sutcliffe value of 0.9. The local TDR measurements in the deepest soil layers (80-140 cm) were found to be the best predictors of the S parameter. TDR measurements averaged over the whole soil profile, outputs of the SIM model, and the logarithm of base flow also proved to be good predictors, whereas antecedent precipitation was found to be less efficient. The good correlations observed between the TDR predictors and the calibrated S values indicate that monitoring soil moisture could help set the initial conditions of simplified event-based models in small basins.
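
The production component described above follows the standard SCS-CN relation, in which the calibrated retention parameter S encodes the antecedent moisture state. A minimal sketch of that relation (generic textbook form with the classic 0.2·S initial abstraction, not the authors' exact event-based implementation, which also includes the routing parameter K):

```python
def scs_cn_runoff(p_mm, s_mm, ia_ratio=0.2):
    """Event runoff depth (mm) from the standard SCS-CN relation.

    p_mm: event rainfall depth (mm).
    s_mm: maximal retention S (mm), the initial-condition parameter
          calibrated in the study (smaller S = wetter antecedent state).
    ia_ratio: initial abstraction fraction (0.2 is the classic default).
    """
    ia = ia_ratio * s_mm
    if p_mm <= ia:
        return 0.0  # all rainfall absorbed before runoff starts
    return (p_mm - ia) ** 2 / (p_mm - ia + s_mm)

# A wetter antecedent state (smaller S) yields more runoff for the same storm:
wet = scs_cn_runoff(p_mm=80.0, s_mm=50.0)
dry = scs_cn_runoff(p_mm=80.0, s_mm=150.0)
```

This illustrates why predicting S from soil moisture indicators such as TDR profiles matters: the same storm produces very different flood volumes depending on the initial condition.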

  2. Amphetamine-induced sensitization and reward uncertainty similarly enhance incentive salience for conditioned cues.

    PubMed

    Robinson, Mike J F; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C

    2015-08-01

    Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever-conditioned stimulus; CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine sensitization or repeated restraint stress interacts with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here rats were tested in 3 successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else served as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the CS+ lever versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also found that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions did not elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction. (c) 2015 APA, all rights reserved.

  3. Intermediate Scale Experimental Design to Validate a Subsurface Inverse Theory Applicable to Data-sparse Conditions

    NASA Astrophysics Data System (ADS)

    Jiao, J.; Trautz, A.; Zhang, Y.; Illangasekare, T.

    2017-12-01

    Subsurface flow and transport characterization under data-sparse conditions is addressed by a new and computationally efficient inverse theory that simultaneously estimates parameters, state variables, and boundary conditions. Uncertainty in static data can be accounted for, while the parameter structure can be complex due to process uncertainty. The approach has been successfully extended to inverting transient and unsaturated flows, as well as to contaminant source identification under unknown initial and boundary conditions. In one example, sampling numerical experiments that simulate two-dimensional steady-state flow in which a tracer migrates, a sequential inversion scheme first estimates the flow field and permeability structure before the evolution of the tracer plume and the dispersivities are jointly estimated. Compared to traditional inversion techniques, the theory does not use forward simulations to assess model-data misfits, so knowledge of the difficult-to-determine site boundary condition is not required. To test the general applicability of the theory, data generated during high-precision intermediate-scale experiments (i.e., at a scale intermediate between the field and column scales) in large synthetic aquifers can be used. The design of such experiments is not trivial, as laboratory conditions have to be selected to mimic natural systems in order to provide useful data, requiring a variety of sensors and data collection strategies. This paper presents the design of such an experiment in a synthetic, multi-layered aquifer with dimensions of 242.7 × 119.3 × 7.7 cm. Different experimental scenarios that will generate data to validate the theory are presented.

  4. In-flight alignment using H∞ filter for strapdown INS on aircraft.

    PubMed

    Pei, Fu-Jun; Liu, Xuan; Zhu, Li

    2014-01-01

    In-flight alignment is an effective way to improve the accuracy and speed of initial alignment for a strapdown inertial navigation system (INS). During aircraft flight, strapdown INS alignment is disturbed by linear and angular movements of the aircraft. To deal with these disturbances in dynamic initial alignment, a novel alignment method for SINS is investigated in this paper. In this method, an initial alignment error model of SINS in the inertial frame is established. The observability of the system is analysed by piece-wise constant system (PWCS) theory, and the observable degree is computed by singular value decomposition (SVD). It is demonstrated that the system is completely observable and that all the system state parameters can be estimated by an optimal filter. An H∞ filter was then designed to handle the uncertainty of the measurement noise. The simulation results demonstrate that the proposed algorithm achieves better accuracy under dynamic disturbance conditions.

  5. Ensemble hydrological forecast efficiency evolution over various issue dates and lead-time: case study for the Cheboksary reservoir (Volga River)

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander; Moreido, Vsevolod

    2017-04-01

    Ensemble hydrological forecasting allows for describing the uncertainty caused by variability of meteorological conditions in the river basin over the forecast lead-time. At the same time, in snowmelt-dependent river basins another significant source of uncertainty relates to variability of the initial conditions of the basin (snow water equivalent, soil moisture content, etc.) prior to the forecast issue. Accurate long-term hydrological forecasts are most crucial for large water management systems, such as the Cheboksary reservoir (catchment area 374,000 sq. km) located on the Middle Volga river in Russia. Accurate forecasts of water inflow volume, maximum discharge and other flow characteristics are of great value for this basin, especially before the beginning of the spring freshet season, which lasts here from April to June. The semi-distributed hydrological model ECOMAG was used to develop long-term ensemble forecasts of daily water inflow into the Cheboksary reservoir. To describe the variability of the meteorological conditions and construct an ensemble of possible weather scenarios for the forecast lead-time, two approaches were applied. The first utilizes 50 weather scenarios observed in previous years (similar to the ensemble streamflow prediction (ESP) procedure); the second uses 1000 synthetic scenarios simulated by a stochastic weather generator. We investigated the evolution of forecast uncertainty reduction, expressed as forecast efficiency, over consecutive forecast issue dates and lead-times. We analysed the Nash-Sutcliffe efficiency of inflow hindcasts for the period 1982 to 2016, issued from 1 March at 15-day intervals for lead-times of 1 to 6 months. This resulted in a forecast efficiency matrix of issue dates versus lead-time that allows the predictability of the basin to be identified. The matrix was constructed separately for the observed and synthetic weather ensembles.
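
The forecast efficiency referred to above is the Nash-Sutcliffe criterion. A minimal sketch of how each cell of such an efficiency matrix could be scored (a generic textbook formula, not the authors' code):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of observations.

    1.0 is a perfect hindcast; 0.0 means the forecast is no better than
    always predicting the observed mean; negative values are worse still.
    """
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    svar = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / svar
```

In the study's setting, each (issue date, lead-time) cell would hold the efficiency of the ensemble-mean inflow hindcasts over 1982-2016, so rows and columns of the matrix reveal how predictability decays with lead-time and varies through the freshet season.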

  6. Robustness for slope stability modelling under deep uncertainty

    NASA Astrophysics Data System (ADS)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high as is often the case of when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  7. Uncertainty Quantification in Climate Modeling and Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. 
The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing the reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop's objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, significant challenges remain to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  8. High precision Hugoniot measurements on statically pre-compressed fluid helium

    NASA Astrophysics Data System (ADS)

    Seagle, Christopher T.; Reinhart, William D.; Lopez, Andrew J.; Hickman, Randy J.; Thornhill, Tom F.

    2016-09-01

    The capability for statically pre-compressing fluid targets for Hugoniot measurements utilizing gas-gun-driven flyer plates has been developed. Pre-compression expands the capability for initial condition control, allowing access to thermodynamic states off the principal Hugoniot. Absolute Hugoniot measurements with an uncertainty of less than 3% in density and pressure were obtained on statically pre-compressed fluid helium utilizing a two-stage light gas gun. Helium is highly compressible; the locus of shock states resulting from dynamic loading of an initially compressed sample at room temperature is significantly denser than the cryogenic fluid Hugoniot, even for relatively modest (0.27-0.38 GPa) initial pressures. The dynamic response of pre-compressed helium in the initial density range of 0.21-0.25 g/cm3 at ambient temperature may be described by a linear shock velocity (us) and particle velocity (up) relationship: us = C0 + s·up, with C0 = 1.44 ± 0.14 km/s and s = 1.344 ± 0.025.
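
Using the reported fit coefficients, the shock velocity for a given particle velocity, and a first-order uncertainty on it, can be sketched as follows. This assumes independent errors on C0 and s, since the abstract does not report their covariance; with the (typically negative) fit covariance included, the true uncertainty would be somewhat smaller:

```python
def shock_velocity(up, c0=1.44, s=1.344):
    """Linear Hugoniot fit us = C0 + s*up (km/s), coefficients from the abstract."""
    return c0 + s * up

def shock_velocity_uncertainty(up, dc0=0.14, ds=0.025):
    """First-order propagation of the quoted fit errors, assuming independence:
    sigma_us = sqrt(dC0**2 + (up * ds)**2)."""
    return (dc0 ** 2 + (up * ds) ** 2) ** 0.5

# e.g. at up = 2.0 km/s: us = 4.128 km/s with ~0.15 km/s propagated uncertainty
```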

  9. Estimating discharge measurement uncertainty using the interpolated variance estimator

    USGS Publications Warehouse

    Cohn, T.; Kiang, J.; Mason, R.

    2012-01-01

    Methods for quantifying the uncertainty in discharge measurements typically identify various sources of uncertainty and then estimate the uncertainty from each of these sources by applying the results of empirical or laboratory studies. If actual measurement conditions are not consistent with those encountered in the empirical or laboratory studies, these methods may give poor estimates of discharge uncertainty. This paper presents an alternative method for estimating discharge measurement uncertainty that uses statistical techniques and at-site observations. This Interpolated Variance Estimator (IVE) estimates uncertainty based on the data collected during the streamflow measurement and therefore reflects the conditions encountered at the site. The IVE has the additional advantage of capturing all sources of random uncertainty in the velocity and depth measurements. It can be applied to velocity-area discharge measurements that use a velocity meter to measure point velocities at multiple vertical sections in a channel cross section.

  10. Simulation's Ensemble is Better Than Ensemble Simulation

    NASA Astrophysics Data System (ADS)

    Yan, X.

    2017-12-01

    Yan Xiaodong, State Key Laboratory of Earth Surface Processes and Resource Ecology (ESPRE), Beijing Normal University, 19 Xinjiekouwai Street, Haidian District, Beijing 100875, China. Email: yxd@bnu.edu.cn. Dynamical systems are simulated from an initial state. However, initial state data carry great uncertainty, which leads to uncertainty in the simulation. Simulation from multiple possible initial states has therefore been used widely in atmospheric science and has indeed been shown to lower the uncertainty; this is named a simulation's ensemble because multiple simulation results are fused. In the ecological field, individual-based model simulation (forest gap models, for example) can be regarded as a simulation's ensemble compared with community-based simulation (most ecosystem models). In this talk, we address the advantages of individual-based simulation and its ensembles.

  11. A probabilistic approach towards understanding how planet composition affects plate tectonics - through time and space.

    NASA Astrophysics Data System (ADS)

    Stamenkovic, V.

    2017-12-01

    We focus on the connections between plate tectonics and planet composition by studying how plate yielding is affected by surface and mantle water, and by variable amounts of Fe, SiC, or radiogenic heat sources within the planet interior. We especially explore whether we can draw any robust conclusions if we account for variable initial conditions, current uncertainties in model parameters and in the pressure dependence of the viscosity, and uncertainties in how a variable composition affects mantle rheology, melting temperatures, and thermal conductivities. We use a 1D thermal evolution model to explore, with more than 200,000 simulations, the robustness of our results, and use our previous results from 3D calculations to help determine the most likely scenario within the uncertainties we still face today. The results that are robust in spite of all uncertainties are that iron-rich mantle rock seems to reduce the efficiency of plate yielding on silicate planets like the Earth if those planets formed along or above the mantle solidus, and that carbon planets do not seem to be ideal candidates for plate tectonics because of slower creep rates and generally higher thermal conductivities for SiC. All other conclusions depend on parameters that are not yet sufficiently constrained. For the most likely case based on our current understanding, we find that, within our range of varied planet conditions (1-10 Earth masses), the planets with the greatest efficiency of plate yielding are silicate rocky planets of 1 Earth mass with large metallic cores (average density 5500-7000 kg m-3), minimal mantle concentrations of iron (as little as 0% is preferred), and minimal radiogenic isotopes at formation (up to 10 times less than Earth's initial abundance; fewer heat sources do not mean no heat sources). 
Based on current planet formation scenarios and observations of stellar abundances across the Galaxy as well as models of the evolution of the interstellar medium, such planets are suggested to be statistically more common around young stars in the outer disk of the Milky Way. Rocky super-Earths, undifferentiated planets, and still hypothetical carbon planets have the lowest plate yielding efficiencies found in our study. This work aids exoplanet characterization and helps explore the fundamental drivers of plate tectonics.

  12. Does the uncertainty in the representation of terrestrial water flows affect precipitation predictability? A WRF-Hydro ensemble analysis for Central Europe

    NASA Astrophysics Data System (ADS)

    Arnault, Joel; Rummler, Thomas; Baur, Florian; Lerch, Sebastian; Wagner, Sven; Fersch, Benjamin; Zhang, Zhenyu; Kerandi, Noah; Keil, Christian; Kunstmann, Harald

    2017-04-01

    Precipitation predictability can be assessed by the spread within an ensemble of atmospheric simulations perturbed in the initial conditions, lateral boundary conditions, and/or modelled processes within a range of uncertainty. Surface-related processes are more likely to change precipitation when the synoptic forcing is weak. This study investigates the effect of uncertainty in the representation of terrestrial water flows on precipitation predictability. The tools used for this investigation are the Weather Research and Forecasting (WRF) model and its hydrologically enhanced version WRF-Hydro, applied over Central Europe during April-October 2008. The WRF grid is that of COSMO-DE, with a resolution of 2.8 km. In WRF-Hydro, the WRF grid is coupled with a sub-grid at 280 m resolution to resolve lateral terrestrial water flows. Vertical flow uncertainty is considered by modifying the parameter controlling the partitioning between surface runoff and infiltration in WRF, and horizontal flow uncertainty is considered by comparing WRF with WRF-Hydro. Precipitation predictability is deduced from the spread of an ensemble based on three turbulence parameterizations. Model results are validated with E-OBS precipitation and surface temperature, ESA-CCI soil moisture, FLUXNET-MTE surface evaporation and GRDC discharge. It is found that the uncertainty in the representation of terrestrial water flows is more likely to significantly affect precipitation predictability when surface flux spatial variability is high. In comparison to the WRF ensemble, WRF-Hydro slightly improves the adjusted continuous ranked probability score of daily precipitation. The reproduction of observed daily discharge, with Nash-Sutcliffe model efficiency coefficients of up to 0.91, demonstrates the potential of WRF-Hydro for flood forecasting.
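
The continuous ranked probability score used to compare the ensembles can be estimated directly from ensemble members. A minimal sketch of the standard empirical estimator (the plain form, not the adjusted variant used in the study):

```python
def crps_ensemble(members, obs):
    """Empirical CRPS for one ensemble forecast and one observation (lower is better).

    CRPS = E|X - y| - 0.5 * E|X - X'|, with expectations estimated over
    the ensemble members. For a single-member (deterministic) forecast it
    reduces to the absolute error.
    """
    m = len(members)
    term1 = sum(abs(x - obs) for x in members) / m
    term2 = sum(abs(x - y) for x in members for y in members) / (2 * m * m)
    return term1 - term2
```

Averaging this score over days and grid points gives the kind of aggregate probabilistic skill measure on which WRF-Hydro's slight improvement over the WRF ensemble was judged.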

  13. Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens

    PubMed Central

    Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.

    2012-01-01

    This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. 
An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
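
The input-output correlation screening mentioned above can be sketched with rank correlations between Monte Carlo inputs and the model output: parameters whose sampled values correlate strongly with the output are the important uncertainties. The toy model and parameter names below are hypothetical stand-ins, not the study's pathogen transport model:

```python
import random

def spearman(x, y):
    """Spearman rank correlation (no tie handling; fine for continuous draws)."""
    def ranks(v):
        order = sorted(range(len(v)), key=v.__getitem__)
        r = [0.0] * len(v)
        for pos, idx in enumerate(order):
            r[idx] = float(pos)
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# Toy risk model: output driven mostly by a deposition rate k_dep,
# only weakly by a decay rate k_decay (both hypothetical stand-ins).
k_dep = [random.uniform(0.1, 1.0) for _ in range(500)]
k_decay = [random.uniform(0.1, 1.0) for _ in range(500)]
risk = [d * 10.0 + c * 0.1 for d, c in zip(k_dep, k_decay)]

influence_dep = spearman(k_dep, risk)      # near 1: important uncertainty
influence_decay = spearman(k_decay, risk)  # near 0: low research priority
```

Ranking parameters by the magnitude of such correlations is one simple way to turn a Monte Carlo uncertainty analysis into the research-priority list the paper proposes.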

  14. Evaluation of uncertainties in the CRCM-simulated North American climate

    NASA Astrophysics Data System (ADS)

    de Elía, Ramón; Caya, Daniel; Côté, Hélène; Frigon, Anne; Biner, Sébastien; Giguère, Michel; Paquin, Dominique; Harvey, Richard; Plummer, David

    2008-02-01

    This work is a first step in the analysis of uncertainty sources in the RCM-simulated climate over North America. Three main sets of sensitivity studies were carried out. The first estimates the magnitude of internal variability, which is needed to evaluate the significance of changes in the simulated climate induced by any model modification. The second is devoted to the role of CRCM configuration as a source of uncertainty, in particular the sensitivity to nesting technique, domain size, and driving reanalysis. The third study aims to assess the relative importance of the previously estimated sensitivities by performing two additional sensitivity experiments: one in which the reanalysis driving data are replaced by data generated by the second-generation Coupled Global Climate Model (CGCM2), and another in which a different CRCM version is used. Results show that the internal variability, triggered by differences in initial conditions, is much smaller than the sensitivity to any other source. Results also show that the levels of uncertainty originating from freedom of choice in the configuration parameters are comparable among themselves and smaller than those due to the choice of CGCM or CRCM version. These results suggest that the uncertainty originating from the latitude in CRCM configuration (freedom of choice among domain sizes, nesting techniques and reanalysis datasets), although important, does not seem to be a major obstacle to climate downscaling. Finally, with the aim of evaluating the combined effect of the different uncertainties, the ensemble spread is estimated for a subset of the analysed simulations. Results show that downscaled surface temperature is in general more uncertain in the northern regions, while precipitation is more uncertain in the central and eastern US.

  15. Using dry and wet year hydroclimatic extremes to guide future hydrologic projections

    NASA Astrophysics Data System (ADS)

    Oni, Stephen; Futter, Martyn; Ledesma, Jose; Teutschbein, Claudia; Buttle, Jim; Laudon, Hjalmar

    2016-07-01

    There is a growing number of studies on climate change impacts on forest hydrology, but limited attempts have been made to use current hydroclimatic variability to constrain projections of future climatic conditions. Here we used historical wet and dry years as a proxy for expected future extreme conditions in a boreal catchment. We showed that runoff could be underestimated by at least 35% when dry-year parameterizations were used for wet-year conditions. Uncertainty analysis showed that behavioural parameter sets from wet and dry years separated mainly on precipitation-related parameters and, to a lesser extent, on parameters related to landscape processes, while uncertainties inherent in the climate models (as opposed to differences in calibration or performance metrics) appeared to drive the overall uncertainty in runoff projections under dry and wet hydroclimatic conditions. Hydrologic model calibration for climate impact studies could be based on years that closely approximate anticipated conditions, to better constrain uncertainty when projecting extreme conditions in boreal and temperate regions.

  16. Communicating Uncertain Science to the Public: How Amount and Source of Uncertainty Impact Fatalism, Backlash, and Overload

    PubMed Central

    Jensen, Jakob D.; Pokharel, Manusheela; Scherr, Courtney L.; King, Andy J.; Brown, Natasha; Jones, Christina

    2016-01-01

    Public dissemination of scientific research often focuses on the finding (e.g., nanobombs kill lung cancer) rather than the uncertainty/limitations (e.g., in mice). Adults (N = 880) participated in an experiment where they read a manipulated news report about cancer research that (a) contained either low or high uncertainty and (b) was attributed either to the scientists responsible for the research (disclosure condition) or to an unaffiliated scientist (dueling condition). Compared to the dueling condition, the disclosure condition triggered less prevention-focused cancer fatalism and nutritional backlash. PMID:26973157

  17. Communicating Uncertain Science to the Public: How Amount and Source of Uncertainty Impact Fatalism, Backlash, and Overload.

    PubMed

    Jensen, Jakob D; Pokharel, Manusheela; Scherr, Courtney L; King, Andy J; Brown, Natasha; Jones, Christina

    2017-01-01

    Public dissemination of scientific research often focuses on the finding (e.g., nanobombs kill lung cancer) rather than the uncertainty/limitations (e.g., in mice). Adults (n = 880) participated in an experiment where they read a manipulated news report about cancer research that (a) contained either low or high uncertainty and (b) was attributed either to the scientists responsible for the research (disclosure condition) or to an unaffiliated scientist (dueling condition). Compared to the dueling condition, the disclosure condition triggered less prevention-focused cancer fatalism and nutritional backlash. © 2016 Society for Risk Analysis.

  18. Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support

    NASA Astrophysics Data System (ADS)

    Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.

    2016-12-01

    Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. 
We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.

  19. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    NASA Astrophysics Data System (ADS)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets a fundamental bound on predicting the outcomes of measurements of a pair of incompatible observables. In this work, we develop the dynamical features of quantum-memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolution of the entropic uncertainty with respect to measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. We find that a larger coupling strength J, for both the ferromagnetic (J < 0) and anti-ferromagnetic (J > 0) chains, can effectively reduce the measurement uncertainty. Besides, it turns out that higher temperature induces inflation of the uncertainty, because the thermal entanglement becomes relatively weak in this scenario, and the uncertainty exhibits distinct dynamical behavior when an inhomogeneous magnetic field is present: with growing magnetic field |B|, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that Adabi et al.'s result is optimal. Moreover, we investigate the mixedness of the system of interest, which is closely associated with the uncertainty. Remarkably, we put forward a possible physical interpretation of the evolution of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Our explorations may thus shed light on the entropic uncertainty in the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state quantum information processing.
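
For context, the "initial bound proposed by Berta et al." referred to above is the standard memory-assisted entropic uncertainty relation; in common notation (not taken from this record):

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c \;=\; \max_{x,z}\,\bigl|\langle \psi_x \mid \phi_z \rangle\bigr|^2 ,
```

where S(X|B) and S(Z|B) are the conditional von Neumann entropies of the two measurement outcomes given the quantum memory B, c quantifies the complementarity of the observables through their eigenbases, and S(A|B) can be negative for entangled states, which tightens the bound, consistent with the abstract's observation that weaker thermal entanglement inflates the uncertainty.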

  20. Pain uncertainty in patients with fibromyalgia, yoga practitioners, and healthy volunteers.

    PubMed

    Bradshaw, David H; Donaldson, Gary W; Okifuji, Akiko

    2012-01-01

    Uncertainty about potentially painful events affects how pain is experienced. Individuals with fibromyalgia (FM) often exhibit anxiety and catastrophic thoughts regarding pain and difficulties dealing with pain uncertainty. The effects of pain uncertainty in predictably high odds (HO), predictably low odds (LO), and even odds (EO) conditions on subjective ratings of pain (PR) and skin conductance responses (SCR) following the administration of a painful stimulus were examined for individuals with fibromyalgia (IWFM), healthy volunteers (HVs), and yoga practitioners (YPs). We hypothesized IWFM would demonstrate the greatest physiological reactivity to pain uncertainty, followed by HVs and YPs, respectively. Nine IWFM, 7 YPs, and 10 HVs participated. Custom contrast estimates comparing responses for HO, LO, and EO pain conditions showed higher SCR for IWFM (CE = 1.27, p = 0.01) but not for HVs or for YPs. PR for the EO condition were significantly greater than for HO and LO conditions for IWFM (CE = 0.60, p = 0.012) but not for HVs or YPs. YPs had lower SCR and PR than did HVs. Results show that uncertainty regarding pain increases the experience of pain, whereas certainty regarding pain may reduce pain ratings for individuals with fibromyalgia.

  1. Technical notes: A detailed study for the provision of measurement uncertainty and traceability for goniospectrometers

    NASA Astrophysics Data System (ADS)

    Peltoniemi, Jouni I.; Hakala, Teemu; Suomalainen, Juha; Honkavaara, Eija; Markelin, Lauri; Gritsevich, Maria; Eskelinen, Juho; Jaanson, Priit; Ikonen, Erkki

    2014-10-01

    The measurement uncertainty and traceability of the Finnish Geodetic Institute's field gonio-spectro-polarimeter FIGIFIGO have been assessed. First, the reference standard (a Spectralon sample) was measured at the National Standard Laboratory of MIKES-Aalto. This standard was transferred to FGI's field reference standard (a larger Spectralon sample), and from that to the unmanned aerial vehicle (UAV) reference standards (1 m2 plates). The reflectance measurement uncertainty of FIGIFIGO has been estimated to be 0.01 in ideal laboratory conditions, but about 0.02-0.05 in typical field conditions, and larger at larger solar or observation zenith angles. Target-specific uncertainties can increase the total uncertainty even to 0.1-0.2. The angular reading uncertainty is between 1° and 3°, depending on user selection, and the polarisation uncertainty is around 0.01. For the UAV, the transferred reflectance uncertainty is about 0.05-0.1, depending on how ideal the measurement conditions are. The design concept of FIGIFIGO has been proved to have a number of advantages, such as a well-adopted user-friendly interface, a high level of automation and excellent suitability for field measurements. It is a perfect instrument for the collection of reference data on a given target in natural (and well-recorded) conditions. In addition to the strong points of FIGIFIGO, the current study reveals several issues that need further attention, such as the field of view, illumination quality, polarisation calibration, and Spectralon reflectance and polarisation properties in the 1000-2400 nm range.
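
Independent uncertainty components like those quoted above are typically combined in quadrature (GUM-style root-sum-square, assuming uncorrelated components); a minimal sketch with invented component values, not the FIGIFIGO budget itself:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square combination of independent standard
    uncertainty components (GUM-style; correlations neglected)."""
    return math.sqrt(sum(u * u for u in components))

# Invented illustration: instrument (0.02), angular-geometry (0.03)
# and target-specific (0.04) reflectance uncertainty components.
u_total = combined_standard_uncertainty([0.02, 0.03, 0.04])
print(round(u_total, 4))  # 0.0539
```

Note how a single dominant component (here the target-specific one) largely controls the combined value, which is why target-specific effects can push the total toward the 0.1-0.2 range quoted above.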

  2. A New Framework for Quantifying Lidar Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.

    2017-03-24

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.
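
The "reconstruction of the three-dimensional wind field" step can be illustrated with a generic least-squares retrieval from line-of-sight speeds; the beam geometry below is invented for illustration and is not taken from the framework described here:

```python
import numpy as np

def reconstruct_wind(azimuths_deg, elevations_deg, v_los):
    """Least-squares retrieval of the wind vector (u, v, w) from
    line-of-sight (LOS) speeds measured along several lidar beams.

    Each LOS speed is the projection of the wind onto the beam's unit
    vector, so v_los = A @ [u, v, w] with A built from beam geometry.
    """
    az = np.radians(azimuths_deg)
    el = np.radians(elevations_deg)
    # Beam unit vectors in (east, north, up) components.
    A = np.column_stack([np.cos(el) * np.sin(az),
                         np.cos(el) * np.cos(az),
                         np.sin(el)])
    wind, *_ = np.linalg.lstsq(A, v_los, rcond=None)
    return wind

# Synthetic check: project a known wind onto 5 beams, then recover it.
true_wind = np.array([8.0, 3.0, 0.2])          # u, v, w in m/s
az = np.array([0.0, 72.0, 144.0, 216.0, 288.0])
el = np.full(5, 62.0)
A = np.column_stack([np.cos(np.radians(el)) * np.sin(np.radians(az)),
                     np.cos(np.radians(el)) * np.cos(np.radians(az)),
                     np.sin(np.radians(el))])
v_los = A @ true_wind
print(np.allclose(reconstruct_wind(az, el, v_los), true_wind))  # True
```

In this formulation, atmosphere-dependent error sources (shear, turbulence) enter as perturbations to `v_los` and to the homogeneity assumption behind the linear model, which is where a condition-adaptive uncertainty framework would attach.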

  3. "The more you know, the more you realise it is really challenging to do": Tensions and uncertainties in person-centred support for people with long-term conditions.

    PubMed

    Entwistle, Vikki A; Cribb, Alan; Watt, Ian S; Skea, Zoë C; Owens, John; Morgan, Heather M; Christmas, Simon

    2018-03-30

    To identify and examine tensions and uncertainties in person-centred approaches to self-management support - approaches that take patients seriously as moral agents and orient support to enable them to live (and die) well on their own terms. Interviews with 26 UK clinicians about working with people with diabetes or Parkinson's disease, conducted within a broader interdisciplinary project on self-management support. The analysis reported here was informed by philosophical reasoning and discussions with stakeholders. Person-centred approaches require clinicians to balance tensions between the many things that can matter in life, and their own and each patient's perspectives on these. Clinicians must ensure that their supportive efforts do not inadvertently disempower people. When attending to someone's particular circumstances and perspectives, they sometimes face intractable uncertainties, including about what is most important to the person and what, realistically, the person can or could do and achieve. The kinds of professional judgement that person-centred working necessitates are not always acknowledged and supported. Practical and ethical tensions are inherent in person-centred support and need to be better understood and addressed. Professional development and service improvement initiatives should recognise these tensions and uncertainties and support clinicians to navigate them well. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Investment, regulation, and uncertainty: managing new plant breeding techniques.

    PubMed

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of the uncertainty regarding regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline.

  5. Investment, regulation, and uncertainty

    PubMed Central

    Smyth, Stuart J; McDonald, Jillian; Falck-Zepeda, Jose

    2014-01-01

    As with any technological innovation, time refines the technology, improving upon the original version of the innovative product. The initial GM crops had single traits for either herbicide tolerance or insect resistance. Current varieties have both of these traits stacked together, and in many cases other abiotic and biotic traits have also been stacked. This innovation requires investment. While this is relatively straightforward, certain conditions need to exist such that investments can be facilitated. The principal requirement for investment is that regulatory frameworks render consistent and timely decisions. If the certainty of regulatory outcomes weakens, the potential for changes in investment patterns increases. This article provides a summary background to the leading plant breeding technologies that are either currently being used to develop new crop varieties or are in the pipeline to be applied to plant breeding within the next few years. Challenges for existing regulatory systems are highlighted. Utilizing an option value approach from the investment literature, an assessment of the uncertainty regarding regulatory approval for these varying techniques is undertaken. This research highlights which technology development options have the greatest degree of uncertainty and hence which ones might be expected to see an investment decline. PMID:24499745

  6. The Pliocene Model Intercomparison Project (PlioMIP) Phase 2: scientific objectives and experimental design

    NASA Astrophysics Data System (ADS)

    Haywood, Alan M.; Dowsett, Harry J.; Dolan, Aisling M.; Rowley, David; Abe-Ouchi, Ayako; Otto-Bliesner, Bette; Chandler, Mark A.; Hunter, Stephen J.; Lunt, Daniel J.; Pound, Matthew; Salzmann, Ulrich

    2016-03-01

    The Pliocene Model Intercomparison Project (PlioMIP) is a co-ordinated international climate modelling initiative to study and understand climate and environments of the Late Pliocene, as well as their potential relevance in the context of future climate change. PlioMIP examines the consistency of model predictions in simulating Pliocene climate and their ability to reproduce climate signals preserved by geological climate archives. Here we provide a description of the aim and objectives of the next phase of the model intercomparison project (PlioMIP Phase 2), and we present the experimental design and boundary conditions that will be utilized for climate model experiments in Phase 2. Following on from PlioMIP Phase 1, Phase 2 will continue to be a mechanism for sampling structural uncertainty within climate models. However, Phase 1 demonstrated the requirement to better understand boundary condition uncertainties as well as uncertainty in the methodologies used for data-model comparison. Therefore, our strategy for Phase 2 is to utilize state-of-the-art boundary conditions that have emerged over the last 5 years. These include a new palaeogeographic reconstruction, detailing ocean bathymetry and land-ice surface topography. The ice surface topography is built upon the lessons learned from offline ice sheet modelling studies. Land surface cover has been enhanced by recent additions of Pliocene soils and lakes. Atmospheric reconstructions of palaeo-CO2 are emerging on orbital timescales, and these are also incorporated into PlioMIP Phase 2. New records of surface and sea surface temperature change are being produced that will be more temporally consistent with the boundary conditions and forcings used within models. Finally we have designed a suite of prioritized experiments that tackle issues surrounding the basic understanding of the Pliocene and its relevance in the context of future climate change in a discrete way.

  7. The Pliocene Model Intercomparison Project (PlioMIP) Phase 2: Scientific Objectives and Experimental Design

    NASA Technical Reports Server (NTRS)

    Haywood, Alan M.; Dowsett, Harry J.; Dolan, Aisling M.; Rowley, David; Abe-Ouchi, Ayako; Otto-Bliesner, Bette; Chandler, Mark A.; Hunter, Stephen J.; Lunt, Daniel J.; Pound, Matthew

    2016-01-01

    The Pliocene Model Intercomparison Project (PlioMIP) is a co-ordinated international climate modelling initiative to study and understand climate and environments of the Late Pliocene, as well as their potential relevance in the context of future climate change. PlioMIP examines the consistency of model predictions in simulating Pliocene climate and their ability to reproduce climate signals preserved by geological climate archives. Here we provide a description of the aim and objectives of the next phase of the model intercomparison project (PlioMIP Phase 2), and we present the experimental design and boundary conditions that will be utilized for climate model experiments in Phase 2. Following on from PlioMIP Phase 1, Phase 2 will continue to be a mechanism for sampling structural uncertainty within climate models. However, Phase 1 demonstrated the requirement to better understand boundary condition uncertainties as well as uncertainty in the methodologies used for data-model comparison. Therefore, our strategy for Phase 2 is to utilize state-of-the-art boundary conditions that have emerged over the last 5 years. These include a new palaeogeographic reconstruction, detailing ocean bathymetry and land-ice surface topography. The ice surface topography is built upon the lessons learned from offline ice sheet modelling studies. Land surface cover has been enhanced by recent additions of Pliocene soils and lakes. Atmospheric reconstructions of palaeo-CO2 are emerging on orbital timescales, and these are also incorporated into PlioMIP Phase 2. New records of surface and sea surface temperature change are being produced that will be more temporally consistent with the boundary conditions and forcings used within models. Finally we have designed a suite of prioritized experiments that tackle issues surrounding the basic understanding of the Pliocene and its relevance in the context of future climate change in a discrete way.

  8. Impact of Land Model Calibration on Coupled Land-Atmosphere Prediction

    NASA Technical Reports Server (NTRS)

    Santanello, Joseph A., Jr.; Kumar, Sujay V.; Peters-Lidard, Christa D.; Harrison, Ken; Zhou, Shujia

    2012-01-01

    Land-atmosphere (L-A) interactions play a critical role in determining the diurnal evolution of both planetary boundary layer (PBL) and land surface heat and moisture budgets, as well as controlling feedbacks with clouds and precipitation that lead to the persistence of dry and wet regimes. Recent efforts to quantify the strength of L-A coupling in prediction models have produced diagnostics that integrate across both the land and PBL components of the system. In this study, we examine the impact of improved specification of land surface states, anomalies, and fluxes on coupled WRF forecasts during summers of extreme dry and wet land surface conditions in the U.S. Southern Great Plains. The improved land initialization and surface flux parameterizations are obtained through calibration of the Noah land surface model using the new optimization and uncertainty estimation subsystem in NASA's Land Information System (LIS-OPT/UE). The impact of the calibration on (a) the spinup of the land surface used as initial conditions, and (b) the simulated heat and moisture states and fluxes of the coupled WRF simulations is then assessed. Changes in ambient weather and land-atmosphere coupling are evaluated along with measures of uncertainty propagation into the forecasts. In addition, the sensitivity of this approach to the period of calibration (dry, wet, average) is investigated. Results indicate that the offline calibration leads to systematic improvements in land-PBL fluxes and near-surface temperature and humidity, and in the process provides guidance on the questions of what, how, and when to calibrate land surface models for coupled model prediction.

  9. Orion Handling Qualities During ISS Rendezvous and Docking

    NASA Technical Reports Server (NTRS)

    Hart, Jeremy J.; Stephens, J. P.; Spehar, P.; Bilimoria, K.; Foster, C.; Gonzalex, R.; Sullivan, K.; Jackson, B.; Brazzel, J.; Hart, J.

    2011-01-01

    The Orion spacecraft was designed to rendezvous with multiple vehicles in low earth orbit (LEO) and beyond. To perform the required rendezvous and docking task, Orion must provide enough control authority to perform coarse translational maneuvers while maintaining precision to perform the delicate docking corrections. While Orion has autonomous docking capabilities, it is expected that final approach and docking operations with the International Space Station (ISS) will initially be performed in a manual mode. A series of evaluations was conducted by NASA and Lockheed Martin at the Johnson Space Center to determine the handling qualities (HQ) of the Orion spacecraft during different docking and rendezvous conditions using the Cooper-Harper scale. This paper will address the specifics of the handling qualities methodology, vehicle configuration, scenarios flown, data collection tools, and subject ratings and comments. The initial Orion HQ assessment examined Orion docking to the ISS. This scenario demonstrates the Translational Hand Controller (THC) handling qualities of Orion. During this initial assessment, two different scenarios were evaluated. The first was a nominal docking approach to a stable ISS, with Orion initializing with relative position dispersions and a closing rate of approximately 0.1 ft/sec. The second docking scenario was identical to the first, except that the attitude motion of the ISS was modeled to simulate a stress case (1 degree deadband per axis and 0.01 deg/sec rate deadband per axis). For both scenarios, subjects started each run on final approach at a docking port-to-port range of 20 ft. Subjects used the THC in pulse mode with cues from the docking camera image, window views, and range and range rate data displayed on the Orion display units. As in the actual design, the attitude of the Orion vehicle was held by the automated flight control system at 0.5 degree deadband per axis. 
Several error sources were modeled including Reaction Control System (RCS) jet angular and position misalignment, RCS thrust magnitude uncertainty, RCS jet force direction uncertainty due to self plume impingement, and Orion center of mass uncertainty.

  10. Observability considerations for multi-sensor and product fusion: Bias, information content, and validation (Invited)

    NASA Astrophysics Data System (ADS)

    Reid, J. S.; Zhang, J.; Hyer, E. J.; Campbell, J. R.; Christopher, S. A.; Ferrare, R. A.; Leptoukh, G. G.; Stackhouse, P. W.

    2009-12-01

    With the successful development of many aerosol products from the NASA A-train as well as new operational geostationary and polar orbiting sensors, the scientific community now has a host of new parameters to use in their analyses. The variety and quality of products has reached a point where the community has moved from basic observation-based science to sophisticated multi-component research that addresses the complex atmospheric environment. In order for these satellite data to contribute to the science, their uncertainty levels must move from semi-quantitative to quantitative. Initial attempts to quantify uncertainties have led to some recent debate in the community as to the efficacy of aerosol products from current and future NASA satellite sensors. In an effort to understand the state of satellite product fidelity, the Naval Research Laboratory and a newly reformed Global Energy and Water Cycle Experiment (GEWEX) aerosol panel have both initiated assessments of the nature of aerosol remote sensing uncertainty and bias. In this talk we go over areas of specific concern based on the authors’ experiences with the data, emphasizing the multi-sensor problem. We first enumerate potential biases, including retrieval, sampling/contextual, and cognitive bias. We show examples of how these biases can subsequently lead to the pitfalls of correlated/compensating errors, tautology, and confounding. The nature of bias is closely related to the information content of the sensor signal and its subsequent application to the derived aerosol quantity of interest (e.g., optical depth, flux, index of refraction, etc.). Consequently, purpose-specific validation methods must be employed, especially when generating multi-sensor products. Indeed, cloud and lower boundary condition biases in particular complicate the more typical methods of regressional bias elimination and histogram matching. 
We close with a discussion of sequestration of uncertainty in multi-sensor applications of these products in both pair-wise and fused fashions.

  11. Communicating Uncertainty to the Public During Volcanic Unrest and Eruption - A Case Study From the 2004-2005 Eruption of Mount St. Helens, USA

    NASA Astrophysics Data System (ADS)

    Gardner, C. A.; Pallister, J. S.

    2005-12-01

    The earthquake swarm beneath Mount St. Helens that began on 23 September 2004 did not initially appear different from previous swarms (none of which culminated in an eruption) that had occurred beneath the volcano since the end of the 1980-1986 eruptions. Three days into the swarm, however, a burst of larger-magnitude earthquakes indicated that this swarm was indeed different and prompted the U.S. Geological Survey's Cascades Volcano Observatory (CVO) to issue a change in alert level, the first time such a change had been issued in the Cascades in over 18 years. From then on, the unrest accelerated quickly as did the need to communicate the developing conditions to the public and public officials, often in the spotlight of intense media attention. Within three weeks of the onset of unrest, magma reached the surface. Since mid-October 2004, lava has been extruding through a glacier within the crater of Mount St. Helens, forming a 60 Mm3 dome by August 2005. The rapid onset of the eruption required a rapid ramping up of communication within and among the scientific, emergency-response and land-management communities, as well as the reestablishment of protocols that had not been rigorously tested for 18 years. Early on, daily meetings of scientists from CVO and the University of Washington's Pacific Northwest Seismograph Network were established to discuss incoming monitoring data and to develop a consensus on the likely course of activity, hazard potential and the uncertainty inherent in these forecasts. Subgroups developed scenario maps to describe the range of activity likely under different eruptive behaviors and sizes, and assessed short- and long-term probabilities of eruption, explosivity and hazardous events by employing a probability-tree methodology. 
Resultant consensus information has been communicated to a variety of groups using established alert levels for ground-based and aviation communities, daily updates and media briefings, postings on the worldwide web, teleconferences, and meetings with land and emergency managers. Initial concerns revolved around the questions of if and when an eruption would occur, whether it would be explosive, and how large it would be, all questions without definitive answers. As the eruption progresses, concerns have shifted to whether the eruptive behavior will change and how long the eruption will last, also questions lacking definitive answers. We have found it important, in communicating our uncertainty to the public, to articulate how we came to our conclusions and why our answers cannot be more definitive. We have also found that framing volcanic uncertainty in terms of more common analogies (e.g. knowing that conditions are right for the development of a tornado, but not being able to predict exactly when a funnel cloud will form, precisely where it will touch down, or how severe the damage will be) appears to help the public and public officials better understand volcanic uncertainty. As the eruption continues and people become more accustomed to the activity, we find an increasingly knowledgeable public who can better understand and deal with uncertainty. Also, it is clear that establishing interagency relationships by developing volcano response plans before a crisis greatly facilitates a successful response. A critical component of this planning is discussing the uncertainties inherent in volcanic crises, so that when unrest begins the concept of, and reasons behind, uncertainty are already well understood.
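
The probability-tree methodology mentioned above assigns conditional probabilities to successive branch points and multiplies them along each branch to get the probability of a leaf outcome; a toy sketch with invented numbers, not the values used at Mount St. Helens:

```python
def branch_probability(conditionals):
    """Probability of the outcome at a leaf of an event tree:
    the product of the conditional probabilities along its branch."""
    p = 1.0
    for c in conditionals:
        p *= c
    return p

# Hypothetical branch: an eruption occurs (0.7), it is explosive given
# an eruption (0.3), and the explosion is large given explosive (0.2).
p_large_explosive = branch_probability([0.7, 0.3, 0.2])
print(round(p_large_explosive, 3))  # 0.042
```

Summing the leaf probabilities over all branches of a well-formed tree returns 1, which provides a useful consistency check when experts assign the conditional values.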

  12. Parameter Uncertainty on AGCM-simulated Tropical Cyclones

    NASA Astrophysics Data System (ADS)

    He, F.

    2015-12-01

    This work studies parameter uncertainty in tropical cyclone (TC) simulations in Atmospheric General Circulation Models (AGCMs) using the Reed-Jablonowski TC test case, implemented in the Community Atmosphere Model (CAM). It examines the impact of 24 parameters across the physical parameterization schemes that represent convection, turbulence, precipitation and cloud processes in AGCMs. The one-at-a-time (OAT) sensitivity analysis method first quantifies their relative importance for TC simulations and identifies the key parameters for six TC characteristics: intensity, precipitation, longwave cloud radiative forcing (LWCF), shortwave cloud radiative forcing (SWCF), cloud liquid water path (LWP) and ice water path (IWP). Then, 8 physical parameters are chosen and perturbed using the Latin-Hypercube Sampling (LHS) method. The comparison between the OAT and LHS ensemble runs shows that the simulated TC intensity is mainly affected by the parcel fractional mass entrainment rate in the Zhang-McFarlane (ZM) deep convection scheme. The nonlinear interactive effect among different physical parameters is negligible for simulated TC intensity. In contrast, this nonlinear interactive effect plays a significant role in the other simulated tropical cyclone characteristics (precipitation, LWCF, SWCF, LWP and IWP) and greatly enlarges their simulated uncertainties. The statistical emulator Extended Multivariate Adaptive Regression Splines (EMARS) is applied to characterize the response functions for the nonlinear effect. Last, we find that the intensity uncertainty caused by physical parameters is comparable in magnitude to the uncertainty caused by model structure (e.g. grid) and initial conditions (e.g. sea surface temperature, atmospheric moisture). These findings suggest the importance of using the perturbed physics ensemble (PPE) method to revisit tropical cyclone prediction under climate change scenarios.
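
The Latin-Hypercube Sampling step described above can be sketched in NumPy; the two parameters and their ranges below are placeholders, not the 8 CAM parameters perturbed in the study:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng):
    """Latin-Hypercube Sampling: each parameter's range is split into
    n_samples equal strata, one uniform point is drawn per stratum, and
    the strata are shuffled independently for each parameter.

    bounds: sequence of (low, high) pairs, one per parameter.
    Returns an (n_samples, n_params) design matrix.
    """
    bounds = np.asarray(bounds, dtype=float)
    n_params = bounds.shape[0]
    # Stratified uniform draws in [0, 1): one point per interval.
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_params))) / n_samples
    for j in range(n_params):          # shuffle strata per parameter
        rng.shuffle(u[:, j])
    low, high = bounds[:, 0], bounds[:, 1]
    return low + u * (high - low)

rng = np.random.default_rng(42)
# Two hypothetical parameters, e.g. an entrainment rate and an
# autoconversion threshold, with invented ranges.
design = latin_hypercube(8, [(1e-4, 2e-3), (0.5, 5.0)], rng)
print(design.shape)  # (8, 2)
```

Unlike the OAT method, which varies one parameter at a time about a default, an LHS design of this kind perturbs all parameters simultaneously, which is what exposes the nonlinear interactive effects the abstract describes.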

  13. Amphetamine-induced sensitization and reward uncertainty similarly enhance incentive salience for conditioned cues

    PubMed Central

    Robinson, Mike J.F.; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C.

    2015-01-01

    Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine sensitization or repeated restraint stress interact with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here, rats were tested in three successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else were not sensitized or stressed as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the lever CS+ versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also report that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions did not combine to elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction. PMID:26076340

  14. Dynamics of entropic uncertainty for atoms immersed in thermal fluctuating massless scalar field

    NASA Astrophysics Data System (ADS)

    Huang, Zhiming

    2018-04-01

    In this article, the dynamics of the quantum memory-assisted entropic uncertainty relation for two atoms immersed in a thermal bath of fluctuating massless scalar field is investigated. The master equation that governs the system's evolution is derived. It is found that the mixedness is closely associated with the entropic uncertainty. For the equilibrium state, the tightness of the uncertainty vanishes. For an initial maximally entangled state, the tightness of the uncertainty undergoes a slight increase and then declines to zero with evolution time. Temperature can increase the uncertainty, but the two-atom separation does not always do so. With evolution time, the uncertainty settles to different, relatively stable values for different temperatures and converges to a fixed value for different two-atom distances. Furthermore, weak measurement reversal is employed to control the entropic uncertainty.
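    For context, the quantum-memory-assisted entropic uncertainty relation referred to above is usually quoted in the Berta et al. form (given here as background; the paper's own treatment for the scalar-field bath may use a variant):

```latex
S(X|B) + S(Z|B) \;\geq\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j} \bigl| \langle \psi_i \mid \phi_j \rangle \bigr|^2 ,
```

    where $|\psi_i\rangle$ and $|\phi_j\rangle$ are the eigenstates of the two measured observables $X$ and $Z$, $c$ is their maximal overlap, and $S(\cdot|B)$ is the conditional von Neumann entropy given the memory $B$. When $B$ is entangled with the measured atom, $S(A|B)$ can be negative and tightens the bound, which is why the initial maximally entangled state evolves differently from the equilibrium state.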

  15. Assessment of the Gaussian Covariance Approximation over an Earth-Asteroid Encounter Period

    NASA Technical Reports Server (NTRS)

    Mattern, Daniel

    2017-01-01

    In assessing the risk an asteroid may pose to the Earth, the asteroid's state is often predicted years, and often decades, into the future. Only by accounting for the asteroid's initial state uncertainty can a measure of the risk be calculated. With the state uncertainty growing as a function of the initial velocity uncertainty, the orbital velocity at the last state update, and the time from the last update to the epoch of interest, the asteroid's position uncertainties can grow to many times the size of the Earth when propagated to the encounter risk corridor. This paper examines the merits of propagating the asteroid's state covariance analytically. The results of this study help to bound the efficacy of applying different metrics for assessing the risk an asteroid poses to the Earth. Additionally, this work identifies a criterion for when different covariance propagation methods are needed to continue predictions after an Earth-encounter period.
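    A minimal sketch of the analytical (linear, Gaussian) covariance propagation being assessed, using a toy position/velocity state and a constant-velocity state-transition matrix (the numbers are illustrative, not those of any real asteroid):

```python
import numpy as np

def propagate_covariance(P0, Phi):
    """Analytical (linear, Gaussian) covariance propagation:
    P(t) = Phi P(0) Phi^T, with Phi the state-transition matrix."""
    return Phi @ P0 @ Phi.T

# toy 1-D position/velocity state over a 10-year prediction span
dt = 10.0 * 365.25 * 86400.0               # seconds
Phi = np.array([[1.0, dt],                 # x(t) = x0 + v0 * dt
                [0.0, 1.0]])
P0 = np.diag([1.0e6, 1.0e-4])              # (1 km)^2 position, (1 cm/s)^2 velocity
P = propagate_covariance(P0, Phi)
# position sigma grows from 1 km to sqrt(P[0, 0]), i.e. thousands of km
```

    Even a 1 cm/s velocity uncertainty maps into a position uncertainty of thousands of kilometres after a decade, which is the growth mechanism the abstract describes; the Gaussian approximation fails when the true dynamics over that span are strongly nonlinear, e.g. across a close planetary encounter.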

  16. Tests of oceanic stochastic parameterisation in a seasonal forecast system.

    NASA Astrophysics Data System (ADS)

    Cooper, Fenwick; Andrejczuk, Miroslaw; Juricke, Stephan; Zanna, Laure; Palmer, Tim

    2015-04-01

    Our aim is to compare, over seasonal time scales, the relative impact of ocean initial condition and model uncertainty upon ocean forecast skill and reliability. We compare four oceanic stochastic parameterisation schemes applied in a 1x1 degree ocean model (NEMO) with a fully coupled T159 atmosphere (ECMWF IFS). The relative impacts upon the ocean of the resulting eddy-induced activity, wind forcing and typical initial condition perturbations are quantified. Following the historical success of stochastic parameterisation in the atmosphere, two of the parameterisations tested were multiplicative in nature: a stochastic variation of the Gent-McWilliams scheme and a stochastic diffusion scheme. We also consider a surface flux parameterisation (similar to that introduced by Williams, 2012), and stochastic perturbation of the equation of state (similar to that introduced by Brankart, 2013). The amplitude of the stochastic term in the Williams (2012) scheme was set to the physically reasonable amplitude considered in that paper. The amplitude of the stochastic term in each of the other schemes was increased to the limits of model stability. As expected, variability was increased. Up to 1 month after initialisation, the ensemble spread induced by stochastic parameterisation is greater than that induced by the atmosphere, whilst being smaller than the initial condition perturbations currently used at ECMWF. After 1 month, the wind forcing becomes the dominant source of model ocean variability, even at depth.
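    A toy sketch of what a multiplicative stochastic perturbation of a mixing scheme can look like, here applied to the diffusivity of a 1-D explicit diffusion step (illustrative only; NEMO's actual schemes operate on the full 3-D tracer fields):

```python
import numpy as np

def stochastic_diffusion_step(T, kappa, dx, dt, amp, rng):
    """One explicit step of 1-D diffusion with a multiplicative stochastic
    perturbation of the diffusivity (in the spirit of stochastically
    perturbed mixing schemes; the clipping keeps diffusivity positive).
    Boundary values are held fixed; amplitudes here are illustrative."""
    noise = rng.standard_normal(T.size)
    kappa_eff = kappa * np.clip(1.0 + amp * noise, 0.1, None)
    flux = kappa_eff[:-1] * np.diff(T) / dx        # fluxes between cells
    T = T.copy()
    T[1:-1] += dt / dx * np.diff(flux)             # interior update only
    return T

rng = np.random.default_rng(42)
T = np.array([0.0, 0.0, 1.0, 0.0, 0.0])
for _ in range(10):
    T = stochastic_diffusion_step(T, kappa=1.0, dx=1.0, dt=0.1, amp=0.2, rng=rng)
```

    With amp = 0 the step reduces to ordinary deterministic diffusion; increasing amp injects variability at every step, and pushing it too far destabilises the explicit scheme, which mirrors the "increased to the limits of model stability" tuning described above.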

  17. The cerebellum and decision making under uncertainty.

    PubMed

    Blackwood, Nigel; Ffytche, Dominic; Simmons, Andrew; Bentall, Richard; Murray, Robin; Howard, Robert

    2004-06-01

    This study aimed to identify the neural basis of probabilistic reasoning, a type of inductive inference that aids decision making under conditions of uncertainty. Eight normal subjects performed two separate two-alternative-choice tasks (the balls in a bottle and personality survey tasks) while undergoing functional magnetic resonance imaging (fMRI). The experimental conditions within each task were chosen so that they differed only in their requirement to make a decision under conditions of uncertainty (probabilistic reasoning and frequency determination required) or under conditions of certainty (frequency determination required). The same visual stimuli and motor responses were used in the experimental conditions. We provide evidence that the neo-cerebellum, in conjunction with the premotor cortex, inferior parietal lobule and medial occipital cortex, mediates the probabilistic inferences that guide decision making under uncertainty. We hypothesise that the neo-cerebellum constructs internal working models of uncertain events in the external world, and that such probabilistic models subserve the predictive capacity central to induction. Copyright 2004 Elsevier B.V.

  18. Improving snow density estimation for mapping SWE with Lidar snow depth: assessment of uncertainty in modeled density and field sampling strategies in NASA SnowEx

    NASA Astrophysics Data System (ADS)

    Raleigh, M. S.; Smyth, E.; Small, E. E.

    2017-12-01

    The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.

  19. Visual Semiotics & Uncertainty Visualization: An Empirical Study.

    PubMed

    MacEachren, A M; Roth, R E; O'Brien, J; Li, B; Swingley, D; Gahegan, M

    2012-12-01

    This paper presents two linked empirical studies focused on uncertainty visualization. The experiments are framed from two conceptual perspectives. First, a typology of uncertainty is used to delineate kinds of uncertainty matched with space, time, and attribute components of data. Second, concepts from visual semiotics are applied to characterize the kind of visual signification that is appropriate for representing those different categories of uncertainty. This framework guided the two experiments reported here. The first addresses representation intuitiveness, considering both visual variables and iconicity of representation. The second addresses relative performance of the most intuitive abstract and iconic representations of uncertainty on a map reading task. Combined results suggest initial guidelines for representing uncertainty and discussion focuses on practical applicability of results.

  20. MARS approach for global sensitivity analysis of differential equation models with applications to dynamics of influenza infection.

    PubMed

    Lee, Yeonok; Wu, Hulin

    2012-01-01

    Differential equation models are widely used for the study of natural phenomena in many fields. The study usually involves unknown factors such as initial conditions and/or parameters. It is important to investigate the impact of unknown factors (parameters and initial conditions) on model outputs in order to better understand the system the model represents. Apportioning the uncertainty (variation) of output variables of a model according to the input factors is referred to as sensitivity analysis. In this paper, we focus on the global sensitivity analysis of ordinary differential equation (ODE) models over a time period using the multivariate adaptive regression spline (MARS) as a meta-model, based on the concept of the variance of conditional expectation (VCE). We suggest evaluating the VCE analytically using the MARS model structure of univariate tensor-product functions, which is more computationally efficient. Our simulation studies show that the MARS model approach performs very well and helps to significantly reduce the computational cost. We present an application example of sensitivity analysis of ODE models for influenza infection to further illustrate the usefulness of the proposed method.
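    A hedged sketch of the VCE idea on a toy function, using simple quantile binning in place of the MARS emulator (the emulator is what makes the analytic evaluation efficient in the paper; binning here is only a stand-in):

```python
import numpy as np

def first_order_index(x, y, bins=20):
    """Crude first-order sensitivity index Var_x[E(y|x)] / Var(y): the
    variance of conditional expectation (VCE), with the conditional mean
    estimated by quantile binning (a stand-in for the MARS emulator)."""
    edges = np.quantile(x, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_mean = np.array([y[idx == b].mean() for b in range(bins)])
    counts = np.array([(idx == b).sum() for b in range(bins)])
    vce = np.average((cond_mean - y.mean()) ** 2, weights=counts)
    return vce / y.var()

rng = np.random.default_rng(0)
x1, x2 = rng.random(100_000), rng.random(100_000)
y = 4.0 * x1 + x2          # x1 should explain ~16/17 of the output variance
S1 = first_order_index(x1, y)
```

    For an ODE model, y would be a model output at a given time and x a sampled parameter or initial condition; repeating the calculation at each output time gives the time-varying sensitivity profile the paper studies.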

  1. Shock Layer Radiation Modeling and Uncertainty for Mars Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Brandis, Aaron M.; Sutton, Kenneth

    2012-01-01

    A model for simulating nonequilibrium radiation from Mars entry shock layers is presented. A new chemical kinetic rate model is developed that provides good agreement with recent EAST and X2 shock tube radiation measurements. This model includes a CO dissociation rate that is a factor of 13 larger than the rate used widely in previous models. Uncertainties in the proposed rates are assessed along with uncertainties in translational-vibrational relaxation modeling parameters. The stagnation point radiative flux uncertainty due to these flowfield modeling parameter uncertainties is computed to vary from 50 to 200% for a range of free-stream conditions, with densities ranging from 5e-5 to 5e-4 kg/m3 and velocities ranging from 6.3 to 7.7 km/s. These conditions cover the range of anticipated peak radiative heating conditions for proposed hypersonic inflatable aerodynamic decelerators (HIADs). Modeling parameters for the radiative spectrum are compiled along with a non-Boltzmann rate model for the dominant radiating molecules, CO, CN, and C2. A method for treating non-local absorption in the non-Boltzmann model is developed, which is shown to result in up to a 50% increase in the radiative flux through absorption by the CO 4th Positive band. The sensitivity of the radiative flux to the radiation modeling parameters is presented and the uncertainty for each parameter is assessed. The stagnation point radiative flux uncertainty due to these radiation modeling parameter uncertainties is computed to vary from 18 to 167% for the considered range of free-stream conditions. The total radiative flux uncertainty is computed as the root sum square of the flowfield and radiation parametric uncertainties, which results in total uncertainties ranging from 50 to 260%. The main contributors to these significant uncertainties are the CO dissociation rate and the CO heavy-particle excitation rates.
Applying the baseline flowfield and radiation models developed in this work, the radiative heating for the Mars Pathfinder probe is predicted to be nearly 20 W/cm2. In contrast to previous studies, this value is shown to be significant relative to the convective heating.
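    The root-sum-square combination quoted above can be checked directly (a sketch; only the two parametric uncertainty classes named in the abstract are combined, and independence between them is assumed):

```python
import math

def root_sum_square(*components):
    """Combine independent percentage uncertainties by root-sum-square."""
    return math.sqrt(sum(c * c for c in components))

# the totals quoted above: flowfield (50-200%) and radiation (18-167%)
total_low = root_sum_square(50.0, 18.0)     # ~53%, reported as ~50%
total_high = root_sum_square(200.0, 167.0)  # ~261%, reported as ~260%
```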

  2. Ensemble-based flash-flood modelling: Taking into account hydrodynamic parameters and initial soil moisture uncertainties

    NASA Astrophysics Data System (ADS)

    Edouard, Simon; Vincendon, Béatrice; Ducrocq, Véronique

    2018-05-01

    Intense precipitation events in the Mediterranean often lead to devastating flash floods (FF). FF modelling is affected by several kinds of uncertainties, and Hydrological Ensemble Prediction Systems (HEPS) are designed to take those uncertainties into account. The major source of uncertainty comes from the rainfall forcing, and convective-scale meteorological ensemble prediction systems can manage it for forecasting purposes. But other sources are related to the hydrological modelling part of the HEPS. This study focuses on the uncertainties arising from the hydrological model parameters and initial soil moisture, with the aim of designing an ensemble-based version of a hydrological model dedicated to simulating fast-responding Mediterranean rivers, the ISBA-TOP coupled system. The first step consists of identifying the parameters that have the strongest influence on FF simulations by assuming perfect precipitation. A sensitivity study is carried out first using a synthetic framework and then for several real events and several catchments. Perturbation methods varying the most sensitive parameters as well as initial soil moisture allow the design of an ensemble-based version of ISBA-TOP. The first results of this system on some real events are presented. The direct perspective of this work is to drive this ensemble-based version with the members of a convective-scale meteorological ensemble prediction system, to form a complete HEPS for FF forecasting.

  3. NAIRAS aircraft radiation model development, dose climatology, and initial validation.

    PubMed

    Mertens, Christopher J; Meier, Matthias M; Brown, Steven; Norman, Ryan B; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low.
Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway.

  4. NAIRAS aircraft radiation model development, dose climatology, and initial validation

    NASA Astrophysics Data System (ADS)

    Mertens, Christopher J.; Meier, Matthias M.; Brown, Steven; Norman, Ryan B.; Xu, Xiaojing

    2013-10-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low. 
Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway.

  5. NAIRAS aircraft radiation model development, dose climatology, and initial validation

    PubMed Central

    Mertens, Christopher J; Meier, Matthias M; Brown, Steven; Norman, Ryan B; Xu, Xiaojing

    2013-01-01

    The Nowcast of Atmospheric Ionizing Radiation for Aviation Safety (NAIRAS) is a real-time, global, physics-based model used to assess radiation exposure to commercial aircrews and passengers. The model is a free-running physics-based model in the sense that there are no adjustment factors applied to nudge the model into agreement with measurements. The model predicts dosimetric quantities in the atmosphere from both galactic cosmic rays (GCR) and solar energetic particles, including the response of the geomagnetic field to interplanetary dynamical processes and its subsequent influence on atmospheric dose. The focus of this paper is on atmospheric GCR exposure during geomagnetically quiet conditions, with three main objectives. First, provide detailed descriptions of the NAIRAS GCR transport and dosimetry methodologies. Second, present a climatology of effective dose and ambient dose equivalent rates at typical commercial airline altitudes representative of solar cycle maximum and solar cycle minimum conditions and spanning the full range of geomagnetic cutoff rigidities. Third, conduct an initial validation of the NAIRAS model by comparing predictions of ambient dose equivalent rates with tabulated reference measurement data and recent aircraft radiation measurements taken in 2008 during the minimum between solar cycle 23 and solar cycle 24. By applying the criterion of the International Commission on Radiation Units and Measurements (ICRU) on acceptable levels of aircraft radiation dose uncertainty for ambient dose equivalent greater than or equal to an annual dose of 1 mSv, the NAIRAS model is within 25% of the measured data, which fall within the ICRU acceptable uncertainty limit of 30%. The NAIRAS model predictions of ambient dose equivalent rate are generally within 50% of the measured data for any single-point comparison. The largest differences occur at low latitudes and high cutoffs, where the radiation dose level is low.
Nevertheless, analysis suggests that these single-point differences will be within 30% when a new deterministic pion-initiated electromagnetic cascade code is integrated into NAIRAS, an effort which is currently underway. PMID:26213513

  6. Chaotic dynamics of Comet 1P/Halley: Lyapunov exponent and survival time expectancy

    NASA Astrophysics Data System (ADS)

    Muñoz-Gutiérrez, M. A.; Reyes-Ruiz, M.; Pichardo, B.

    2015-03-01

    The orbital elements of Comet Halley are known to a very high precision, suggesting that the calculation of its future dynamical evolution is straightforward. In this paper we seek to characterize the chaotic nature of the present-day orbit of Comet Halley and to quantify the time-scale over which its motion can be predicted confidently. In addition, we attempt to determine the time-scale over which its present-day orbit will remain stable. Numerical simulations of the dynamics of test particles in orbits similar to that of Comet Halley are carried out with the MERCURY 6.2 code. On the basis of these we construct survival time maps to assess the absolute stability of Halley's orbit, frequency analysis maps to study the variability of the orbit, and we calculate the Lyapunov exponent of the orbit for variations in initial conditions at the level of the present-day uncertainties in our knowledge of its orbital parameters. On the basis of our calculations of the Lyapunov exponent, the chaotic nature of Halley's motion is demonstrated. The e-folding time-scale for the divergence of initially very similar orbits is approximately 70 yr. The sensitivity of the dynamics to initial conditions is also evident in the self-similar character of the survival time and frequency analysis maps in the vicinity of Halley's orbit, which indicates that, on average, it is unstable on a time-scale of hundreds of thousands of years. The chaotic nature of Halley's present-day orbit implies that a precise prediction of its motion, at the level of the present-day observational uncertainty, is not possible beyond a time-scale of approximately 100 yr. Furthermore, we find that the ejection of Halley from the Solar system, or its collision with another body, could occur on a time-scale as short as 10,000 yr.
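    The e-folding time quoted above is the reciprocal of the largest Lyapunov exponent. A minimal Benettin-style estimator is sketched below, demonstrated on the chaotic logistic map as a cheap stand-in for an N-body integration (for r = 4 the exact exponent is ln 2):

```python
import math

def largest_lyapunov(f, x0, eps=1e-9, n_steps=5000, transient=100):
    """Benettin-style estimate of the largest Lyapunov exponent: track the
    separation of two nearby trajectories, accumulate its log-growth, and
    renormalise the offset back to eps after every step."""
    x = x0
    for _ in range(transient):              # discard transient behaviour
        x = f(x)
    y = x + eps
    total = 0.0
    for _ in range(n_steps):
        x, y = f(x), f(y)
        d = max(abs(y - x), 1e-300)         # guard against exact collapse
        total += math.log(d / eps)
        y = x + math.copysign(eps, y - x)   # renormalise the separation
    return total / n_steps

# chaotic logistic map with r = 4: exponent ln 2 per iteration, so the
# e-folding time is 1 / ln 2 iterations (for Halley, ~70 yr per the paper)
lam = largest_lyapunov(lambda u: 4.0 * u * (1.0 - u), 0.3)
```

    In the orbital case, f would be one step of the MERCURY integration and eps a perturbation at the level of the observational uncertainty; the positive exponent is what limits confident prediction to roughly 1/lambda times a modest number of e-foldings.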

  7. Aspects of Geodesical Motion with Fisher-Rao Metric: Classical and Quantum

    NASA Astrophysics Data System (ADS)

    Ciaglia, Florio M.; Cosmo, Fabio Di; Felice, Domenico; Mancini, Stefano; Marmo, Giuseppe; Pérez-Pardo, Juan M.

    The purpose of this paper is to exploit the geometric structure of quantum mechanics and of statistical manifolds to study the qualitative effect that the quantum properties have in the statistical description of a system. We show that the end points of geodesics in the classical setting coincide with the probability distributions that minimise Shannon’s entropy, i.e. with distributions of zero dispersion. In the quantum setting this happens only for particular initial conditions, which in turn correspond to classical submanifolds. This result can be interpreted as a geometric manifestation of the uncertainty principle.

  8. Simulation of Stochastic Processes by Coupled ODE-PDE

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2008-01-01

    A document discusses the emergence of randomness, as a new phenomenon, in solutions of coupled, fully deterministic ODE-PDE (ordinary differential equations-partial differential equations) systems due to failure of the Lipschitz condition. It is possible to exploit the special properties of ordinary differential equations (represented by an arbitrarily chosen dynamical system) coupled with the corresponding Liouville equations (used to describe the evolution of initial uncertainties in terms of a joint probability distribution) in order to simulate stochastic processes with prescribed probability distributions. The important advantage of the proposed approach is that the simulation does not require a random-number generator.
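    A hedged illustration of the ODE-Liouville connection invoked above: propagating an ensemble of uncertain initial conditions through a deterministic ODE is a Monte Carlo solution of the associated Liouville equation (a linear toy problem, not the document's coupled ODE-PDE construction):

```python
import math
import numpy as np

# For the linear ODE dx/dt = -x the flow is x(t) = x(0) * exp(-t), and the
# Liouville equation says the initial density is simply carried along that
# flow: an N(0, 1) initial uncertainty becomes N(0, exp(-2t)).  Pushing a
# large ensemble through the deterministic flow reproduces this density.
rng = np.random.default_rng(1)
x0 = rng.normal(0.0, 1.0, 100_000)   # sample the initial uncertainty
t = 0.5
x_t = x0 * math.exp(-t)              # exact flow of dx/dt = -x
# empirical std of x_t should match the Liouville prediction exp(-t)
```

    The toy still needs a random-number generator to sample the initial density; the document's point is that once the joint density is evolved by the Liouville PDE itself, the coupled deterministic system can generate the prescribed randomness without one.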

  9. A practical method to assess model sensitivity and parameter uncertainty in C cycle models

    NASA Astrophysics Data System (ADS)

    Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy

    2015-04-01

    The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells to several hundred years for the exchange between the atmosphere and the deep ocean and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists in finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well-posed if the following three conditions hold: 1) a solution exists, 2) the solution is unique and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. The inverse problem is often ill-posed; a regularization method is then required to replace the original problem with a well-posed one, and a solution strategy amounts to 1) constructing a solution x, 2) assessing the validity of the solution, 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, ENKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed that parameters and initial stocks directly related to fast processes were best estimated with narrow confidence intervals, whereas those related to slow processes were poorly estimated with very large uncertainties.
While other studies have tried to overcome this difficulty by adding complementary data streams or by considering longer observation windows, no systematic analysis has been carried out so far to explain the large differences among results. We consider adjoint-based methods to investigate inverse problems using DALEC and various data streams. Using resolution matrices, we study the nature of the inverse problems (solution existence, uniqueness and stability) and show how standard regularization techniques affect resolution and stability properties. Instead of using standard prior information as a penalty term in the cost function to regularize the problems, we constrain the parameter space using ecological balance conditions and inequality constraints. The efficiency and rapidity of this approach allow us to compute ensembles of solutions to the inverse problems, from which we can establish the robustness of the variational method and obtain non-Gaussian posterior distributions for the model parameters and initial carbon stocks.
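    As a sketch of why regularization stabilises an ill-posed inverse problem (standard Tikhonov form shown; the study above instead constrains the parameter space with ecological balance and inequality conditions, but the stabilising effect is analogous):

```python
import numpy as np

def tikhonov_solve(H, y, alpha):
    """Regularised solution of a linear inverse problem Hx = y:
    minimise ||Hx - y||^2 + alpha^2 ||x||^2.  The added alpha^2 I term
    restores the 'continuous dependence on the data' condition."""
    n = H.shape[1]
    return np.linalg.solve(H.T @ H + alpha**2 * np.eye(n), H.T @ y)

# nearly rank-deficient forward operator: the naive inverse is unstable
H = np.array([[1.0, 1.0],
              [1.0, 1.000001]])
x_true = np.array([1.0, 2.0])
y = H @ x_true + 1e-4 * np.array([1.0, -1.0])   # tiny observation noise
x_reg = tikhonov_solve(H, y, alpha=1e-2)        # bounded, data-consistent
```

    Here the unregularised solve amplifies the 1e-4 noise by roughly the inverse of the smallest singular value, while the regularised solution stays bounded at the cost of losing resolution along the poorly constrained direction, the same resolution/stability trade-off that the resolution-matrix analysis quantifies.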

  10. Seasonal streamflow prediction using ensemble streamflow prediction technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar

    2014-05-01

    Streamflow forecasts are essential for making critical decisions about the optimal allocation of water supplies for various demands, including irrigation for agriculture, habitat for fisheries, hydropower production and flood warning. The major objective of this study is to explore Ensemble Streamflow Prediction (ESP) based forecasting in New Zealand catchments and to highlight the present seasonal flow forecasting capability of the National Institute of Water and Atmospheric Research (NIWA). In this study a probabilistic forecast framework for ESP is presented. The basic assumption in ESP is that future weather patterns were experienced historically, so past forcing data can be used with the current initial condition to generate an ensemble of predictions. Small differences in initial conditions can result in large differences in the forecast. The initial state of the catchment is obtained by running the model continuously up to the current time; this initial state is then combined with past forcing data to generate an ensemble of future flows. The approach taken here is to run the TopNet hydrological model with a range of past forcing data (precipitation, temperature, etc.) from the current initial conditions. The collection of runs is called the ensemble. ESP gives probabilistic forecasts for flow: probability distributions derived from the ensemble members capture part of the intrinsic uncertainty in weather or climate. An ensemble streamflow prediction system that provides probabilistic hydrological forecasts with lead times of up to 3 months is presented for the Rangitata, Ahuriri, Hooker and Jollie rivers on the South Island of New Zealand. ESP-based seasonal forecasts have better skill than climatology. This system can provide better overall information for holistic water resource management.
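    The ESP recipe described above can be sketched in a few lines, with a toy linear reservoir standing in for TopNet (the model, coefficients and forcing traces are illustrative only):

```python
import numpy as np

def esp_forecast(model, state_now, historical_forcings):
    """Ensemble Streamflow Prediction: run the same model from today's
    initial state once per historical forcing trace; the spread of the
    resulting flow traces is the probabilistic forecast."""
    return np.array([model(state_now, f) for f in historical_forcings])

def toy_reservoir(storage, precip_series, k=0.1):
    """Toy linear reservoir: outflow is a fixed fraction of storage."""
    flows = []
    for p in precip_series:
        storage += p
        q = k * storage
        storage -= q
        flows.append(q)
    return flows

# three "past seasons" of daily forcing (90 days each), shared initial state
hist = [np.full(90, 2.0), np.full(90, 4.0), np.full(90, 1.0)]
ensemble = esp_forecast(toy_reservoir, 100.0, hist)
# quantiles across members give e.g. a 3-month exceedance forecast
```

    All members start from the same (well-known) initial state, so early lead times agree closely, while the historical forcing traces spread the members apart at longer leads, exactly the uncertainty partition the abstract describes.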

  11. Targeting the right input data to improve crop modeling at global level

    NASA Astrophysics Data System (ADS)

    Adam, M.; Robertson, R.; Gbegbelegbe, S.; Jones, J. W.; Boote, K. J.; Asseng, S.

    2012-12-01

    Designed for location-specific simulations, the use of crop models at a global level raises important questions. Crop models are originally premised on small unit areas where environmental conditions and management practices are considered homogeneous. Specific information describing soils, climate, management, and crop characteristics is used in the calibration process. However, when scaling up for global application, we rely on information derived from geographical information systems and weather generators. To run crop models at a broad scale, we use a modeling platform that assumes a uniformly generated grid cell as a unit area. Specific weather, soil, and management practices for each crop are represented for each grid cell. Studies on the impacts of the uncertainties in weather information and climate change on crop yield at a global level have been carried out (Osborne et al., 2007; Nelson et al., 2010; van Bussel et al., 2011). Detailed information on soils and management practices at the global level is very scarce but recognized to be of critical importance (Reidsma et al., 2009). Few attempts to assess the impact of their uncertainties on cropping system performance can be found. The objectives of this study are (i) to determine the sensitivities of a crop model to soil and management practices, the inputs most relevant to low-input rainfed cropping systems, and (ii) to define hotspots of sensitivity according to the input data. We ran DSSAT v4.5 globally (CERES-CROPSIM) to simulate wheat yields at 45 arc-minute resolution. Cultivar parameters were calibrated and validated for different mega-environments (results not shown). The model was run for nitrogen-limited production systems. This setting was chosen as the most representative for simulating actual yield (especially for low-input rainfed agricultural systems) and assumes crop growth to be free of any pest and disease damage.
    We conducted a sensitivity analysis on contrasting management practices, initial soil conditions, and soil characteristics information. Management practices were represented by planting date and the amount of fertilizer; initial conditions by estimates of initial nitrogen, soil water, and stable soil carbon; and soil information by a simplified version of the WISE database, characterized by soil organic matter, texture, and soil depth. We considered these factors the most important determinants of nutrient supply to crops during the growing season. Our first global results demonstrate that the model is most sensitive to the initial conditions in terms of soil carbon and nitrogen (CN): wheat yields decreased by 45% when soil CN is set to zero and increased by 15% when twice the soil CN content of the reference run is used. The yields did not appear to be very sensitive to initial soil water conditions, varying from a 0% yield increase when initial soil water is set to wilting point to a 6% yield increase when it is set to field capacity. They are slightly sensitive to nitrogen application: an 8% yield decrease when no N is applied to a 9% yield increase when 150 kg ha-1 is applied. However, on closer examination of the results, the model is more sensitive to nitrogen application than to initial soil CN content in Vietnam, Thailand, and Japan compared to the rest of the world. More analyses per region and results on planting dates and soil properties will be presented.
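    A one-at-a-time sensitivity screen of the kind described can be sketched as follows; the yield function is a hypothetical response surface tuned only to reproduce the reported soil-CN numbers (-45% at zero CN, +15% at doubled CN), not DSSAT/CERES itself:

```python
import numpy as np

def toy_yield(soil_cn=1.0, init_water=0.0, n_fert=50.0):
    """Hypothetical multiplicative yield response (t/ha), tuned only to
    reproduce the reported soil-CN sensitivities; NOT DSSAT/CERES itself."""
    f_cn = 0.55 + 0.60 * soil_cn - 0.15 * soil_cn**2   # -45% at 0, +15% at 2
    f_water = 1.0 + 0.06 * init_water                  # wilting (0) to field capacity (1)
    f_n = 0.92 + 0.17 * (1.0 - np.exp(-n_fert / 75.0)) # response to N fertiliser
    return 3.0 * f_cn * f_water * f_n

def one_at_a_time(base, factor, values):
    """Relative yield change when one input varies, others held at baseline."""
    y0 = toy_yield(**base)
    return {v: (toy_yield(**{**base, factor: v}) - y0) / y0 for v in values}

base = dict(soil_cn=1.0, init_water=0.0, n_fert=50.0)
cn_sens = one_at_a_time(base, "soil_cn", [0.0, 2.0])   # null vs doubled CN
```

    Repeating `one_at_a_time` over every grid cell, and ranking cells by the magnitude of the relative change, is one way to map the "hotspots of sensitivity" the study targets.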

  12. Quantifying Groundwater Model Uncertainty

    NASA Astrophysics Data System (ADS)

    Hill, M. C.; Poeter, E.; Foglia, L.

    2007-12-01

    Groundwater models are characterized by the (a) processes simulated, (b) boundary conditions, (c) initial conditions, (d) method of solving the equation, (e) parameterization, and (f) parameter values. Models are related to the system of concern using data, some of which form the basis of observations used most directly, through objective functions, to estimate parameter values. Here we consider situations in which parameter values are determined by minimizing an objective function. Other methods of model development are not considered because their ad hoc nature generally prohibits clear quantification of uncertainty. Quantifying prediction uncertainty ideally includes contributions from (a) to (f). The parameter values of (f) tend to be continuous with respect to both the simulated equivalents of the observations and the predictions, while many aspects of (a) through (e) are discrete. This fundamental difference means that there are options for evaluating the uncertainty related to parameter values that generally do not exist for other aspects of a model. While the methods available for (a) to (e) can be used for the parameter values (f), the inferential methods uniquely available for (f) generally are less computationally intensive and often can be used to considerable advantage. However, inferential approaches require calculation of sensitivities. Whether the numerical accuracy and stability of the model solution required for accurate sensitivities is more broadly important to other model uses is an issue that needs to be addressed. Alternative global methods can require 100 or even 1,000 times the number of runs needed by inferential methods, though methods of reducing the number of needed runs are being developed and tested. Here we present three approaches for quantifying model uncertainty and investigate their strengths and weaknesses. (1) Represent more aspects as parameters so that the computationally efficient methods can be broadly applied. 
This approach is attainable through universal model analysis software such as UCODE-2005, PEST, and joint use of these programs, which allow many aspects of a model to be defined as parameters. (2) Use highly parameterized models to quantify aspects of (e). While promising, this approach implicitly includes parameterizations that may be considered unreasonable if investigated explicitly, so that resulting measures of uncertainty may be too large. (3) Use a combination of inferential and global methods that can be facilitated using the new software MMA (Multi-Model Analysis), which is constructed using the JUPITER API. Here we consider issues related to the model discrimination criteria calculated by MMA.
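    As a minimal illustration of the multi-model step, information-criterion weights of the kind a multi-model analysis code such as MMA tabulates can be computed from each calibrated model's fit; the numbers below are hypothetical:

```python
import numpy as np

def aic_weights(neg_log_likes, n_params):
    """Akaike weights from each model's negative log-likelihood and
    parameter count - one common model discrimination criterion."""
    aic = 2.0 * np.asarray(neg_log_likes) + 2.0 * np.asarray(n_params)
    delta = aic - aic.min()            # differences from the best model
    w = np.exp(-0.5 * delta)
    return w / w.sum()                 # normalized model weights

# Three hypothetical calibrated groundwater-model alternatives
weights = aic_weights([120.3, 118.7, 125.0], [4, 6, 5])
```

    The weights penalize extra parameters, so the best-fitting model (lowest negative log-likelihood) is not automatically the highest-weighted one.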

  13. Decomposition Technique for Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from them), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of the different sources of uncertainty present in the process. Next, the maps are used in an on-line mode, where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated off-line to estimate both the current damage state and future damage accumulation. Remaining life is computed as the difference between the instant at which the extrapolated damage reaches the failure threshold and the instant at which the prediction is made.
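    A minimal sketch of the two-map decomposition, using plain least-squares fits in place of the unspecified regression algorithms and invented run-to-failure numbers:

```python
import numpy as np

# Off-line: fit the two maps from hypothetical ground-truth data.
features = np.array([0.1, 0.3, 0.5, 0.7, 0.9])      # sensor-derived feature
damage   = np.array([0.05, 0.18, 0.32, 0.49, 0.66]) # ground-truth damage
load     = np.array([1.0, 2.0, 3.0])                # operational condition
rate     = np.array([0.01, 0.022, 0.031])           # damage growth per cycle

feat_to_damage = np.polyfit(features, damage, 1)    # map 1: feature -> damage
load_to_rate   = np.polyfit(load, rate, 1)          # map 2: condition -> rate

# On-line: current feature plus expected future load give the RUL estimate.
d_now = np.polyval(feat_to_damage, 0.6)   # current damage state
r_fut = np.polyval(load_to_rate, 2.5)     # expected future damage rate
threshold = 1.0                           # failure threshold
rul_cycles = (threshold - d_now) / r_fut  # cycles until damage hits threshold
```

    Because the two maps are fitted separately, uncertainty in the damage estimate and uncertainty in the damage-rate estimate can be quantified and managed independently, which is the point of the decomposition.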

  14. Which climate change path are we following? Bad news from Scots pine

    PubMed Central

    D’Andrea, Ettore; Rezaie, Negar; Cammarano, Mario; Matteucci, Giorgio

    2017-01-01

    Current expectations of future climate derive from coordinated experiments that compile many climate models to sample the full uncertainty related to emission scenarios, initial conditions, and the modelling process. Quantifying this uncertainty is important for taking decisions that are robust under a wide range of possible future conditions. Nevertheless, if the uncertainty is too large, it can prevent the planning of specific and effective measures. For this reason, reducing the spectrum of possible scenarios to one or a few models that actually represent the climate pathway influencing natural ecosystems would substantially increase our planning capacity. Here we adopt a multidisciplinary approach based on the comparison of observed and expected spatial patterns of response to climate change in order to identify which specific models, among those included in CMIP5, capture the real climate variation driving the response of natural ecosystems. We used dendrochronological analyses to determine the geographic pattern of recent growth trends for three European tree species. At the same time, we modelled the climatic niche for the same species and forecasted the suitability variation expected across Europe under each GCM. Finally, we estimated how well each GCM explains the real response of ecosystems by comparing the expected variation with the observed growth trends. In doing so, we identified four climatic models that are coherent with the observed trends. These models lie close to the upper limit of the climatic variations expected by the ensemble of CMIP5 models, suggesting that current predictions of climate change impacts on ecosystems could be underestimated. PMID:29252985

  15. Which climate change path are we following? Bad news from Scots pine.

    PubMed

    Bombi, Pierluigi; D'Andrea, Ettore; Rezaie, Negar; Cammarano, Mario; Matteucci, Giorgio

    2017-01-01

    Current expectations of future climate derive from coordinated experiments that compile many climate models to sample the full uncertainty related to emission scenarios, initial conditions, and the modelling process. Quantifying this uncertainty is important for taking decisions that are robust under a wide range of possible future conditions. Nevertheless, if the uncertainty is too large, it can prevent the planning of specific and effective measures. For this reason, reducing the spectrum of possible scenarios to one or a few models that actually represent the climate pathway influencing natural ecosystems would substantially increase our planning capacity. Here we adopt a multidisciplinary approach based on the comparison of observed and expected spatial patterns of response to climate change in order to identify which specific models, among those included in CMIP5, capture the real climate variation driving the response of natural ecosystems. We used dendrochronological analyses to determine the geographic pattern of recent growth trends for three European tree species. At the same time, we modelled the climatic niche for the same species and forecasted the suitability variation expected across Europe under each GCM. Finally, we estimated how well each GCM explains the real response of ecosystems by comparing the expected variation with the observed growth trends. In doing so, we identified four climatic models that are coherent with the observed trends. These models lie close to the upper limit of the climatic variations expected by the ensemble of CMIP5 models, suggesting that current predictions of climate change impacts on ecosystems could be underestimated.

  16. Validation of the Spatial Accuracy of the ExacTrac Adaptive Gating System

    NASA Astrophysics Data System (ADS)

    Twork, Gregory

    Stereotactic body radiation therapy (SBRT) is a method of treatment used in extracranial locations, including the abdominal and thoracic cavities, as well as spinal and paraspinal locations. At the McGill University Health Centre, liver SBRT treatments include gating, which places the treatment beam on a duty cycle controlled by tracking of fiducial markers moving with the patient's breathing cycle. Respiratory gated treatments aim to spare normal tissue while properly delivering dose to a moving target. The ExacTrac system (BrainLAB AG, Germany) is an image-guided radiotherapy system consisting of a combination of infra-red (IR) cameras and dual kilovoltage (kV) X-ray tubes. The IR system is used to track patient positioning and respiratory motion, while the kV X-rays are used to determine a positional shift based on internal anatomy or fiducial markers. In order to validate the system's ability to treat under gating conditions, each step of the SBRT process was evaluated quantitatively. Initially, the system was tested under ideal static conditions, followed by a study including gated parameters. The uncertainties of the isocenters, positioning algorithm, planning computed tomography (CT) and four-dimensional CT (4DCT) scans, gating window size, and tumor motion were evaluated for their contributions to the total uncertainty in treatment. The mechanical isocenter and 4DCT were found to be the largest sources of uncertainty. However, for tumors with large internal amplitudes (>2.25 cm) that are treated with large gating windows (>30%), the gating parameters can contribute more than 1.1 +/- 1.8 mm.
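    When the per-source uncertainties are independent, a total treatment uncertainty of the kind tabulated here is conventionally combined in quadrature; the values below are placeholders, not the thesis results:

```python
import math

# Hypothetical per-source uncertainties (mm); the quadrature sum assumes
# the sources are independent. Values are placeholders, not measured data.
sources = {
    "mechanical isocenter": 0.8,
    "4DCT localization": 0.9,
    "positioning algorithm": 0.3,
    "gating window / tumor motion": 0.5,
}
total = math.sqrt(sum(u**2 for u in sources.values()))  # combined sigma, mm
```

    Because the terms add in quadrature, the largest sources (here the isocenter and 4DCT stand-ins) dominate the total, which is why the validation effort focuses on them.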

  17. EnKF with closed-eye period - bridging intermittent model structural errors in soil hydrology

    NASA Astrophysics Data System (ADS)

    Bauser, Hannes H.; Jaumann, Stefan; Berg, Daniel; Roth, Kurt

    2017-04-01

    The representation of soil water movement is subject to uncertainties in all model components, namely the dynamics, forcing, subscale physics, and the state itself. Model structural errors in the description of the dynamics are especially difficult to represent and can lead to an inconsistent estimation of the other components. We address the challenge of a consistent aggregation of information for a manageable specific hydraulic situation: a 1D soil profile with TDR-measured water contents over a period of less than 2 months. We assess the uncertainties for this situation and identify the initial condition, soil hydraulic parameters, small-scale heterogeneity, upper boundary condition, and (during rain events) the local equilibrium assumption underlying the Richards equation as the most important ones. We employ an iterative Ensemble Kalman Filter (EnKF) with an augmented state. Based on a single rain event, we are able to reduce all uncertainties directly, except for the intermittent violation of the local equilibrium assumption. We detect these times by analyzing the temporal evolution of the estimated parameters. By introducing a closed-eye period - during which we do not estimate parameters, but only guide the state based on measurements - we can bridge these times. The introduced closed-eye period ensured constant parameters, suggesting that they resemble the true material properties. The closed-eye period improves predictions during periods when the local equilibrium assumption is met, but consequently worsens predictions when the assumption is violated. Such a prediction requires a description of the dynamics during local non-equilibrium phases, which remains an open challenge.
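    A minimal stochastic-EnKF analysis step on an augmented state [water content, parameter] illustrates the mechanics; during a closed-eye period the parameter row would simply be left out of the update (all numbers below are synthetic, not the study's data):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_std, H):
    """One stochastic-EnKF analysis step on an augmented state.
    Rows of `ensemble` are state/parameter components, columns are members."""
    rng = np.random.default_rng(1)
    n = ensemble.shape[1]
    A = ensemble - ensemble.mean(axis=1, keepdims=True)   # anomalies
    HA = H @ ensemble
    HAp = HA - HA.mean(axis=1, keepdims=True)
    P_hy = A @ HAp.T / (n - 1)                            # cross-covariance
    P_yy = HAp @ HAp.T / (n - 1) + obs_std**2 * np.eye(H.shape[0])
    K = P_hy @ np.linalg.inv(P_yy)                        # Kalman gain
    perturbed = obs[:, None] + obs_std * rng.standard_normal((H.shape[0], n))
    return ensemble + K @ (perturbed - HA)

# Augmented state: observed water content plus one unobserved parameter
H = np.array([[1.0, 0.0]])
rng = np.random.default_rng(0)
ens = np.vstack([0.30 + 0.05 * rng.standard_normal(50),   # water content
                 0.10 + 0.02 * rng.standard_normal(50)])  # parameter
ens_a = enkf_update(ens, obs=np.array([0.25]), obs_std=0.01, H=H)
```

    The cross-covariance row for the parameter is what lets a water-content observation update the parameter; dropping that row during rain events is the closed-eye idea in miniature.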

  18. Canadian snow and sea ice: assessment of snow, sea ice, and related climate processes in Canada's Earth system model and climate-prediction system

    NASA Astrophysics Data System (ADS)

    Kushner, Paul J.; Mudryk, Lawrence R.; Merryfield, William; Ambadan, Jaison T.; Berg, Aaron; Bichet, Adéline; Brown, Ross; Derksen, Chris; Déry, Stephen J.; Dirkson, Arlan; Flato, Greg; Fletcher, Christopher G.; Fyfe, John C.; Gillett, Nathan; Haas, Christian; Howell, Stephen; Laliberté, Frédéric; McCusker, Kelly; Sigmond, Michael; Sospedra-Alfonso, Reinel; Tandon, Neil F.; Thackeray, Chad; Tremblay, Bruno; Zwiers, Francis W.

    2018-04-01

    The Canadian Sea Ice and Snow Evolution (CanSISE) Network is a climate research network focused on developing and applying state-of-the-art observational data to advance dynamical prediction, projections, and understanding of seasonal snow cover and sea ice in Canada and the circumpolar Arctic. This study presents an assessment from the CanSISE Network of the ability of the second-generation Canadian Earth System Model (CanESM2) and the Canadian Seasonal to Interannual Prediction System (CanSIPS) to simulate and predict snow and sea ice from seasonal to multi-decadal timescales, with a focus on the Canadian sector. To account for observational uncertainty, model structural uncertainty, and internal climate variability, the analysis uses multi-source observations, multiple Earth system models (ESMs) in Phase 5 of the Coupled Model Intercomparison Project (CMIP5), and large initial-condition ensembles of CanESM2 and other models. It is found that the ability of the CanESM2 simulation to capture snow-related climate parameters, such as cold-region surface temperature and precipitation, lies within the range of currently available international models. Accounting for the considerable disagreement among satellite-era observational datasets on the distribution of snow water equivalent, CanESM2 has too much springtime snow mass over Canada, reflecting a broader northern hemispheric positive bias. Biases in seasonal snow cover extent are generally less pronounced. CanESM2 also exhibits retreat of springtime snow generally greater than observational estimates, after accounting for observational uncertainty and internal variability. Sea ice is biased low in the Canadian Arctic, which makes it difficult to assess the realism of long-term sea ice trends there. 
The strengths and weaknesses of the modelling system need to be understood as a practical tradeoff: the Canadian models are relatively inexpensive computationally because of their moderate resolution, thus enabling their use in operational seasonal prediction and for generating large ensembles of multidecadal simulations. Improvements in climate-prediction systems like CanSIPS rely not just on simulation quality but also on using novel observational constraints and the ready transfer of research to an operational setting. Improvements in seasonal forecasting practice arising from recent research include accurate initialization of snow and frozen soil, accounting for observational uncertainty in forecast verification, and sea ice thickness initialization using statistical predictors available in real time.

  19. Adaptive robust fault-tolerant control for linear MIMO systems with unmatched uncertainties

    NASA Astrophysics Data System (ADS)

    Zhang, Kangkang; Jiang, Bin; Yan, Xing-Gang; Mao, Zehui

    2017-10-01

    In this paper, two novel fault-tolerant control design approaches are proposed for linear MIMO systems with actuator additive faults, multiplicative faults, and unmatched uncertainties. For time-varying multiplicative and additive faults, new adaptive laws and additive compensation functions are proposed. A set of conditions is developed such that the unmatched uncertainties are compensated by the actuators in control. On the other hand, for unmatched uncertainties whose projection onto the unmatched space is nonzero, additive functions are designed, based on a (vector) relative degree condition, to compensate for the uncertainties in the output channels in the presence of actuator faults. The developed fault-tolerant control schemes are applied to two aircraft systems to demonstrate the efficiency of the proposed approaches.

  20. Model uncertainties of local-thermodynamic-equilibrium K-shell spectroscopy

    NASA Astrophysics Data System (ADS)

    Nagayama, T.; Bailey, J. E.; Mancini, R. C.; Iglesias, C. A.; Hansen, S. B.; Blancard, C.; Chung, H. K.; Colgan, J.; Cosse, Ph.; Faussurier, G.; Florido, R.; Fontes, C. J.; Gilleron, F.; Golovkin, I. E.; Kilcrease, D. P.; Loisel, G.; MacFarlane, J. J.; Pain, J.-C.; Rochau, G. A.; Sherrill, M. E.; Lee, R. W.

    2016-09-01

    Local-thermodynamic-equilibrium (LTE) K-shell spectroscopy is a common tool to diagnose the electron density, ne, and electron temperature, Te, of high-energy-density (HED) plasmas. Knowing the accuracy of such diagnostics is important for drawing quantitative conclusions in many HED-plasma research efforts. For example, Fe opacities were recently measured at multiple conditions at the Sandia National Laboratories Z machine (Bailey et al., 2015), showing significant disagreement with modeled opacities. Since the plasma conditions were measured using K-shell spectroscopy of tracer Mg (Nagayama et al., 2014), one concern is the accuracy of the inferred Fe conditions. In this article, we investigate the K-shell spectroscopy model uncertainties by analyzing the Mg spectra computed with 11 different models at the same conditions. We find that the inferred conditions differ by ±20-30% in ne and ±2-4% in Te depending on the choice of spectral model. Also, we find that half of the Te uncertainty comes from the ne uncertainty. To refine the accuracy of K-shell spectroscopy, it is important to scrutinize and experimentally validate line-shape theory. We investigate the impact of the inferred ne and Te model uncertainty on the Fe opacity measurements. Its impact is small and does not explain the reported discrepancies.

  1. Incorporating uncertainty in RADTRAN 6.0 input files.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennis, Matthew L.; Weiner, Ruth F.; Heames, Terence John

    Uncertainty may be introduced into RADTRAN analyses by assigning distributions to input parameters. The MELCOR Uncertainty Engine (Gauntt and Erickson, 2004) has been adapted for use in RADTRAN to determine the distribution shape and its minimum and maximum, to sample from the distribution, and to create an appropriate RADTRAN batch file. Coupling input parameters is not possible in this initial application. It is recommended that the analyst be very familiar with RADTRAN and able to edit or create a RADTRAN input file using a text editor before implementing the RADTRAN Uncertainty Analysis Module. Installation of the MELCOR Uncertainty Engine is required for incorporation of uncertainty into RADTRAN. Gauntt and Erickson (2004) provide installation instructions as well as a description and user guide for the uncertainty engine.
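    The sample-and-batch workflow can be sketched as follows; the parameter names, distributions, and batch format are invented placeholders, not RADTRAN's actual input syntax:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_inputs(n):
    """Draw n independent parameter sets from assumed distributions.
    (The engine also supports distribution bounds; coupling is not modeled,
    matching the uncoupled-parameter limitation noted above.)"""
    return [{"WIND_SPEED": rng.triangular(1.0, 4.0, 12.0),
             "SHIELDING": rng.uniform(0.8, 1.0),
             "STOP_TIME": rng.normal(0.5, 0.1)} for _ in range(n)]

def write_batch(samples):
    """Render each sampled parameter set as one run in a batch file."""
    lines = []
    for i, s in enumerate(samples):
        lines.append(f"RUN {i}: " + " ".join(f"{k}={v:.3f}"
                                             for k, v in s.items()))
    return "\n".join(lines)

batch = write_batch(sample_inputs(3))
```

    Running the resulting batch and collecting the output distribution is then an ordinary Monte Carlo uncertainty analysis over the transport-risk results.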

  2. Patients' and partners' perspectives of chronic illness and its management.

    PubMed

    Checton, Maria G; Greene, Kathryn; Magsamen-Conrad, Kate; Venetis, Maria K

    2012-06-01

    This study is framed in theories of illness uncertainty (Babrow, A. S., 2007, Problematic integration theory. In B. B. Whaley & W. Samter (Eds.), Explaining communication: Contemporary theories and exemplars (pp. 181-200). Mahwah, NJ: Erlbaum; Babrow & Matthias, 2009; Brashers, D. E., 2007, A theory of communication and uncertainty management. In B. B. Whaley & W. Samter (Eds.), Explaining communication: Contemporary theories and exemplars (pp. 201-218). Mahwah, NJ: Erlbaum; Hogan, T. P., & Brashers, D. E. (2009). The theory of communication and uncertainty management: Implications for the wider realm of information behavior. In T. D. Afifi & W. A. Afifi (Eds.), Uncertainty and information regulation in interpersonal contexts: Theories and applications, (pp. 45-66). New York, NY: Routledge; Mishel, M. H. (1999). Uncertainty in chronic illness. Annual Review of Nursing Research, 17, 269-294; Mishel, M. H., & Clayton, M. F., 2003, Theories of uncertainty. In M. J. Smith & P. R. Liehr (Eds.), Middle range theory for nursing (pp. 25-48). New York, NY: Springer) and health information management (Afifi, W. A., & Weiner, J. L., 2004, Toward a theory of motivated information management. Communication Theory, 14, 167-190. doi:10.1111/j.1468-2885.2004.tb00310.x; Greene, K., 2009, An integrated model of health disclosure decision-making. In T. D. Afifi & W. A. Afifi (Eds.), Uncertainty and information regulation in interpersonal contexts: Theories and applications (pp. 226-253). New York, NY: Routledge) and examines how couples experience uncertainty and interference related to one partner's chronic health condition. Specifically, a model is hypothesized in which illness uncertainty (i.e., stigma, prognosis, and symptom) and illness interference predict communication efficacy and health condition management. Participants include 308 dyads in which one partner has a chronic health condition. Data were analyzed using structural equation modeling. 
Results indicate that there are significant differences in (a) how patients and partners experience illness uncertainty and illness interference and (b) how appraisals of illness uncertainty and illness interference influence communication efficacy and health condition management. We discuss the findings and implications of the study.

  3. Optimal strategies for throwing accurately

    PubMed Central

    2017-01-01

    The accuracy of throwing in games and sports is governed by how errors in planning and initial conditions are propagated by the dynamics of the projectile. In the simplest setting, the projectile path is typically described by a deterministic parabolic trajectory which has the potential to amplify noisy launch conditions. By analysing how parabolic trajectories propagate errors, we show how to devise optimal strategies for a throwing task demanding accuracy. Our calculations explain observed speed–accuracy trade-offs, preferred throwing style of overarm versus underarm, and strategies for games such as dart throwing, despite having left out most biological complexities. As our criteria for optimal performance depend on the target location, shape and the level of uncertainty in planning, they also naturally suggest an iterative scheme to learn throwing strategies by trial and error. PMID:28484641

  4. Optimal strategies for throwing accurately

    NASA Astrophysics Data System (ADS)

    Venkadesan, M.; Mahadevan, L.

    2017-04-01

    The accuracy of throwing in games and sports is governed by how errors in planning and initial conditions are propagated by the dynamics of the projectile. In the simplest setting, the projectile path is typically described by a deterministic parabolic trajectory which has the potential to amplify noisy launch conditions. By analysing how parabolic trajectories propagate errors, we show how to devise optimal strategies for a throwing task demanding accuracy. Our calculations explain observed speed-accuracy trade-offs, preferred throwing style of overarm versus underarm, and strategies for games such as dart throwing, despite having left out most biological complexities. As our criteria for optimal performance depend on the target location, shape and the level of uncertainty in planning, they also naturally suggest an iterative scheme to learn throwing strategies by trial and error.
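    The error-amplification argument can be checked numerically: propagating noisy launch conditions through the drag-free range formula R = v^2 sin(2*theta)/g shows that a 45-degree throw is insensitive to angle noise (dR/dtheta is proportional to cos(2*theta), which vanishes at 45 degrees), while an off-45 throw is not. The noise levels below are illustrative, not taken from the paper:

```python
import numpy as np

g = 9.81

def throw_range(v, theta):
    """Range of a drag-free parabolic throw launched from ground level."""
    return v**2 * np.sin(2 * theta) / g

rng = np.random.default_rng(0)
n = 100_000
v = 10.0 + 0.1 * rng.standard_normal(n)   # noisy launch speed (m/s)
sigma_theta = np.deg2rad(3.0)             # angle noise (rad)

# Near 45 deg the range is stationary in angle, so the same angle noise
# produces much less range spread than at 30 deg.
spread_45 = throw_range(v, np.deg2rad(45)
                        + sigma_theta * rng.standard_normal(n)).std()
spread_30 = throw_range(v, np.deg2rad(30)
                        + sigma_theta * rng.standard_normal(n)).std()
```

    The comparison of the two spreads is a Monte Carlo version of the error-propagation analysis the paper carries out; the optimal strategy shifts once target geometry and planning uncertainty enter.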

  5. Aerodynamic characterisation and trajectory simulations for the Ariane-5 booster recovery system

    NASA Astrophysics Data System (ADS)

    Meiboom, F. P.

    One of the most critical aspects of the early phases of the development of the Ariane-5 booster recovery system was the determination of the behavior of the booster during its atmospheric reentry, since this behavior determines the start conditions for the parachute system elements. A combination of wind-tunnel tests (subsonic and supersonic) and analytical methods was applied to define the aerodynamic characteristics of the booster. This aerodynamic characterization in combination with information of the ascent trajectory, atmospheric properties and booster mass and inertia were used as input for the 6-DOF trajectory simulations of the vehicle. Uncertainties in aerodynamic properties and deviations in atmospheric and booster properties were incorporated to define the range of initial conditions for the parachute system, utilizing stochastic (Monte-Carlo) methods.
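    The stochastic dispersion step can be sketched with a 1-DOF stand-in for the 6-DOF simulation: sample aerodynamic and atmospheric multipliers, integrate the descent, and collect the spread of parachute-deployment conditions (all vehicle numbers below are invented, not Ariane-5 data):

```python
import numpy as np

rng = np.random.default_rng(7)

def descent_dynamic_pressure(cd_mult, rho_mult, m=40000.0, cd=1.1, area=50.0):
    """1-DOF vertical descent to parachute-deployment altitude; returns
    the dynamic pressure at 6 km. A stand-in for the 6-DOF simulation."""
    h, v, dt = 30000.0, 0.0, 0.1
    while h > 6000.0:
        rho = rho_mult * 1.225 * np.exp(-h / 8500.0)  # exponential atmosphere
        drag = 0.5 * rho * v**2 * cd_mult * cd * area
        v += (9.81 - drag / m) * dt
        h -= v * dt
    return 0.5 * rho_mult * 1.225 * np.exp(-h / 8500.0) * v**2

# Dispersed runs: +/-10% drag-coefficient and +/-5% density uncertainty
q = [descent_dynamic_pressure(rng.uniform(0.9, 1.1), rng.uniform(0.95, 1.05))
     for _ in range(200)]
q_lo, q_hi = np.percentile(q, [5, 95])
```

    The 5th-95th percentile band of deployment dynamic pressure is the kind of "range of initial conditions for the parachute system" the Monte Carlo campaign is meant to produce.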

  6. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  7. Guaranteeing robustness of structural condition monitoring to environmental variability

    NASA Astrophysics Data System (ADS)

    Van Buren, Kendra; Reilly, Jack; Neal, Kyle; Edwards, Harry; Hemez, François

    2017-01-01

    Advances in sensor deployment and computational modeling have allowed significant strides to be recently made in the field of Structural Health Monitoring (SHM). One widely used SHM strategy is to perform a vibration analysis where a model of the structure's pristine (undamaged) condition is compared with vibration response data collected from the physical structure. Discrepancies between model predictions and monitoring data can be interpreted as structural damage. Unfortunately, multiple sources of uncertainty must also be considered in the analysis, including environmental variability, unknown model functional forms, and unknown values of model parameters. Not accounting for these sources of uncertainty can lead to false-positives or false-negatives in the structural condition assessment. To manage the uncertainty, we propose a robust SHM methodology that combines three technologies. A time series algorithm is trained using "baseline" data to predict the vibration response, compare predictions to actual measurements collected on a potentially damaged structure, and calculate a user-defined damage indicator. The second technology handles the uncertainty present in the problem. An analysis of robustness is performed to propagate this uncertainty through the time series algorithm and obtain the corresponding bounds of variation of the damage indicator. The uncertainty description and robustness analysis are both inspired by the theory of info-gap decision-making. Lastly, an appropriate "size" of the uncertainty space is determined through physical experiments performed in laboratory conditions. Our hypothesis is that examining how the uncertainty space changes throughout time might lead to superior diagnostics of structural damage as compared to only monitoring the damage indicator. This methodology is applied to a portal frame structure to assess if the strategy holds promise for robust SHM. 
(Publication approved for unlimited, public release on October-28-2015, LA-UR-15-28442, unclassified.)
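    The first technology - a time series model trained on baseline data whose prediction residuals feed a damage indicator - can be sketched with an AR model; damage is simulated here simply as elevated broadband response noise, and all signals are synthetic:

```python
import numpy as np

def lagged(x, order):
    """Matrix of lagged values for a least-squares AR(order) fit."""
    return np.column_stack([x[i:len(x) - order + i] for i in range(order)])

def fit_ar(x, order=4):
    """Train the time series model on 'baseline' (pristine) data."""
    coef, *_ = np.linalg.lstsq(lagged(x, order), x[order:], rcond=None)
    return coef

def damage_indicator(coef, x, order=4):
    """User-defined indicator: RMS of one-step prediction residuals."""
    resid = x[order:] - lagged(x, order) @ coef
    return float(np.sqrt(np.mean(resid**2)))

rng = np.random.default_rng(3)
t = np.arange(2000)
baseline = np.sin(0.2 * t) + 0.05 * rng.standard_normal(t.size)
coef = fit_ar(baseline)                      # trained on pristine response

healthy = np.sin(0.2 * t) + 0.05 * rng.standard_normal(t.size)
damaged = np.sin(0.2 * t) + 0.15 * rng.standard_normal(t.size)  # "damage"

ind_healthy = damage_indicator(coef, healthy)
ind_damaged = damage_indicator(coef, damaged)
```

    The robustness analysis described above would then propagate bounded environmental variability through this same pipeline to find how large the indicator can get before damage, rather than thresholding the indicator directly.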

  8. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. 
The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  9. Improved ensemble-mean forecasting of ENSO events by a zero-mean stochastic error model of an intermediate coupled model

    NASA Astrophysics Data System (ADS)

    Zheng, Fei; Zhu, Jiang

    2017-04-01

How to design a reliable ensemble prediction strategy that accounts for the major uncertainties of a forecasting system is a crucial issue in ensemble forecasting. In this study, a new stochastic perturbation technique is developed to improve the prediction skill for the El Niño-Southern Oscillation (ENSO) using an intermediate coupled model. We first estimate and analyze the model uncertainties from ensemble Kalman filter analysis results obtained by assimilating observed sea surface temperatures. Then, based on the pre-analyzed properties of the model errors, we develop a zero-mean stochastic model-error model to characterize the model uncertainties, mainly those induced by physical processes missing from the original model (e.g., stochastic atmospheric forcing, extra-tropical effects, the Indian Ocean Dipole). Finally, we perturb each member of an ensemble forecast at each step of the 12-month forecasting process with the developed stochastic model-error model, adding the zero-mean perturbations to the physical fields to mimic the presence of the missing processes and high-frequency stochastic noise. The impact of the stochastic model-error perturbations on deterministic ENSO predictions is examined by performing two sets of 21-yr hindcast experiments, initialized from the same initial conditions and differing only in whether they include the stochastic perturbations. The comparison shows that the stochastic perturbations significantly improve the ensemble-mean prediction skill throughout the 12-month forecasting process. This improvement occurs mainly because the nonlinear terms in the model can form a positive ensemble mean from a series of zero-mean perturbations, which reduces the forecast biases and thus corrects the forecast through this nonlinear heating mechanism.
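The mechanics of this perturbation strategy can be illustrated with a toy model. The one-line "model" below is purely hypothetical (it is not the intermediate coupled model of the paper); it only shows how an independent zero-mean perturbation is added to each ensemble member at every forecast step, and how a nonlinear model term can let such perturbations shift the ensemble mean away from the unperturbed deterministic run.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(x):
    # Toy nonlinear model step (hypothetical stand-in for the coupled model)
    return x + 0.1 * np.tanh(x) - 0.05 * x**3

n_members, n_steps, sigma = 50, 12, 0.3

# Ensemble run: identical initial conditions, zero-mean noise added each step
x = np.full(n_members, 0.5)
for _ in range(n_steps):
    x = step(x) + rng.normal(0.0, sigma, n_members)

# Unperturbed deterministic run from the same initial condition
x_det = 0.5
for _ in range(n_steps):
    x_det = step(x_det)

# Because step() is nonlinear, the ensemble mean generally differs from the
# deterministic trajectory even though every perturbation had zero mean.
print(x.mean(), x_det)
```

The perturbations themselves average to zero; any systematic offset between `x.mean()` and `x_det` comes entirely from the nonlinearity, which is the mechanism the abstract invokes.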

  10. Improving Forecasts Through Realistic Uncertainty Estimates: A Novel Data Driven Method for Model Uncertainty Quantification in Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Moradkhani, H.; Marshall, L. A.; Sharma, A.; Geenens, G.

    2016-12-01

Effective combination of model simulations and observations through Data Assimilation (DA) depends heavily on uncertainty characterisation. Many traditional methods for quantifying model uncertainty in DA require some level of subjectivity (by way of tuning parameters or by assuming Gaussian statistics). Furthermore, the focus is typically on only estimating the first and second moments. We propose a data-driven methodology to estimate the full distributional form of model uncertainty, i.e. the transition density p(x_t | x_{t-1}). All sources of uncertainty associated with the model simulations are considered collectively, without needing to devise stochastic perturbations for individual components (such as model input, parameter and structural uncertainty). A training period is used to derive the distribution of errors in observed variables conditioned on hidden states. Errors in hidden states are estimated from the conditional distribution of observed variables using non-linear optimization. The theory behind the framework and case study applications are discussed in detail. Results demonstrate improved predictions and more realistic uncertainty bounds compared to a standard perturbation approach.

  11. Uncertainties in Integrated Climate Change Impact Assessments by Sub-setting GCMs Based on Annual as well as Crop Growing Period under Rice Based Farming System of Indo-Gangetic Plains of India

    NASA Astrophysics Data System (ADS)

    Pillai, S. N.; Singh, H.; Panwar, A. S.; Meena, M. S.; Singh, S. V.; Singh, B.; Paudel, G. P.; Baigorria, G. A.; Ruane, A. C.; McDermid, S.; Boote, K. J.; Porter, C.; Valdivia, R. O.

    2016-12-01

Integrated assessment of climate change impacts on agricultural productivity is a challenge for the scientific community because of uncertainties in the input data, particularly the climate, soil, crop-calibration and socio-economic datasets. The uncertainty due to the selection of GCMs, however, is the major source, owing to the complex underlying processes involved in the initial and boundary conditions used in solving the air-sea interactions. Under the Agricultural Model Intercomparison and Improvement Project (AgMIP), the Indo-Gangetic Plains Regional Research Team investigated the uncertainties arising from GCM selection through sub-setting based on the annual period as well as the crop-growth period of rice-wheat systems within the AgMIP Integrated Assessment methodology. The AgMIP Phase II protocols were used to link climate, crop and economic models for two study sites, Meerut and Karnal, to analyse the sensitivity of current production systems to climate change. Climate change projections were made using 29 CMIP5 GCMs under RCP4.5 and RCP8.5 for the mid-century period (2040-2069). Two crop models (APSIM and DSSAT) were used, and the TOA-MD economic model was used for the integrated assessment. Based on RAPs (Representative Agricultural Pathways), parameters that cannot be obtained through modeling were derived from the literature and from interactions with stakeholders, and incorporated into the TOA-MD model for the integrated assessment.

  12. Parameterization of Model Validating Sets for Uncertainty Bound Optimizations. Revised

    NASA Technical Reports Server (NTRS)

    Lim, K. B.; Giesy, D. P.

    2000-01-01

    Given measurement data, a nominal model and a linear fractional transformation uncertainty structure with an allowance on unknown but bounded exogenous disturbances, easily computable tests for the existence of a model validating uncertainty set are given. Under mild conditions, these tests are necessary and sufficient for the case of complex, nonrepeated, block-diagonal structure. For the more general case which includes repeated and/or real scalar uncertainties, the tests are only necessary but become sufficient if a collinearity condition is also satisfied. With the satisfaction of these tests, it is shown that a parameterization of all model validating sets of plant models is possible. The new parameterization is used as a basis for a systematic way to construct or perform uncertainty tradeoff with model validating uncertainty sets which have specific linear fractional transformation structure for use in robust control design and analysis. An illustrative example which includes a comparison of candidate model validating sets is given.

  13. Effects of directional uncertainty on visually-guided joystick pointing.

    PubMed

    Berryhill, Marian; Kveraga, Kestutis; Hughes, Howard C

    2005-02-01

    Reaction times generally follow the predictions of Hick's law as stimulus-response uncertainty increases, although notable exceptions include the oculomotor system. Saccadic and smooth pursuit eye movement reaction times are independent of stimulus-response uncertainty. Previous research showed that joystick pointing to targets, a motor analog of saccadic eye movements, is only modestly affected by increased stimulus-response uncertainty; however, a no-uncertainty condition (simple reaction time to 1 possible target) was not included. Here, we re-evaluate manual joystick pointing including a no-uncertainty condition. Analysis indicated simple joystick pointing reaction times were significantly faster than choice reaction times. Choice reaction times (2, 4, or 8 possible target locations) only slightly increased as the number of possible targets increased. These data suggest that, as with joystick tracking (a motor analog of smooth pursuit eye movements), joystick pointing is more closely approximated by a simple/choice step function than the log function predicted by Hick's law.
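For reference, Hick's law predicts logarithmic growth of reaction time with the number of stimulus-response alternatives, whereas the abstract's finding is better described by a simple/choice step function. The sketch below contrasts the two functional forms; the reaction-time numbers are invented for illustration and are not data from the study.

```python
import math

# Hypothetical mean reaction times (ms) for 1, 2, 4, 8 possible targets,
# illustrating the pattern described: a fast simple-RT condition, then
# nearly flat choice RTs. These are made-up numbers, not the study's data.
n_targets = [1, 2, 4, 8]
rt_ms     = [250, 330, 338, 345]

def hick_rt(n, a, b):
    # Hick's law: RT = a + b * log2(n + 1)
    return a + b * math.log2(n + 1)

def step_rt(n, simple, choice):
    # Simple/choice step function: one RT for n == 1, another for any n > 1
    return simple if n == 1 else choice

for n, rt in zip(n_targets, rt_ms):
    print(n, rt, round(hick_rt(n, 200, 50), 1), step_rt(n, 250, 338))
```

Under Hick's law the predicted RT keeps climbing with each doubling of the target set, while the step function is flat across the choice conditions, which is the distinction the abstract draws.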

  14. High precision Hugoniot measurements on statically pre-compressed fluid helium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seagle, Christopher T.; Reinhart, William D.; Lopez, Andrew J.

Here we describe how the capability for statically pre-compressing fluid targets for Hugoniot measurements utilizing gas-gun-driven flyer plates has been developed. Pre-compression expands the capability for initial-condition control, allowing access to thermodynamic states off the principal Hugoniot. Absolute Hugoniot measurements with an uncertainty of less than 3% in density and pressure were obtained on statically pre-compressed fluid helium utilizing a two-stage light gas gun. Helium is highly compressible; the locus of shock states resulting from dynamic loading of an initially compressed sample at room temperature is significantly denser than the cryogenic-fluid Hugoniot even for relatively modest (0.27–0.38 GPa) initial pressures. Lastly, the dynamic response of pre-compressed helium in the initial density range of 0.21–0.25 g/cm3 at ambient temperature may be described by a linear shock velocity (us) and particle velocity (up) relationship: us = C0 + s·up, with C0 = 1.44 ± 0.14 km/s and s = 1.344 ± 0.025.
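The reported linear us-up fit can be combined with the Rankine-Hugoniot momentum equation, P − P0 = ρ0·us·up, to estimate shock pressure. The sketch below uses the C0 and s values quoted in the abstract; the initial density and particle velocity are illustrative inputs, not measurements from the paper.

```python
# Linear shock relation from the abstract: u_s = C_0 + s * u_p
C0, s = 1.44, 1.344          # km/s and dimensionless

def shock_velocity(up_km_s):
    return C0 + s * up_km_s

def hugoniot_pressure_gpa(rho0_g_cc, up_km_s):
    # Momentum conservation across the shock: P - P0 = rho0 * u_s * u_p.
    # With rho0 in g/cm^3 and velocities in km/s the product comes out in GPa.
    return rho0_g_cc * shock_velocity(up_km_s) * up_km_s

# Illustrative inputs: rho0 near the paper's initial density range, up = 5 km/s
print(shock_velocity(5.0))                  # 8.16 km/s
print(hugoniot_pressure_gpa(0.23, 5.0))     # ~9.4 GPa above the initial pressure
```

The unit shortcut works because 1 (g/cm^3)·(km/s)^2 = 1 GPa, a common convenience in shock-physics calculations.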

  15. High precision Hugoniot measurements on statically pre-compressed fluid helium

    DOE PAGES

    Seagle, Christopher T.; Reinhart, William D.; Lopez, Andrew J.; ...

    2016-09-27

Here we describe how the capability for statically pre-compressing fluid targets for Hugoniot measurements utilizing gas-gun-driven flyer plates has been developed. Pre-compression expands the capability for initial-condition control, allowing access to thermodynamic states off the principal Hugoniot. Absolute Hugoniot measurements with an uncertainty of less than 3% in density and pressure were obtained on statically pre-compressed fluid helium utilizing a two-stage light gas gun. Helium is highly compressible; the locus of shock states resulting from dynamic loading of an initially compressed sample at room temperature is significantly denser than the cryogenic-fluid Hugoniot even for relatively modest (0.27–0.38 GPa) initial pressures. Lastly, the dynamic response of pre-compressed helium in the initial density range of 0.21–0.25 g/cm3 at ambient temperature may be described by a linear shock velocity (us) and particle velocity (up) relationship: us = C0 + s·up, with C0 = 1.44 ± 0.14 km/s and s = 1.344 ± 0.025.

  16. Reward uncertainty enhances incentive salience attribution as sign-tracking

    PubMed Central

    Anselme, Patrick; Robinson, Mike J. F.; Berridge, Kent C.

    2014-01-01

Conditioned stimuli (CSs) come to act as motivational magnets following repeated association with unconditioned stimuli (UCSs) such as sucrose rewards. On traditional views, the more reliably predictive a Pavlovian CS-UCS association is, the more attractive the CS becomes. However, in some cases, less predictability might mean more motivation. Here we examined the effect of introducing uncertainty into the CS-UCS association on CS strength as an attractive motivational magnet. In the present study, Experiment 1 assessed the effects of Pavlovian predictability versus uncertainty about reward probability and/or reward magnitude on the acquisition and expression of sign-tracking (ST) and goal-tracking (GT) responses in an autoshaping procedure. The results suggested that uncertainty produced the strongest incentive salience, expressed as sign-tracking. Experiment 2 examined whether a within-individual temporal shift from certainty to uncertainty conditions could produce a stronger CS motivational magnet once uncertainty began, and found that sign-tracking still increased after the shift. Overall, our results support earlier reports that ST responses become more pronounced in the presence of uncertainty about CS-UCS associations, especially when uncertainty combines both probability and magnitude. These results suggest that Pavlovian uncertainty, although diluting predictability, is still able to enhance the incentive motivational power of particular CSs. PMID:23078951

  17. When, not if: the inescapability of an uncertain climate future.

    PubMed

    Ballard, Timothy; Lewandowsky, Stephan

    2015-11-28

    Climate change projections necessarily involve uncertainty. Analysis of the physics and mathematics of the climate system reveals that greater uncertainty about future temperature increases is nearly always associated with greater expected damages from climate change. In contrast to those normative constraints, uncertainty is frequently cited in public discourse as a reason to delay mitigative action. This failure to understand the actual implications of uncertainty may incur notable future costs. It is therefore important to communicate uncertainty in a way that improves people's understanding of climate change risks. We examined whether responses to projections were influenced by whether the projection emphasized uncertainty in the outcome or in its time of arrival. We presented participants with statements and graphs indicating projected increases in temperature, sea levels, ocean acidification and a decrease in arctic sea ice. In the uncertain-outcome condition, statements reported the upper and lower confidence bounds of the projected outcome at a fixed time point. In the uncertain time-of-arrival condition, statements reported the upper and lower confidence bounds of the projected time of arrival for a fixed outcome. Results suggested that people perceived the threat as more serious and were more likely to encourage mitigative action in the time-uncertain condition than in the outcome-uncertain condition. This finding has implications for effectively communicating the climate change risks to policy-makers and the general public. © 2015 The Author(s).

  18. In-Flight Alignment Using H ∞ Filter for Strapdown INS on Aircraft

    PubMed Central

    Pei, Fu-Jun; Liu, Xuan; Zhu, Li

    2014-01-01

In-flight alignment is an effective way to improve the accuracy and speed of initial alignment for a strapdown inertial navigation system (INS). During flight, strapdown INS alignment is disturbed by the linear and angular motion of the aircraft. To deal with these disturbances in dynamic initial alignment, a novel alignment method for SINS is investigated in this paper. In this method, an initial alignment error model of SINS in the inertial frame is established. The observability of the system is analyzed using piece-wise constant system (PWCS) theory, and the observable degree is computed using singular value decomposition (SVD) theory. It is demonstrated that the system is completely observable and that all the system state parameters can be estimated by an optimal filter. An H∞ filter is then designed to handle the uncertainty of the measurement noise. The simulation results demonstrate that the proposed algorithm achieves better accuracy under dynamic disturbance conditions. PMID:24511300

  19. Robust fixed-time synchronization for uncertain complex-valued neural networks with discontinuous activation functions.

    PubMed

    Ding, Xiaoshuai; Cao, Jinde; Alsaedi, Ahmed; Alsaadi, Fuad E; Hayat, Tasawar

    2017-06-01

This paper is concerned with fixed-time synchronization for a class of complex-valued neural networks in the presence of discontinuous activation functions and parameter uncertainties. Fixed-time synchronization not only requires that the considered master-slave system achieve synchronization within a finite time segment, but also requires a uniform upper bound on such time intervals for all initial synchronization errors. To accomplish fixed-time synchronization, a novel feedback control procedure is designed for the slave neural networks. By means of Filippov discontinuity theory and Lyapunov stability theory, sufficient conditions are established for the selection of control parameters that guarantee synchronization within a fixed time, and an upper bound on the settling time is obtained as well, which can be tuned to predefined values independently of the initial conditions. Additionally, criteria for a modified controller ensuring fixed-time anti-synchronization are also derived for the same system. An example is included to illustrate the proposed methodologies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Evaluating Uncertainty of Runoff Simulation using SWAT model of the Feilaixia Watershed in China Based on the GLUE Method

    NASA Astrophysics Data System (ADS)

    Chen, X.; Huang, G.

    2017-12-01

In recent years, distributed hydrological models have been widely used in storm-water management, water resources protection and related fields, so how to evaluate model uncertainty reasonably and efficiently has become a topical question. In this paper, the Soil and Water Assessment Tool (SWAT) model is constructed for China's Feilaixia watershed, and the uncertainty of the runoff simulation is analyzed in depth with the GLUE method. Focusing on the initial parameter range of the GLUE method, the influence of different initial parameter ranges on model uncertainty is studied. Two sets of parameter ranges are chosen as the object of study: the first (range 1) is recommended by SWAT-CUP and the second (range 2) is calibrated by SUFI-2. The results show that, for the same number of simulations (10,000), the overall uncertainty obtained with range 2 is less than with range 1. Specifically, the number of "behavioral" parameter sets is 10,000 for range 2 and 4,448 for range 1. In the calibration and validation periods, the ratio of P-factor to R-factor is 1.387 and 1.391 for range 1, and 1.405 and 1.462 for range 2, respectively. In addition, the simulation results for range 2 are better, with NS and R2 slightly higher than for range 1. It can therefore be concluded that using the parameter range calibrated by SUFI-2 as the initial parameter range for GLUE is an effective way to capture and evaluate the simulation uncertainty.
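The GLUE procedure referred to above can be sketched in a few lines: sample parameter sets uniformly from an initial range, score each against observations with a likelihood measure (Nash-Sutcliffe efficiency here), and retain the sets above a "behavioral" threshold. The toy linear model, synthetic observations and threshold below are hypothetical stand-ins for the SWAT setup, intended only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)

def nash_sutcliffe(obs, sim):
    # NS efficiency, used here as the GLUE likelihood measure
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Toy "model": q = a * p + b, with synthetic observations (illustrative only)
p = np.linspace(1.0, 10.0, 50)
obs = 2.0 * p + 1.0 + rng.normal(0.0, 0.5, p.size)

# Monte Carlo sampling from an initial parameter range, as in GLUE
a = rng.uniform(0.0, 4.0, 10000)
b = rng.uniform(-2.0, 4.0, 10000)
ns = np.array([nash_sutcliffe(obs, ai * p + bi) for ai, bi in zip(a, b)])

behavioral = ns > 0.5            # "behavioral" threshold on the likelihood
print(behavioral.sum(), "behavioral parameter sets out of", ns.size)
```

The choice of initial range matters in exactly the way the abstract describes: a tighter, pre-calibrated range concentrates the samples in the behavioral region and so reduces the spread of the retained predictions.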

  1. The Role of Economic Uncertainty on the Block Economic Value - a New Valuation Approach / Rola Czynnika Niepewności Przy Obliczaniu Wskaźnika Rentowności - Nowe Podejście

    NASA Astrophysics Data System (ADS)

    Dehghani, H.; Ataee-Pour, M.

    2012-12-01

The block economic value (EV) is one of the most important parameters in mine evaluation: it affects significant factors such as the mining sequence, the final pit limit and the net present value. The aim of open pit mine planning is to define optimum pit limits and an optimum life-of-mine production schedule that maximize the pit value under technical and operational constraints. It is therefore necessary to calculate the block economic value correctly at the first stage of the mine planning process. Unrealistic block economic value estimates may lead mining project managers to make wrong decisions and thus impose irredeemable losses on the project. In conventional methods of EV calculation, the effective parameters, such as metal price, operating cost and grade, are always assumed to be certain, while these parameters are in fact uncertain; as a result, the results of conventional methods are often far from reality. To solve this problem, a new technique is used, based on a binomial tree developed in this research, which can calculate the EV and the project PV under economic uncertainty. In this paper, the EV and project PV were first determined using the Whittle formula based on certain economic parameters, and then using a multivariate binomial tree based on the economic uncertainties in the metal price and costs, and the results were compared. It is concluded that accounting for metal price and cost uncertainties makes the calculated block economic value and net present value more realistic than under the assumption of certainty.
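A minimal sketch of the idea, assuming a one-factor binomial tree over the metal price only (the paper uses a multivariate tree covering both price and cost uncertainty): the expected EV over the tree can differ from the conventional certain-price EV because negative block values are floored at zero (an uneconomic block is simply not mined). All inputs below are invented for illustration.

```python
def binomial_block_value(price, cost, grade, tonnage, u, d, p_up, steps):
    """Expected block economic value after `steps` binomial price moves.

    u/d are the up/down price multipliers, p_up the up-move probability.
    At the leaves, uneconomic outcomes are floored at zero (block not mined).
    """
    if steps == 0:
        return max(tonnage * grade * price - tonnage * cost, 0.0)
    up = binomial_block_value(price * u, cost, grade, tonnage, u, d, p_up, steps - 1)
    dn = binomial_block_value(price * d, cost, grade, tonnage, u, d, p_up, steps - 1)
    return p_up * up + (1.0 - p_up) * dn

# Illustrative inputs: 1000 t block, 1% grade, $9000/t metal, $50/t cost
certain = 1000 * 0.01 * 9000 - 1000 * 50          # conventional (certain) EV
uncertain = binomial_block_value(9000, 50, 0.01, 1000, 1.2, 0.8, 0.5, 3)
print(certain, uncertain)
```

With these numbers the tree value exceeds the certain EV even though the expected price is unchanged, because the zero floor truncates the downside, which is the kind of asymmetry a certainty-based calculation cannot capture.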

  2. Potentialities of ensemble strategies for flood forecasting over the Milano urban area

    NASA Astrophysics Data System (ADS)

    Ravazzani, Giovanni; Amengual, Arnau; Ceppi, Alessandro; Homar, Víctor; Romero, Romu; Lombardi, Gabriele; Mancini, Marco

    2016-08-01

Analysis of ensemble forecasting strategies, which can provide tangible backing for flood early-warning procedures and mitigation measures over the Mediterranean region, is one of the fundamental motivations of the international HyMeX programme. Here, we examine two severe hydrometeorological episodes that affected the Milano urban area and for which the city's complex flood protection system did not completely succeed. Indeed, flood damage has increased exponentially during the last 60 years, owing to industrial and urban development, so improving the Milano flood control system requires a synergy between structural and non-structural approaches. First, we examine how land-use changes due to urban development have altered the hydrological response to intense rainfall. Second, we test a flood forecasting system comprising the Flash-flood Event-based Spatially distributed rainfall-runoff Transformation, including Water Balance (FEST-WB) and the Weather Research and Forecasting (WRF) models. Deep moist convection and extreme precipitation are difficult to forecast accurately, owing to uncertainties arising from the physical parameterizations of the numerical weather prediction (NWP) model and high sensitivity to misrepresentation of the atmospheric state; two hydrological ensemble prediction systems (HEPS) have therefore been designed to cope explicitly with uncertainties in the initial and lateral boundary conditions (ICs/LBCs) and in the physical parameterizations of the NWP model. No substantial differences in skill were found between the two ensemble strategies when considering an enhanced diversity of ICs/LBCs for the perturbed-initial-conditions ensemble, and no additional benefit was found from more frequent LBCs in a mixed-physics ensemble, as the ensemble spread seems to be reduced. These findings could help in designing the most appropriate ensemble strategies for such hydrometeorological extremes, given the computational cost of running such advanced HEPSs for operational purposes.

  3. Retrospective estimation of the electric and magnetic field exposure conditions in in vitro experimental reports reveal considerable potential for uncertainty.

    PubMed

    Portelli, Lucas A; Falldorf, Karsten; Thuróczy, György; Cuppen, Jan

    2018-04-01

Experiments on cell cultures exposed to extremely low frequency (ELF, 3-300 Hz) magnetic fields are often subject to multiple sources of uncertainty associated with the specific electric and magnetic field exposure conditions. Here we systematically quantify these uncertainties based on the exposure conditions described in a group of bioelectromagnetic experimental reports representative of the existing literature. The resulting uncertainties, stemming from insufficient, ambiguous or erroneous description, design, implementation or validation of the experimental methods and systems, were often substantial enough to make any successful reproduction of the original experimental conditions difficult or impossible. Without making any assumption about the true biological relevance of ELF electric and magnetic fields, these findings suggest another contributing factor to the overall variability and irreproducibility traditionally associated with experimental results of in vitro exposures to low-level ELF magnetic fields. Bioelectromagnetics. 39:231-243, 2018. © 2017 Wiley Periodicals, Inc.

  4. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

Uncertainty in the results of a hydraulic model is not only associated with the limitations of that model and shortcomings of the data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in deriving flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes the cascade of uncertainty through the different stages of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model and, finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is to assess the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Because of the complexity of the process, assessing the total uncertainty of maps of inundation probability can be very computationally demanding. As a way forward we present an application of a hydraulic-model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated from simulations of the hydraulic model at each model cross-section. The study shows that the application of the simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions.
Acknowledgements: This work was supported by the project CHIHE (Climate Change Impact on Hydrological Extremes), carried out in the Institute of Geophysics Polish Academy of Sciences, funded by Norway Grants (contract No. Pol-Nor/196243/80/2013). The hydro-meteorological observations were provided by the Institute of Meteorology and Water Management (IMGW), Poland.

  5. Uncertainty in Wildfire Behavior

    NASA Astrophysics Data System (ADS)

    Finney, M.; Cohen, J. D.

    2013-12-01

The challenge of predicting or modeling fire behavior is well recognized by scientists and managers who attempt predictions of fire spread rate or growth. At the scale of the spreading fire, uncertainty in winds, moisture, fuel structure and fire location makes accurate predictions difficult, and the non-linear response of fire spread to these conditions means that average behavior is poorly represented by average environmental parameters. Even more difficult are estimates of threshold behaviors (e.g. spread/no-spread, crown fire initiation, ember generation and spotting), because the fire responds as a step function to small changes in one or more environmental variables, translating into dynamical feedbacks and unpredictability. Recent research shows that ignition of fuel particles, itself a threshold phenomenon, depends on flame contact, which is not steady or uniform. Recent studies of flame structure in both spreading and stationary fires reveal that much of the non-steadiness of the flames as they contact fuel particles results from buoyant instabilities that produce quasi-periodic flame structures. With fuel-particle ignition produced by time-varying heating and short-range flame contact, future improvements in fire behavior modeling will likely require statistical approaches to deal with the uncertainty at all scales, including the level of heat transfer, the fuel arrangement and the weather.

  6. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-10-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models, as well as greenhouse gas scenarios, are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure what is referred to here as AHEAD (Adequate Human livelihood conditions for wEll-being And Development). Based on a trans-disciplinary sample of concepts addressing human well-being and livelihoods, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows for the uncertainty of climate and impact model projections to be identified and differentiated. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that livelihood conditions are compromised by water scarcity in 34 countries. However, more often, AHEAD fulfilment is limited through other elements. 
The analysis shows that the water-specific uncertainty ranges of the model output are outside relevant thresholds for AHEAD for 65 out of 111 countries, and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. In 46 of the countries in the analysis, water-specific uncertainty is relevant to AHEAD. The AHEAD method presented here, together with first results, forms an important step towards making scientific results more applicable for policy decisions.

  7. How much swamp are we talking here?: Propagating uncertainty about the area of coastal wetlands into the U.S. greenhouse gas inventory

    NASA Astrophysics Data System (ADS)

    Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.

    2017-12-01

    Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net-storage and emissions within the Agricultural Forested and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as stable or changing, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light-detection and ranging elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
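One common way to turn map accuracy-assessment results into an area confidence interval, broadly in the spirit of the unbiased area estimation described above, is Monte Carlo propagation of the commission and omission error proportions. The proportions, standard errors and mapped area below are hypothetical, not values from C-CAP or the inventory.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical accuracy-assessment inputs (illustrative only):
mapped_area_ha = 100000.0
p_commission = 0.10   # fraction of mapped wetland that is not actually wetland
p_omission = 0.08     # fraction of true wetland area mapped as something else
se_comm, se_om = 0.02, 0.02   # standard errors from the reference sample

# Monte Carlo propagation: sample the error proportions, adjust the area.
# true = mapped*(1 - comm) + om*true  =>  true = mapped*(1 - comm)/(1 - om)
draws = 100000
comm = rng.normal(p_commission, se_comm, draws)
om = rng.normal(p_omission, se_om, draws)
area = mapped_area_ha * (1.0 - comm) / (1.0 - om)

lo, hi = np.percentile(area, [2.5, 97.5])
print(round(area.mean()), round(lo), round(hi))   # point estimate and 95% CI
```

The width of the resulting interval responds directly to the reference sample size through the standard errors, which is why the number of accuracy-assessment observations appears in the abstract's list of inputs.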

  8. Agriculture-driven deforestation in the tropics from 1990-2015: emissions, trends and uncertainties

    NASA Astrophysics Data System (ADS)

    Carter, Sarah; Herold, Martin; Avitabile, Valerio; de Bruin, Sytze; De Sy, Veronique; Kooistra, Lammert; Rufino, Mariana C.

    2018-01-01

    Limited data exist on emissions from agriculture-driven deforestation, and the available data are typically uncertain. In this paper, we provide comparable estimates of emissions from both all deforestation and agriculture-driven deforestation, with uncertainties, for 91 countries across the tropics between 1990 and 2015. Uncertainties associated with the input datasets (activity data and emission factors) were used to combine the datasets, such that the most certain datasets contribute the most. This method utilizes all the input data while minimizing the uncertainty of the emissions estimate. The uncertainty of input datasets was influenced by the quality of the data, the sample size (for sample-based datasets), and the extent to which the timeframe of the data matches the period of interest. The area of deforestation and the agriculture-driver factor (the extent to which agriculture drives deforestation) were the most uncertain components of the emissions estimates, so improvements in the uncertainties of these estimates will provide the greatest reductions in the uncertainties of emissions estimates. Over the period of the study, Latin America had the highest proportion of deforestation driven by agriculture (78%) and Africa had the lowest (62%). Latin America had the highest emissions from agriculture-driven deforestation, and these peaked at 974 ± 148 Mt CO2 yr-1 in 2000-2005. Africa saw a continuous increase in emissions between 1990 and 2015 (from 154 ± 21 to 412 ± 75 Mt CO2 yr-1), so mitigation initiatives could be prioritized there. Uncertainties for emissions from agriculture-driven deforestation average ± 62.4% over 1990-2015, and were highest in Asia and lowest in Latin America. Uncertainty information is crucial for transparency when reporting and gives credibility to related mitigation initiatives. We demonstrate that uncertainty data can also be useful when combining multiple open datasets, so we recommend that new data providers include this information.
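A minimal sketch of combining independent estimates so that the most certain datasets contribute the most is inverse-variance weighting. The paper's actual combination scheme may differ; the numbers here are illustrative, not its data.

```python
def combine(estimates, uncertainties):
    """Inverse-variance weighted mean of independent estimates.
    estimates: central values; uncertainties: 1-sigma standard errors."""
    w = [1.0 / u ** 2 for u in uncertainties]     # weight = 1 / variance
    total_w = sum(w)
    mean = sum(wi * e for wi, e in zip(w, estimates)) / total_w
    sigma = (1.0 / total_w) ** 0.5                # combined standard error
    return mean, sigma

# Two hypothetical deforestation-area estimates (Mha) with their 1-sigma errors:
mean, sigma = combine([10.0, 12.0], [1.0, 2.0])
# The combined value sits nearer the more certain estimate, and the combined
# uncertainty is smaller than either input uncertainty.
```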

  9. Interpreting Repeated Temperature-Depth Profiles for Groundwater Flow

    NASA Astrophysics Data System (ADS)

    Bense, Victor F.; Kurylyk, Barret L.; van Daal, Jonathan; van der Ploeg, Martine J.; Carey, Sean K.

    2017-10-01

    Temperature can be used to trace groundwater flow because subsurface advection disturbs the conductive thermal regime. Prior hydrogeological studies that have used temperature-depth profiles to estimate vertical groundwater fluxes have either ignored the influence of climate change by employing steady-state analytical solutions, or applied transient techniques to temperature-depth profiles recorded at only a single point in time. Transient analyses of a single profile are predicated on the accurate determination of an unknown profile at some time in the past to form the initial condition. In this study, we use both analytical solutions and a numerical model to demonstrate that boreholes with temperature-depth profiles recorded at multiple times can be analyzed either to overcome the uncertainty associated with estimating unknown initial conditions or to provide an additional check on the profile fitting. We further illustrate that the common approach of assuming a linear initial temperature-depth profile can result in significant errors in groundwater flux estimates. Profiles obtained from a borehole in the Veluwe area, the Netherlands, in both 1978 and 2016 are analyzed as an illustrative example. Since many temperature-depth profiles were collected in the late 1970s and 1980s, these previously profiled boreholes represent a significant and underexploited opportunity to obtain repeat measurements that can be used for similar analyses at other sites around the world.
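The profile fitting discussed above can be illustrated with the classical steady-state advection-conduction solution (Bredehoeft and Papadopulos, 1965), used here as a simpler stand-in for the paper's transient solutions. The depths, temperatures, and the brute-force inversion are all illustrative, not the study's data or method.

```python
import math

def profile(z, L, T_top, T_bot, beta):
    """Steady-state temperature at depth z over a layer of thickness L.
    beta is the Peclet number q*rho_w*c_w*L/k (positive = downward flow)."""
    if abs(beta) < 1e-12:                         # pure conduction: linear
        return T_top + (T_bot - T_top) * z / L
    return T_top + (T_bot - T_top) * math.expm1(beta * z / L) / math.expm1(beta)

def fit_beta(depths, temps, L, T_top, T_bot):
    """Invert a measured profile for beta by brute-force least squares."""
    best, best_err = 0.0, float("inf")
    b = -5.0
    while b <= 5.0:
        err = sum((profile(z, L, T_top, T_bot, b) - t) ** 2
                  for z, t in zip(depths, temps))
        if err < best_err:
            best, best_err = b, err
        b += 0.01
    return best

depths = [10 * i for i in range(11)]              # 0..100 m
synthetic = [profile(z, 100.0, 10.0, 13.0, 1.5) for z in depths]  # downward flow
beta_hat = fit_beta(depths, synthetic, 100.0, 10.0, 13.0)
```

With two profiles from different dates, the same fitting idea can be applied to the difference between them, which removes the dependence on an assumed (e.g. linear) initial profile.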

  10. Encouraging Uncertainty in the "Scientific Method": Promoting Understanding in the Processes of Science with Preservice Teachers

    ERIC Educational Resources Information Center

    Melville, Wayne; Bartley, Anthony; Fazio, Xavier

    2012-01-01

    Teachers' feelings of uncertainty are an overlooked, though crucial, condition necessary for the promotion of educational change. This article investigates the feelings of uncertainty that preservice teachers have toward the conduct of science as inquiry and the extent to which methods courses can confront and embrace those uncertainties. Our work…

  11. Multimodel hydrological ensemble forecasts for the Baskatong catchment in Canada using the TIGGE database.

    NASA Astrophysics Data System (ADS)

    Tito Arandia Martinez, Fabian

    2014-05-01

    Adequate uncertainty assessment is an important issue in hydrological modelling. An important issue for hydropower producers is to obtain ensemble forecasts which truly grasp the uncertainty linked to upcoming streamflows. If properly assessed, this uncertainty can lead to optimal reservoir management and energy production (e.g., [1]). The meteorological inputs to the hydrological model account for an important part of the total uncertainty in streamflow forecasting. Since the creation of the THORPEX initiative and the TIGGE database, meteorological ensemble forecasts from nine agencies throughout the world have become available. This allows for hydrological ensemble forecasts based on multiple meteorological ensemble forecasts. Consequently, both the uncertainty linked to the architecture of the meteorological model and the uncertainty linked to the initial condition of the atmosphere can be accounted for. The main objective of this work is to show that a weighted combination of meteorological ensemble forecasts based on different atmospheric models can lead to improved hydrological ensemble forecasts, for horizons from one to ten days. This experiment is performed for the Baskatong watershed, a head subcatchment of the Gatineau watershed in the province of Quebec, Canada. The Baskatong watershed is of great importance for hydropower production, as it comprises the main reservoir of the Gatineau watershed, on which there are six hydropower plants managed by Hydro-Québec. Since the 1970s, Hydro-Québec has used pseudo-ensemble forecasts based on deterministic meteorological forecasts to which variability derived from past forecasting errors is added. We use a combination of meteorological ensemble forecasts from different models (precipitation and temperature) as the main inputs for the hydrological model HSAMI ([2]). 
    The meteorological ensembles from eight of the nine agencies available through TIGGE are weighted according to their individual performance and combined to form a grand ensemble. Results show that the hydrological forecasts derived from the grand ensemble perform better than the pseudo-ensemble forecasts currently used operationally at Hydro-Québec. References: [1] M. Verbunt, A. Walser, J. Gurtz et al., "Probabilistic flood forecasting with a limited-area ensemble prediction system: Selected case studies," Journal of Hydrometeorology, vol. 8, no. 4, pp. 897-909, Aug. 2007. [2] N. Evora, Valorisation des prévisions météorologiques d'ensemble, Institut de recherche d'Hydro-Québec, 2005. [3] V. Fortin, Le modèle météo-apport HSAMI : historique, théorie et application, Institut de recherche d'Hydro-Québec, 2000.
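One simple way to weight ensembles by individual performance, sketched below, is to weight each agency's ensemble by the inverse of a historical error score and resample members in proportion to the weights. This is an illustrative scheme, not the paper's or Hydro-Québec's actual method, and the member values are invented.

```python
import random

def grand_ensemble(ensembles, past_errors, size, seed=0):
    """ensembles: list of member lists (one list per agency/model).
    past_errors: historical error per model (e.g. mean CRPS); lower is better.
    Returns `size` members drawn with probability inverse to past error."""
    w = [1.0 / e for e in past_errors]            # lower past error -> higher weight
    total = sum(w)
    w = [wi / total for wi in w]
    rng = random.Random(seed)                     # seeded for reproducibility
    picks = rng.choices(range(len(ensembles)), weights=w, k=size)
    return [rng.choice(ensembles[i]) for i in picks]

# Two hypothetical models' streamflow members (m^3/s); model 1 was twice as
# skilful historically, so it contributes about two-thirds of the members:
ge = grand_ensemble([[100, 110, 120], [90, 95, 105]],
                    past_errors=[0.5, 1.0], size=200)
```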

  12. What do we need to measure, how much, and where? A quantitative assessment of terrestrial data needs across North American biomes through data-model fusion and sampling optimization

    NASA Astrophysics Data System (ADS)

    Dietze, M. C.; Davidson, C. D.; Desai, A. R.; Feng, X.; Kelly, R.; Kooper, R.; LeBauer, D. S.; Mantooth, J.; McHenry, K.; Serbin, S. P.; Wang, D.

    2012-12-01

    Ecosystem models are designed to synthesize our current understanding of how ecosystems function and to predict responses to novel conditions, such as climate change. Reducing uncertainties in such models can thus improve both basic scientific understanding and our predictive capacity, but rarely have the models themselves been employed in the design of field campaigns. In the first part of this paper we provide a synthesis of uncertainty analyses conducted using the Predictive Ecosystem Analyzer (PEcAn) ecoinformatics workflow on the Ecosystem Demography model v2 (ED2). This work spans a number of projects synthesizing trait databases and using Bayesian data assimilation techniques to incorporate field data across temperate forests, grasslands, agriculture, short-rotation forestry, boreal forests, and tundra. We report on a number of data needs that span a diverse array of biomes, such as the need for better constraint on growth respiration. We also identify other data needs that are biome specific, such as reproductive allocation in tundra, leaf dark respiration in forestry and early-successional trees, and root allocation and turnover in mid- and late-successional trees. Future data collection needs to balance the unequal distribution of past measurements across biomes (temperate biased) and processes (aboveground biased) with the sensitivities of different processes. In the second part we present the development of a power analysis and sampling optimization module for the PEcAn system. This module uses the results of variance decomposition analyses to estimate the further reduction in model predictive uncertainty for different sample sizes of different variables. By assigning a cost to each measurement type, we apply basic economic theory to optimize the reduction in model uncertainty for any total expenditure, or to determine the cost required to reduce uncertainty to a given threshold. 
    Using this system we find that sampling switches among multiple measurement types but favors those with no prior measurements, due to the need to integrate over prior uncertainty in within- and among-site variability. When starting from scratch in a new system, the optimal design favors initial measurements of SLA due to its high sensitivity and low cost. The value of many data types, such as photosynthetic response curves, depends strongly on whether one includes initial equipment costs or just per-sample costs. Similarly, sampling at previously measured locations is favored when infrastructure costs are high; otherwise across-site sampling is favored over intensive sampling, except when within-site variability strongly dominates.
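The cost-constrained optimization described above can be sketched as a greedy allocation of a measurement budget, where each added sample of a variable shrinks that variable's contribution to predictive variance as var/(n + 1). The variables, variances, and costs below are invented for illustration; PEcAn's actual module works on full variance decompositions.

```python
def allocate(variance, cost, budget):
    """Greedily buy the sample with the best variance reduction per unit cost.
    variance[v]: prior variance contribution of variable v to the prediction.
    cost[v]: cost of one sample of v. Returns samples allocated per variable."""
    n = {v: 0 for v in variance}
    spent = 0.0
    while True:
        best, gain = None, 0.0
        for v in variance:
            if spent + cost[v] > budget:
                continue                          # cannot afford this sample
            # marginal variance reduction of one more sample, per unit cost
            g = (variance[v] / (n[v] + 1) - variance[v] / (n[v] + 2)) / cost[v]
            if g > gain:
                best, gain = v, g
        if best is None:
            return n                              # budget exhausted
        n[best] += 1
        spent += cost[best]

plan = allocate(variance={"SLA": 8.0, "growth_resp": 4.0, "root_turnover": 2.0},
                cost={"SLA": 1.0, "growth_resp": 5.0, "root_turnover": 3.0},
                budget=20.0)
# Cheap, high-variance variables (here SLA) receive samples first, mirroring
# the SLA result above; diminishing returns eventually shift spending elsewhere.
```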

  13. Californium interrogation prompt neutron (CIPN) instrument for non-destructive assay of spent nuclear fuel – design concept and experimental demonstration

    DOE PAGES

    Henzlova, Daniela; Menlove, Howard Olsen; Rael, Carlos D.; ...

    2015-10-09

    Our paper presents results of the first experimental demonstration of the Californium Interrogation Prompt Neutron (CIPN) instrument developed within a multi-year effort launched by the Next Generation Safeguards Initiative Spent Fuel Project of the United States Department of Energy. The goals of this project focused on developing viable non-destructive assay techniques with capabilities to improve an independent verification of spent fuel assembly characteristics. For this purpose, the CIPN instrument combines active and passive neutron interrogation, along with passive gamma-ray measurements, to provide three independent observables. We describe the initial feasibility demonstration of the CIPN instrument, which involved measurements of four pressurized-water-reactor spent fuel assemblies with different levels of burnup and two initial enrichments. The measurements were performed at the Post-Irradiation Examination Facility at the Korea Atomic Energy Institute in the Republic of Korea. The key aim of the demonstration was to evaluate CIPN instrument performance under realistic deployment conditions, with the focus on a detailed assessment of systematic uncertainties that are best evaluated experimentally. The measurements revealed good positioning reproducibility, as well as a high degree of insensitivity of the CIPN instrument's response to irregularities in a radial burnup profile. Systematic uncertainty of individual CIPN instrument signals due to assembly rotation was found to be <4.5%, even for assemblies with fairly extreme gradients in the radial burnup profile. Lastly, these features suggest that the CIPN instrument is capable of providing a good representation of assembly average characteristics, independent of assembly orientation in the instrument.

  15. Initial Results of Aperture Area Comparisons for Exo-Atmospheric Total Solar Irradiance Measurements

    NASA Technical Reports Server (NTRS)

    Johnson, B. Carol; Litorja, Maritoni; Fowler, Joel B.; Butler, James J.

    2009-01-01

    In the measurement of exo-atmospheric total solar irradiance (TSI), instrument aperture area is a critical component in converting solar radiant flux to irradiance. In a May 2000 calibration workshop for the Total Irradiance Monitor (TIM) on the Earth Observing System (EOS) Solar Radiation and Climate Experiment (SORCE), the solar irradiance measurement community recommended that NASA and NIST coordinate an aperture area measurement comparison to quantify and validate aperture area uncertainties and their overall effect on TSI uncertainties. From May 2003 to February 2006, apertures from four institutions with links to the historical TSI database were measured by NIST, and the results were compared to the aperture areas determined by each institution. The initial results of these comparisons are presented and preliminary assessments of the participants' uncertainties are discussed.

  16. Measuring the uncertainties of discharge measurements: interlaboratory experiments in hydrometry

    NASA Astrophysics Data System (ADS)

    Le Coz, Jérôme; Blanquart, Bertrand; Pobanz, Karine; Dramais, Guillaume; Pierrefeu, Gilles; Hauet, Alexandre; Despax, Aurélien

    2015-04-01

    Quantifying the uncertainty of streamflow data is key for the hydrological sciences. Conventional uncertainty analysis based on error propagation techniques is restricted by the absence of traceable discharge standards and by the weight of difficult-to-predict errors related to the operator, procedure and measurement environment. Field interlaboratory experiments have recently emerged as an efficient, standardized method to 'measure' the uncertainties of a given streamgauging technique under given measurement conditions. Both uncertainty approaches are compatible and should be developed jointly in the field of hydrometry. In recent years, several interlaboratory experiments have been reported by different hydrological services. They involved different streamgauging techniques, including acoustic profilers (ADCP), current-meters and handheld radars (SVR). Uncertainty analysis was not always their primary goal: most often, testing the proficiency and homogeneity of instruments, makes and models, procedures and operators was the original motivation. When interlaboratory experiments are processed for uncertainty analysis, once outliers have been discarded, all participants are assumed to be equally skilled and to apply the same streamgauging technique in equivalent conditions. A universal requirement is that all participants simultaneously measure the same discharge, which shall be kept constant within negligible variations. To the best of our knowledge, we were the first to apply the interlaboratory method to computing the uncertainties of streamgauging techniques according to the authoritative international documents (ISO standards). Several specific issues arise due to the measurement conditions in outdoor canals and rivers. The main limitation is that the best available river discharge references are usually too uncertain to quantify the bias of the streamgauging technique, i.e. the systematic errors that are common to all participants in the experiment. 
    A reference, or a sensitivity analysis to the fixed parameters of the streamgauging technique, remains very useful for estimating the uncertainty related to the (non-quantified) bias correction. In the absence of a reference, the uncertainty estimate is referenced to the average of all discharge measurements in the interlaboratory experiment, ignoring the technique bias. Simple equations can be used to assess the uncertainty of the uncertainty results, as a function of the number of participants and of repeated measurements. The interlaboratory method was applied to several interlaboratory experiments on ADCPs and current-meters mounted on wading rods, in streams of different sizes and aspects, with typically 10 to 30 instruments. The uncertainty results were consistent with the usual expert judgment and depended strongly on the measurement environment. Approximately, the expanded uncertainties (within the 95% probability interval) were ±5% to ±10% for ADCPs in good or poor conditions, and ±10% to ±15% for current-meters in shallow creeks. Due to the specific limitations related to a slow measurement process and to small, natural streams, uncertainty results for current-meters were more uncertain than for ADCPs, for which site-specific errors were clearly evidenced. The proposed method can be applied, in a standardized way, to a wide range of interlaboratory experiments conducted in contrasting environments for different streamgauging techniques. Ideally, an international open database would enhance the investigation of hydrological data uncertainties according to the characteristics of the measurement conditions and procedures. Such a dataset could be used for implementing and validating uncertainty propagation methods in hydrometry.
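The core interlaboratory computation, in the spirit of ISO 5725 as described above, can be sketched in a few lines: with no traceable reference, the uncertainty is referenced to the grand mean of all gaugings, and the between-participant spread gives the expanded uncertainty. The gauging values below are invented.

```python
import statistics

def interlab_uncertainty(measurements, k=2.0):
    """measurements: one discharge value per participant, all gauging the
    same constant flow. Returns the expanded relative uncertainty (%),
    using coverage factor k ~ 2 for an approximate 95% interval."""
    mean = statistics.fmean(measurements)         # reference = grand mean
    s = statistics.stdev(measurements)            # between-participant std dev
    return k * s / mean * 100.0

# Hypothetical simultaneous ADCP gaugings of the same discharge (m^3/s):
U = interlab_uncertainty([98.0, 101.0, 103.0, 97.0, 102.0, 99.0])
```

Note that any bias common to all participants (e.g. a technique bias) cancels out of the spread and is therefore not captured, which is exactly the limitation discussed above.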

  17. Propagation of uncertainty in nasal spray in vitro performance models using Monte Carlo simulation: Part II. Error propagation during product performance modeling.

    PubMed

    Guo, Changning; Doub, William H; Kauffman, John F

    2010-08-01

    Monte Carlo simulations were applied to investigate the propagation of uncertainty in both input variables and response measurements on model prediction for nasal spray product performance design of experiments (DOE) models in the first part of this study, under the initial assumption that the models perfectly represent the relationship between input variables and the measured responses. In this article, we discard that assumption and extend the Monte Carlo simulation study to examine the influence of both input variable variation and product performance measurement variation on the uncertainty in DOE model coefficients. The Monte Carlo simulations presented in this article illustrate the importance of careful error propagation during product performance modeling. Our results show that error estimates based on Monte Carlo simulation yield smaller model coefficient standard deviations than those from regression methods, suggesting that the estimated standard deviations from regression may overestimate the uncertainties in the model coefficients. Monte Carlo simulations provide a simple software solution for understanding the propagation of uncertainty in complex DOE models, so that the design space can be specified with statistically meaningful confidence levels. (c) 2010 Wiley-Liss, Inc. and the American Pharmacists Association
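The Monte Carlo idea described above can be sketched with a toy model: perturb both the input variable and the measured response with their measurement errors, refit a simple linear model each time, and look at the spread of the fitted slope. The model, noise levels, and data are invented for illustration, not the paper's DOE models.

```python
import random
import statistics

def fit_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

rng = random.Random(42)
x_true = [1.0, 2.0, 3.0, 4.0, 5.0]
y_true = [2.0 * xi + 1.0 for xi in x_true]        # "true" relationship: slope 2

slopes = []
for _ in range(2000):
    x = [xi + rng.gauss(0, 0.05) for xi in x_true]   # input-variable error
    y = [yi + rng.gauss(0, 0.10) for yi in y_true]   # response-measurement error
    slopes.append(fit_slope(x, y))

slope_sd = statistics.stdev(slopes)               # MC estimate of coefficient uncertainty
```

Comparing `slope_sd` with the standard error reported by a single regression fit is the kind of comparison the article makes between Monte Carlo and regression-based error estimates.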

  18. Decision-Making under Criteria Uncertainty

    NASA Astrophysics Data System (ADS)

    Kureychik, V. M.; Safronenkova, I. B.

    2018-05-01

    Uncertainty is an essential part of a decision-making procedure. The paper deals with the problem of decision-making under criteria uncertainty. In this context, decision-making under uncertainty, and the types and conditions of uncertainty, are examined. The decision-making problem under uncertainty is formalized, and a modification of a mathematical decision-support method under uncertainty via ontologies is proposed. A distinctive feature of the developed method is its use of ontologies as base elements. The goal of this work is the development of a decision-making method under criteria uncertainty that uses ontologies, applied to multilayer board design. The method aims to improve the technical and economic characteristics of designs in this domain.

  19. Decomposing the uncertainty in climate impact projections of Dynamic Vegetation Models: a test with the forest models LANDCLIM and FORCLIM

    NASA Astrophysics Data System (ADS)

    Cailleret, Maxime; Snell, Rebecca; von Waldow, Harald; Kotlarski, Sven; Bugmann, Harald

    2015-04-01

    Different levels of uncertainty should be considered in climate impact projections by Dynamic Vegetation Models (DVMs), particularly when it comes to managing climate risks. Such information is useful to detect the key processes and uncertainties in the climate model - impact model chain and may be used to support recommendations for future improvements in the simulation of both climate and biological systems. In addition, determining which uncertainty source is dominant is an important aspect to recognize the limitations of climate impact projections by a multi-model ensemble mean approach. However, to date, few studies have clarified how each uncertainty source (baseline climate data, greenhouse gas emission scenario, climate model, and DVM) affects the projection of ecosystem properties. Focusing on one greenhouse gas emission scenario, we assessed the uncertainty in the projections of a forest landscape model (LANDCLIM) and a stand-scale forest gap model (FORCLIM) that is caused by linking climate data with an impact model. LANDCLIM was used to assess the uncertainty in future landscape properties of the Visp valley in Switzerland that is due to (i) the use of different 'baseline' climate data (gridded data vs. data from weather stations), and (ii) differences in climate projections among 10 GCM-RCM chains. This latter point was also considered for the projections of future forest properties by FORCLIM at several sites along an environmental gradient in Switzerland (14 GCM-RCM chains), for which we also quantified the uncertainty caused by (iii) the model chain specific statistical properties of the climate time-series, and (iv) the stochasticity of the demographic processes included in the model, e.g., the annual number of saplings that establish, or tree mortality. 
    Using methods of variance decomposition analysis, we found that (i) the use of different baseline climate data strongly impacts the prediction of forest properties at the lowest and highest elevations, but much less at intermediate elevations; (ii) considering climate change, the variability due to the GCM-RCM chains is much greater than the variability induced by the uncertainty in the initial climatic conditions; and (iii) the uncertainties caused by the intrinsic stochasticity in the DVMs and by the random generation of the climate time-series are negligible. Overall, our results indicate that DVMs are quite sensitive to the climate data, highlighting particularly (1) the limitations of using a single multi-model average climate change scenario in climate impact studies and (2) the need to better account for the uncertainty in climate model outputs when projecting future vegetation changes.
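A toy sketch of the variance decomposition used above: given a table of projections indexed by (climate chain, stochastic replicate), split the total variance into a between-chain part and a within-chain (stochastic) part. The chain names and numbers are invented for illustration.

```python
import statistics

def decompose(table):
    """table: dict mapping chain name -> list of replicate outcomes
    (equal numbers of replicates per chain assumed for simplicity).
    Returns (between-chain variance, mean within-chain variance)."""
    all_vals = [v for reps in table.values() for v in reps]
    grand = statistics.fmean(all_vals)
    chain_means = {c: statistics.fmean(r) for c, r in table.items()}
    between = statistics.fmean([(m - grand) ** 2 for m in chain_means.values()])
    within = statistics.fmean([statistics.pvariance(r) for r in table.values()])
    return between, within

between, within = decompose({
    "GCM-RCM-1": [410, 412, 408],                 # e.g. simulated basal area
    "GCM-RCM-2": [520, 523, 517],                 # per stochastic replicate
    "GCM-RCM-3": [300, 305, 295],
})
# Here the between-chain variance dominates the stochastic within-chain
# variance, the same qualitative pattern as findings (ii) and (iii).
```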

  20. Thyroglobulin assay in fluids from lymph node fine needle-aspiration washout: influence of pre-analytical conditions.

    PubMed

    Casson, Florence Boux de; Moal, Valérie; Gauchez, Anne-Sophie; Moineau, Marie-Pierre; Sault, Corinne; Schlageter, Marie-Hélène; Massart, Catherine

    2017-04-01

    The aim of this study was to evaluate the pre-analytical factors contributing to uncertainty in thyroglobulin measurement in fluids from fine-needle aspiration (FNA) washout of cervical lymph nodes. We studied pre-analytical stability, in different conditions, of 41 samples prepared with concentrated solutions of thyroglobulin (FNA washout or certified standard) diluted in physiological saline solution or buffer containing 6% albumin. In this buffer, over time, no changes in thyroglobulin concentrations were observed in all storage conditions tested. In albumin free saline solution, thyroglobulin recovery rates depended on initial sample concentrations and on modalities of their conservation (in conventional storage tubes, recovery mean was 56% after 3 hours-storage at room temperature and 19% after 24 hours-storage for concentrations ranged from 2 to 183 μg/L; recovery was 95%, after 3 hours or 24 hours-storage at room temperature, for a concentration of 5,656 μg/L). We show here that these results are due to non-specific adsorption of thyroglobulin in storage tubes, which depends on sample protein concentrations. We also show that possible contamination of fluids from FNA washout by plasma proteins do not always adequately prevent this adsorption. In conclusion, non-specific adsorption in storage tubes strongly contributes to uncertainty in thyroglobulin measurement in physiological saline solution. It is therefore recommended, for FNA washout, to use a buffer containing proteins provided by the laboratory.

  1. The impact of the uncertainty in the initial soil moisture condition of irrigated areas on the spatiotemporal characteristics of convective activity in Central Greece

    NASA Astrophysics Data System (ADS)

    Kotsopoulos, Stylianos; Ioannis, Tegoulias; Ioannis, Pytharoulis; Stergios, Kartsios; Dimitrios, Bampzelis; Theodore, Karacostas

    2015-04-01

    The region of Thessaly is the second largest plain in Greece and has a vital role in the financial life of the country because of its significant agricultural production. The intensive and extensive cultivation of irrigated crops, in combination with the population increase and the alteration of precipitation patterns due to climate change, often leads the region to experience severe drought conditions, especially during the warm period of the year. The aim of the DAPHNE project is to tackle the problem of drought in this area by means of weather modification. In the framework of the DAPHNE project, the numerical weather prediction model WRF-ARW 3.5.1 is used to provide operational forecasts and hindcasts for the region of Thessaly. The goal of this study is to investigate the impact of the uncertainty in the initial soil moisture condition of irrigated areas on the spatiotemporal characteristics of convective activity in the region of interest. To this end, six cases under the six most frequent synoptic conditions associated with convective activity in the region of interest are utilized, each with six different soil moisture initialization scenarios. In the first scenario (Control Run), the model is initialized with the surface soil moisture of the ECMWF analysis data, which usually does not take into account the modification of soil moisture due to agricultural activity in the area of interest. In the other five scenarios (Experiments 1-5), the soil moisture in the upper soil layers of the study area is modified from -50% to +50% of field capacity (-50%FC, -25%FC, FC, +25%FC, +50%FC) for the irrigated cropland. Three model domains, covering Europe, the Mediterranean Sea and northern Africa (d01), the wider area of Greece (d02), and central Greece - the Thessaly region (d03), are used at horizontal grid spacings of 15 km, 5 km and 1 km, respectively. ECMWF operational analyses at 6-hourly intervals (0.25° × 0.25° lat.-long.) 
    are imported as initial and boundary conditions of the coarse domain, while in the vertical all nests employ 39 sigma levels (up to 50 hPa) with increased resolution in the boundary layer. Microphysical processes are represented by the WSM6 scheme, sub-grid-scale convection by the Kain-Fritsch scheme, longwave and shortwave radiation by the RRTMG scheme, the surface layer by Monin-Obukhov (MM5), the boundary layer by the Yonsei University scheme, and the soil surface by the NOAH unified land-surface model. The model results are evaluated against surface precipitation data and data obtained using a C-band (5 cm) weather radar located in the centre of the innermost domain. Acknowledgements: This research is co-financed by the European Union (European Regional Development Fund) and Greek national funds, through the action "COOPERATION 2011: Partnerships of Production and Research Institutions in Focused Research and Technology Sectors" (contract number 11SYN_8_1088 - DAPHNE) in the framework of the operational programme "Competitiveness and Entrepreneurship" and Regions in Transition (OPC II, NSRF 2007-2013).
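The scenario construction can be illustrated with a minimal sketch: over irrigated grid cells, replace the analysis soil moisture with a chosen fraction of field capacity, leaving other cells at the control value. The tiny "grid", the field values, and the interpretation of the scenario labels as fractions of field capacity are all assumptions for illustration; a real setup would edit the WRF input fields.

```python
def apply_scenario(soil_moisture, irrigated_mask, field_capacity, fraction):
    """Return soil moisture with irrigated cells set to fraction * FC."""
    return [fc * fraction if irr else sm
            for sm, irr, fc in zip(soil_moisture, irrigated_mask, field_capacity)]

control = [0.18, 0.22, 0.30, 0.15]                # analysis values (m^3/m^3)
mask    = [False, True, True, False]              # irrigated cropland cells
fc      = [0.32, 0.34, 0.33, 0.31]                # field capacity per cell

exp_dry = apply_scenario(control, mask, fc, 0.50)   # e.g. the driest scenario
exp_wet = apply_scenario(control, mask, fc, 1.50)   # e.g. the wettest scenario
# Non-irrigated cells keep the control (analysis) soil moisture in every run.
```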

  2. Can hydraulic-modelled rating curves reduce uncertainty in high flow data?

    NASA Astrophysics Data System (ADS)

    Westerberg, Ida; Lam, Norris; Lyon, Steve W.

    2017-04-01

    Flood risk assessments rely on accurate discharge data records. Establishing a reliable rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived based on as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. In this study we compared the uncertainty in discharge data that resulted from these two rating curve modelling approaches. We applied both methods to a Swedish catchment, accounting for uncertainties in the stage-discharge gauging and water-surface slope data for the hydraulic model and in the stage-discharge gauging data and rating-curve parameters for the traditional method. We focused our analyses on high-flow uncertainty and the factors that could reduce this uncertainty. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken. First results show that the hydraulically-modelled rating curves were more sensitive to uncertainties in the calibration measurements of discharge than water surface slope. The uncertainty of the hydraulically-modelled rating curves were lowest within the range of the three calibration stage-discharge gaugings (i.e. between median and two-times median flow) whereas uncertainties were higher outside of this range. For instance, at the highest observed stage of the 24-year stage record, the 90% uncertainty band was -15% to +40% of the official rating curve. Additional gaugings at high flows (i.e. four to five times median flow) would likely substantially reduce those uncertainties. 
These first results show the potential of the hydraulically-modelled curves, particularly where the calibration gaugings are of high quality and cover a wide range of flow conditions.
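The effect of gauging uncertainty on an extrapolated rating curve can be illustrated with a small bootstrap sketch. All stage-discharge values, the cease-to-flow stage, and the power-law curve form below are illustrative assumptions, not data or methods from the study:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stage-discharge gaugings (stage h in m, discharge Q in m^3/s);
# a real rating curve would be fitted to field measurements.
h = np.array([0.42, 0.55, 0.71, 0.88, 1.10, 1.35])
q = np.array([1.8, 3.1, 5.2, 8.0, 12.6, 19.0])
H0 = 0.2  # assumed cease-to-flow stage

def fit_rating_curve(h, q, h0=H0):
    """Fit Q = a * (h - h0)^b by least squares in log space."""
    b, log_a = np.polyfit(np.log(h - h0), np.log(q), 1)
    return np.exp(log_a), b

# Bootstrap the gaugings to get a 90% band at a stage above the gauged range.
h_pred, samples = 1.6, []
for _ in range(2000):
    idx = rng.integers(0, h.size, h.size)
    if np.unique(idx).size < 3:   # need enough distinct points for a fit
        continue
    a, b = fit_rating_curve(h[idx], q[idx])
    samples.append(a * (h_pred - H0) ** b)

lo, hi = np.percentile(samples, [5, 95])
print(f"90% uncertainty band at h = {h_pred} m: {lo:.1f}-{hi:.1f} m^3/s")
```

The band widens quickly above the gauged range, which is the behaviour the abstract reports for stages beyond the calibration gaugings.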

  3. Parameter uncertainty and nonstationarity in regional extreme rainfall frequency analysis in Qu River Basin, East China

    NASA Astrophysics Data System (ADS)

    Zhu, Q.; Xu, Y. P.; Gu, H.

    2014-12-01

Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Moreover, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depths estimated by L-moment-based regional frequency analysis and by at-site analysis. Meanwhile, a method combining generalized additive models with a 30-year moving window is employed to analyze the non-stationarity in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with 100-year return period under stationary conditions estimated by the regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3 respectively. On the at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3 respectively. Under non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to 100-year return period) are 23.09% and 13.83% with GEV and PE3 respectively. Comparing the 90% confidence intervals, the uncertainty of design rainfall depth resulting from parameter uncertainty is less than that from non-stationary frequency analysis with GEV, but slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales.
The non-stationary analysis also shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision making in water resources management and risk management.
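A minimal sketch of the bootstrap idea for quantifying design-rainfall uncertainty, using a synthetic annual-maximum record and a GEV fit. The distribution parameters, record length, and return period are assumptions for illustration; the paper's spatial bootstrap over a whole region is more involved:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(7)

# Synthetic annual-maximum rainfall depths (mm); stands in for a station record.
true = genextreme(c=-0.1, loc=80, scale=20)
annmax = true.rvs(size=60, random_state=rng)

def return_level(data, T=100):
    """T-year return level from a maximum-likelihood GEV fit."""
    c, loc, scale = genextreme.fit(data)
    return genextreme(c, loc, scale).ppf(1 - 1 / T)

est = return_level(annmax)
# Ordinary (non-spatial) bootstrap: refit the GEV to resampled records.
boot = [return_level(rng.choice(annmax, annmax.size)) for _ in range(500)]
lo, hi = np.percentile(boot, [5, 95])
half_width_pct = 100 * (hi - lo) / (2 * est)
print(f"100-yr depth ~ {est:.0f} mm, 90% CI ({lo:.0f}, {hi:.0f}), ~+/-{half_width_pct:.0f}%")
```

Reporting the interval half-width as a percentage of the estimate mirrors how the abstract quotes its uncertainty figures (e.g. 15.07% with GEV).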

  4. Uncertainty Evaluation of Measurements with Pyranometers and Pyrheliometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konings, Jorgen; Habte, Aron

    2016-01-03

Evaluating the performance of photovoltaic (PV) cells, modules, arrays and systems relies on accurate measurement of the available solar radiation resource. Solar radiation resources are measured using radiometers such as pyranometers (global horizontal irradiance) and pyrheliometers (direct normal irradiance). The accuracy of solar radiation data measured by radiometers depends not only on the specification of the instrument but also on a) the calibration procedure, b) the measurement conditions and maintenance, and c) the environmental conditions. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This paper provides guidelines and recommended procedures for estimating the uncertainty in measurements by radiometers using the Guide to the Expression of Uncertainty in Measurement (GUM) method. Special attention is paid to the concept of data availability and its link to uncertainty evaluation.
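For uncorrelated inputs, the GUM procedure reduces to a root-sum-square of sensitivity-weighted standard uncertainties, followed by an expanded uncertainty with a coverage factor. A sketch with a made-up pyranometer budget (all component values are invented for illustration; a real budget comes from the instrument's calibration certificate and deployment conditions):

```python
import math

# Illustrative uncertainty budget for a pyranometer measurement.
# Each entry: (standard uncertainty in % of reading, sensitivity coefficient)
budget = {
    "calibration":      (0.9, 1.0),
    "directional":      (1.0, 1.0),
    "temperature":      (0.4, 1.0),
    "nonlinearity":     (0.2, 1.0),
    "data_acquisition": (0.1, 1.0),
}

# GUM law of propagation of uncertainty (uncorrelated inputs):
# combined standard uncertainty is the root-sum-square of c_i * u_i.
u_c = math.sqrt(sum((u * c) ** 2 for u, c in budget.values()))
U = 2.0 * u_c  # expanded uncertainty, coverage factor k=2 (~95% coverage)
print(f"u_c = {u_c:.2f}%, U(k=2) = {U:.2f}%")
```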

  5. Analysis of uncertainties in turbine metal temperature predictions

    NASA Technical Reports Server (NTRS)

    Stepka, F. S.

    1980-01-01

    An analysis was conducted to examine the extent to which various factors influence the accuracy of analytically predicting turbine blade metal temperatures and to determine the uncertainties in these predictions for several accuracies of the influence factors. The advanced turbofan engine gas conditions of 1700 K and 40 atmospheres were considered along with those of a highly instrumented high temperature turbine test rig and a low temperature turbine rig that simulated the engine conditions. The analysis showed that the uncertainty in analytically predicting local blade temperature was as much as 98 K, or 7.6 percent of the metal absolute temperature, with current knowledge of the influence factors. The expected reductions in uncertainties in the influence factors with additional knowledge and tests should reduce the uncertainty in predicting blade metal temperature to 28 K, or 2.1 percent of the metal absolute temperature.

  6. MOMENTS OF UNCERTAINTY: ETHICAL CONSIDERATIONS AND EMERGING CONTAMINANTS

    PubMed Central

    Cordner, Alissa; Brown, Phil

    2013-01-01

    Science on emerging environmental health threats involves numerous ethical concerns related to scientific uncertainty about conducting, interpreting, communicating, and acting upon research findings, but the connections between ethical decision making and scientific uncertainty are under-studied in sociology. Under conditions of scientific uncertainty, researcher conduct is not fully prescribed by formal ethical codes of conduct, increasing the importance of ethical reflection by researchers, conflicts over research conduct, and reliance on informal ethical standards. This paper draws on in-depth interviews with scientists, regulators, activists, industry representatives, and fire safety experts to explore ethical considerations of moments of uncertainty using a case study of flame retardants, chemicals widely used in consumer products with potential negative health and environmental impacts. We focus on the uncertainty that arises in measuring people’s exposure to these chemicals through testing of their personal environments or bodies. We identify four sources of ethical concerns relevant to scientific uncertainty: 1) choosing research questions or methods, 2) interpreting scientific results, 3) communicating results to multiple publics, and 4) applying results for policy-making. This research offers lessons about professional conduct under conditions of uncertainty, ethical research practice, democratization of scientific knowledge, and science’s impact on policy. PMID:24249964

  7. Mass-loss rates of cool stars

    NASA Astrophysics Data System (ADS)

Decin, Leen

    2015-08-01

Over much of the initial mass function, stars lose a significant fraction of their mass through a stellar wind during the late stages of their evolution as (super)giants. As of today, we cannot yet predict the mass-loss rate during the (super)giant phase for a given star with specific stellar parameters from first principles. This uncertainty directly impacts the accuracy of current stellar evolution and population synthesis models that predict the enrichment of the interstellar medium by these stellar winds. Efforts to establish the link between the initial physical and chemical conditions at stellar birth and the mass-loss rate during the (super)giant phase have proceeded on two separate tracks: (1) more detailed studies of the chemical and morpho-kinematical structure of the stellar winds of (super)giant stars in our own Milky Way, by virtue of their proximity, and (2) large-scale statistical studies of (large) samples of stars in other galaxies (such as the LMC and SMC) and globular clusters, eliminating the uncertainty on the distance estimate and providing insight into the dependence of the mass-loss rate on metallicity. In this review, I will present recent results of both tracks, show how recent measurements confirm (some) theoretical predictions, and also show how results from the first track warn against common misconceptions inherent in the simplified analyses often used for the large samples of track 2.

  8. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NASA Astrophysics Data System (ADS)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  9. Analysis and Testing of a LIDAR-Based Approach to Terrain Relative Navigation for Precise Lunar Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Ivanov, Tonislav I.

    2011-01-01

    To increase safety and land near pre-deployed resources, future NASA missions to the moon will require precision landing. A LIDAR-based terrain relative navigation (TRN) approach can achieve precision landing under any lighting conditions. This paper presents results from processing flash lidar and laser altimeter field test data that show LIDAR TRN can obtain position estimates with errors of less than 90 m while automatically detecting and eliminating incorrect measurements using internal metrics on terrain relief and data correlation. Sensitivity studies show that the algorithm has no degradation in matching performance with initial position uncertainties up to 1.6 km.

  10. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show that it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  11. Assessing Uncertainties in Surface Water Security: A Probabilistic Multi-model Resampling approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, D. B. B.

    2015-12-01

    Various uncertainties are involved in the representation of processes that characterize interactions between societal needs, ecosystem functioning, and hydrological conditions. Here, we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multi-model and resampling framework. We consider several uncertainty sources including those related to: i) observed streamflow data; ii) hydrological model structure; iii) residual analysis; iv) the definition of the Environmental Flow Requirement method; v) the definition of critical conditions for water provision; and vi) the critical demand imposed by human activities. We estimate the overall uncertainty coming from the hydrological model by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km² agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multi-model framework and provided by each model uncertainty estimation approach. The method is general and can be easily extended, forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.

  12. Computation of probabilistic hazard maps and source parameter estimation for volcanic ash transport and dispersion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madankan, R.; Pouget, S.; Singla, P., E-mail: psingla@buffalo.edu

    Volcanic ash advisory centers are charged with forecasting the movement of volcanic ash plumes, for aviation, health and safety preparation. Deterministic mathematical equations model the advection and dispersion of these plumes. However initial plume conditions – height, profile of particle location, volcanic vent parameters – are known only approximately at best, and other features of the governing system such as the windfield are stochastic. These uncertainties make forecasting plume motion difficult. As a result of these uncertainties, ash advisories based on a deterministic approach tend to be conservative, and many times over/under estimate the extent of a plume. This paper presents an end-to-end framework for generating a probabilistic approach to ash plume forecasting. This framework uses an ensemble of solutions, guided by the Conjugate Unscented Transform (CUT) method for evaluating expectation integrals. This ensemble is used to construct a polynomial chaos expansion that can be sampled cheaply, to provide a probabilistic model forecast. The CUT method is then combined with a minimum variance condition, to provide a full posterior pdf of the uncertain source parameters, based on observed satellite imagery. The April 2010 eruption of the Eyjafjallajökull volcano in Iceland is employed as a test example. The puff advection/dispersion model is used to hindcast the motion of the ash plume through time, concentrating on the period 14–16 April 2010. Variability in the height and particle loading of that eruption is introduced through a volcano column model called bent. Output uncertainty due to the assumed uncertain input parameter probability distributions, and a probabilistic spatial-temporal estimate of ash presence are computed.

  13. A New Multivariate Approach in Generating Ensemble Meteorological Forcings for Hydrological Forecasting

    NASA Astrophysics Data System (ADS)

    Khajehei, Sepideh; Moradkhani, Hamid

    2015-04-01

    Hydrologic ensemble forecasts are subject to various sources of uncertainty, including meteorological forcing, initial conditions, model structure, and model parameters. Producing reliable and skillful precipitation ensemble forecasts is one approach to reducing the total uncertainty in hydrological applications. Numerical Weather Prediction (NWP) models currently provide ensemble forecasts over various temporal ranges, but their raw products are known to be biased in both mean and spread. Given the above state, there is a need for methods that can generate reliable ensemble forecasts for hydrological applications. One common technique is to apply statistical procedures that generate ensemble forecasts from NWP single-value forecasts, based on the bivariate probability distribution of observations and single-value precipitation forecasts. However, one assumption of the current method is that Gaussian distributions fit the marginal distributions of the observed and modeled climate variables. Here, we describe and evaluate a Bayesian approach based on copula functions to develop an ensemble precipitation forecast from the conditional distribution of single-value precipitation forecasts. Copula functions are multivariate joint distributions of univariate marginals, presented here as an alternative procedure for capturing the uncertainties related to meteorological forcing; they are capable of modeling the joint distribution of two variables with any level of correlation and dependency. This study is conducted over a sub-basin of the Columbia River Basin in the USA, using monthly precipitation forecasts from the Climate Forecast System (CFS) at 0.5° × 0.5° spatial resolution to reproduce the observations.
The verification is conducted on a different period, and the procedure is compared with the Ensemble Pre-Processor approach currently used by the National Weather Service River Forecast Centers in the USA.
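A Gaussian-copula variant of the conditional-ensemble idea can be sketched as follows. The data are synthetic, and a Gaussian copula with empirical marginals stands in for whichever copula family the study actually uses:

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)

# Synthetic climatology: paired observed and single-value forecast precip (mm).
fcst = rng.gamma(2.0, 10.0, 300)
obs = 0.8 * fcst + rng.gamma(1.5, 4.0, 300)

def normal_scores(x):
    """Map data to standard-normal space via empirical ranks (Weibull positions)."""
    return norm.ppf(rankdata(x) / (len(x) + 1))

z_f, z_o = normal_scores(fcst), normal_scores(obs)
rho = np.corrcoef(z_f, z_o)[0, 1]  # Gaussian-copula dependence parameter

# Condition on a new single-value forecast via its climatological normal score.
new_fcst = 35.0
z_new = norm.ppf(np.clip((fcst < new_fcst).mean(), 0.01, 0.99))
# Conditional normal: mean rho*z_new, sd sqrt(1-rho^2); sample a 20-member ensemble.
z_ens = rng.normal(rho * z_new, np.sqrt(1 - rho**2), 20)
# Back-transform to precipitation via empirical quantiles of the observations.
ens = np.quantile(obs, norm.cdf(z_ens))
print(np.round(np.sort(ens), 1))
```

The key property is that the ensemble spread comes from the conditional distribution given the single-value forecast, so dependence between forecast and observation is respected without assuming Gaussian marginals for precipitation itself.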

  14. The precautionary principle within European Union public health policy. The implementation of the principle under conditions of supranationality and citizenship.

    PubMed

    Antonopoulou, Lila; van Meurs, Philip

    2003-11-01

    The present study examines the precautionary principle within the parameters of public health policy in the European Union, regarding both its meaning, as it has been shaped by relevant EU institutions and their counterparts within the Member States, and its implementation in practice. In the initial section I concentrate on the methodological question of "scientific uncertainty" concerning the calculation of risk and possible damage. Calculation of risk in many cases justifies the adopting of preventive measures, but, as it is argued, the principle of precaution and its implementation cannot be wholly captured by a logic of calculation; such a principle does not only contain scientific uncertainty-as the preventive principle does-but it itself is generated as a principle by this scientific uncertainty, recognising the need for a society to act. Thus, the implementation of the precautionary principle is also a simultaneous search for justification of its status as a principle. This justification would result in the adoption of precautionary measures against risk although no proof of this principle has been produced based on the "cause-effect" model. The main part of the study is occupied with an examination of three cases from which the stance of the official bodies of the European Union towards the precautionary principle and its implementation emerges: the case of "mad cow" disease, the case of production and commercialization of genetically modified foodstuffs. The study concludes with the assessment that the effective implementation of the precautionary principle on a European level depends on the emergence of a concerned Europe-wide citizenship and its acting as a mechanism to counteract the material and social conditions that pose risks for human health.

  15. Conflict Resolution for Wind-Optimal Aircraft Trajectories in North Atlantic Oceanic Airspace with Wind Uncertainties

    NASA Technical Reports Server (NTRS)

    Rodionova, Olga; Sridhar, Banavar; Ng, Hok K.

    2016-01-01

    Air traffic in the North Atlantic oceanic airspace (NAT) experiences very strong winds caused by jet streams. Flying wind-optimal trajectories increases individual flight efficiency, which is advantageous when operating in the NAT. However, as the NAT is highly congested during peak hours, a large number of potential conflicts between flights are detected for the sets of wind-optimal trajectories. Conflict resolution performed at the strategic level of flight planning can significantly reduce the airspace congestion. However, being completed far in advance, strategic planning can only use predicted environmental conditions that may significantly differ from the real conditions experienced further by aircraft. The forecast uncertainties result in uncertainties in conflict prediction, and thus, conflict resolution becomes less efficient. This work considers wind uncertainties in order to improve the robustness of conflict resolution in the NAT. First, the influence of wind uncertainties on conflict prediction is investigated. Then, conflict resolution methods accounting for wind uncertainties are proposed.

  16. Space Shuttle stability and control flight test techniques

    NASA Technical Reports Server (NTRS)

    Cooke, D. R.

    1980-01-01

    A unique approach for obtaining vehicle aerodynamic characteristics during entry has been developed for the Space Shuttle. This is due to the high cost of Shuttle testing, the need to open constraints for operational flights, and the fact that all flight regimes are flown starting with the first flight. Because of uncertainties associated with predicted aerodynamic coefficients, nine flight conditions have been identified at which control problems could occur. A detailed test plan has been developed for testing at these conditions and is presented. Due to limited testing, precise, computer-initiated maneuvers are implemented. These maneuvers are designed to optimize the vehicle motion for determining aerodynamic coefficients. Special sensors and atmospheric measurements are required to provide stability and control flight data during an entire entry. The techniques employed in data reduction are proven programs developed and used at NASA/DFRC.

  17. Impact of hydrogeological data on measures of uncertainty, site characterization and environmental performance metrics

    NASA Astrophysics Data System (ADS)

    de Barros, Felipe P. J.; Ezzedine, Souheil; Rubin, Yoram

    2012-02-01

    The significance of conditioning predictions of environmental performance metrics (EPMs) on hydrogeological data in heterogeneous porous media is addressed. Conditioning EPMs on available data reduces uncertainty and increases the reliability of model predictions. We present a rational and concise approach to investigate the impact of conditioning EPMs on data as a function of the location of the environmentally sensitive target receptor, data types and spacing between measurements. We illustrate how the concept of comparative information yield curves introduced in de Barros et al. [de Barros FPJ, Rubin Y, Maxwell R. The concept of comparative information yield curves and its application to risk-based site characterization. Water Resour Res 2009;45:W06401. doi:10.1029/2008WR007324] could be used to assess site characterization needs as a function of flow and transport dimensionality and EPMs. For a given EPM, we show how alternative uncertainty reduction metrics yield distinct gains of information from a variety of sampling schemes. Our results show that uncertainty reduction is EPM dependent (e.g., travel times) and does not necessarily indicate uncertainty reduction in an alternative EPM (e.g., human health risk). The results show how the position of the environmental target, flow dimensionality and the choice of the uncertainty reduction metric can be used to assist in field sampling campaigns.

  18. Impulsivity modulates performance under response uncertainty in a reaching task.

    PubMed

    Tzagarakis, C; Pellizzer, G; Rogers, R D

    2013-03-01

    We sought to explore the interaction of the impulsivity trait with response uncertainty. To this end, we used a reaching task (Pellizzer and Hedges in Exp Brain Res 150:276-289, 2003) where a motor response direction was cued at different levels of uncertainty (1 cue, i.e., no uncertainty, 2 cues or 3 cues). Data from 95 healthy adults (54 F, 41 M) were analysed. Impulsivity was measured using the Barratt Impulsiveness Scale version 11 (BIS-11). Behavioral variables recorded were reaction time (RT), errors of commission (referred to as 'early errors') and errors of precision. Data analysis employed generalised linear mixed models and generalised additive mixed models. For the early errors, there was an interaction of impulsivity with uncertainty and gender, with increased errors for high impulsivity in the one-cue condition for women and the three-cue condition for men. There was no effect of impulsivity on precision errors or RT. However, the analysis of the effect of RT and impulsivity on precision errors showed a different pattern for high versus low impulsives in the high uncertainty (3 cue) condition. In addition, there was a significant early error speed-accuracy trade-off for women, primarily in low uncertainty and a 'reverse' speed-accuracy trade-off for men in high uncertainty. These results extend those of past studies of impulsivity which help define it as a behavioural trait that modulates speed versus accuracy response styles depending on environmental constraints and highlight once more the importance of gender in the interplay of personality and behaviour.

  19. Effects of uncertain topographic input data on two-dimensional flow modeling in a gravel-bed river

    USGS Publications Warehouse

    Legleiter, C.J.; Kyriakidis, P.C.; McDonald, R.R.; Nelson, J.M.

    2011-01-01

    Many applications in river research and management rely upon two-dimensional (2D) numerical models to characterize flow fields, assess habitat conditions, and evaluate channel stability. Predictions from such models are potentially highly uncertain due to the uncertainty associated with the topographic data provided as input. This study used a spatial stochastic simulation strategy to examine the effects of topographic uncertainty on flow modeling. Many, equally likely bed elevation realizations for a simple meander bend were generated and propagated through a typical 2D model to produce distributions of water-surface elevation, depth, velocity, and boundary shear stress at each node of the model's computational grid. Ensemble summary statistics were used to characterize the uncertainty associated with these predictions and to examine the spatial structure of this uncertainty in relation to channel morphology. Simulations conditioned to different data configurations indicated that model predictions became increasingly uncertain as the spacing between surveyed cross sections increased. Model sensitivity to topographic uncertainty was greater for base flow conditions than for a higher, subbankfull flow (75% of bankfull discharge). The degree of sensitivity also varied spatially throughout the bend, with the greatest uncertainty occurring over the point bar where the flow field was influenced by topographic steering effects. Uncertain topography can therefore introduce significant uncertainty to analyses of habitat suitability and bed mobility based on flow model output. In the presence of such uncertainty, the results of these studies are most appropriately represented in probabilistic terms using distributions of model predictions derived from a series of topographic realizations. Copyright 2011 by the American Geophysical Union.

  20. Assessing uncertainties in surface water security: An empirical multimodel approach

    NASA Astrophysics Data System (ADS)

    Rodrigues, Dulce B. B.; Gupta, Hoshin V.; Mendiondo, Eduardo M.; Oliveira, Paulo Tarso S.

    2015-11-01

    Various uncertainties are involved in the representation of processes that characterize interactions among societal needs, ecosystem functioning, and hydrological conditions. Here we develop an empirical uncertainty assessment of water security indicators that characterize scarcity and vulnerability, based on a multimodel and resampling framework. We consider several uncertainty sources including those related to (i) observed streamflow data; (ii) hydrological model structure; (iii) residual analysis; (iv) the method for defining Environmental Flow Requirement; (v) the definition of critical conditions for water provision; and (vi) the critical demand imposed by human activities. We estimate the overall hydrological model uncertainty by means of a residual bootstrap resampling approach, and by uncertainty propagation through different methodological arrangements applied to a 291 km2 agricultural basin within the Cantareira water supply system in Brazil. Together, the two-component hydrograph residual analysis and the block bootstrap resampling approach result in a more accurate and precise estimate of the uncertainty (95% confidence intervals) in the simulated time series. We then compare the uncertainty estimates associated with water security indicators using a multimodel framework and the uncertainty estimates provided by each model uncertainty estimation approach. The range of values obtained for the water security indicators suggests that the models/methods are robust and perform well in a range of plausible situations. The method is general and can be easily extended, thereby forming the basis for meaningful support to end-users facing water resource challenges by enabling them to incorporate a viable uncertainty analysis into a robust decision-making process.
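The residual block-bootstrap step can be sketched as follows. The streamflow series, block length, and model-error structure are all synthetic, and the study's two-component hydrograph residual analysis is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observed" and "simulated" daily streamflow (m^3/s); in real use
# these come from gauge data and a calibrated hydrological model.
t = np.arange(365)
obs = 5 + 3 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 0.5, t.size)
sim = obs + 0.3 + np.convolve(rng.normal(0, 0.4, t.size), np.ones(5) / 5, "same")

resid = obs - sim
block = 10  # block length chosen to preserve short-range autocorrelation

def block_bootstrap(resid, n, block, rng):
    """Resample residuals in contiguous blocks, then concatenate to length n."""
    starts = rng.integers(0, resid.size - block, size=n // block + 1)
    out = np.concatenate([resid[s:s + block] for s in starts])
    return out[:n]

# Build 95% uncertainty bounds around the simulated time series.
reps = np.array([sim + block_bootstrap(resid, sim.size, block, rng)
                 for _ in range(500)])
lower, upper = np.percentile(reps, [2.5, 97.5], axis=0)
coverage = np.mean((obs >= lower) & (obs <= upper))
print(f"fraction of observations inside the 95% band: {coverage:.2f}")
```

Resampling blocks rather than individual residuals keeps the day-to-day persistence of model error, which is why a block bootstrap gives more realistic bands for autocorrelated hydrograph residuals than an i.i.d. bootstrap.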

  1. Emulation for probabilistic weather forecasting

    NASA Astrophysics Data System (ADS)

    Cornford, Dan; Barillec, Remi

    2010-05-01

    Numerical weather prediction models are typically very expensive to run due to their complexity and resolution. Characterising the sensitivity of the model to its initial condition and/or to its parameters requires numerous runs of the model, which is impractical for all but the simplest models. To produce probabilistic forecasts requires knowledge of the distribution of the model outputs, given the distribution over the inputs, where the inputs include the initial conditions, boundary conditions and model parameters. Such uncertainty analysis for complex weather prediction models seems a long way off, given current computing power, with ensembles providing only a partial answer. One possible way forward that we develop in this work is the use of statistical emulators. Emulators provide an efficient statistical approximation to the model (or simulator) while quantifying the uncertainty introduced. In the emulator framework, a Gaussian process is fitted to the simulator response as a function of the simulator inputs using some training data. The emulator is essentially an interpolator of the simulator output and the response in unobserved areas is dictated by the choice of covariance structure and parameters in the Gaussian process. Suitable parameters are inferred from the data in a maximum likelihood, or Bayesian framework. Once trained, the emulator allows operations such as sensitivity analysis or uncertainty analysis to be performed at a much lower computational cost. The efficiency of emulators can be further improved by exploiting the redundancy in the simulator output through appropriate dimension reduction techniques. We demonstrate this using both Principal Component Analysis on the model output and a new reduced-rank emulator in which an optimal linear projection operator is estimated jointly with other parameters, in the context of simple low order models, such as the Lorenz 40D system. 
We present the application of emulators to probabilistic weather forecasting, where the construction of the emulator training set replaces the traditional ensemble model runs. Thus the actual forecast distributions are computed using the emulator conditioned on the 'ensemble runs', which are chosen to explore the plausible input space using relatively crude experimental design methods. One benefit here is that the ensemble does not need to be a sample from the true distribution of the input space; rather it should cover that input space in some sense. The probabilistic forecasts are computed using Monte Carlo methods, sampling from the input distribution and using the emulator to produce the output distribution. Finally we discuss the limitations of this approach and briefly mention how we might use similar methods to learn the model error within a framework that incorporates a data assimilation like aspect, using emulators and learning complex model error representations. We suggest future directions for research in the area that will be necessary to apply the method to more realistic numerical weather prediction models.
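The core emulator construction — fit a Gaussian process to a handful of simulator runs, then do uncertainty analysis at emulator cost — can be sketched in a few lines. The 1-D "simulator", kernel length-scale, and input distribution are toy assumptions; emulating a real NWP model additionally needs the dimension-reduction machinery the abstract describes:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulator(x):
    """Stand-in for an expensive model run (in practice, e.g. an NWP model)."""
    return np.sin(3 * x) + 0.1 * x**2

# Training design: a handful of simulator runs covering the input space.
X = np.linspace(-2, 2, 8)
y = simulator(X)

def sqexp(a, b, ell=0.6):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

# GP emulator: posterior mean/variance conditioned on the training runs.
K = sqexp(X, X) + 1e-8 * np.eye(X.size)   # jitter for numerical stability
Xs = np.linspace(-2, 2, 101)
Ks = sqexp(Xs, X)
alpha = np.linalg.solve(K, y)
mean = Ks @ alpha                          # interpolates the simulator output
var = 1.0 - np.einsum("ij,ij->i", Ks, np.linalg.solve(K, Ks.T).T)

# Uncertainty analysis at emulator cost: Monte Carlo over an input distribution.
xs_mc = rng.normal(0.0, 0.8, 10000)
mc_mean = (sqexp(xs_mc, X) @ alpha).mean()
print(f"MC output mean: {mc_mean:.3f}; "
      f"max |emulator - simulator|: {np.abs(mean - simulator(Xs)).max():.3f}")
```

The 10,000 Monte Carlo evaluations cost only matrix products against the 8 training runs, which is the efficiency gain the abstract argues for; the posterior variance additionally quantifies the error introduced by the approximation.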

  2. Adaptive guidance for an aero-assisted boost vehicle

    NASA Astrophysics Data System (ADS)

    Pamadi, Bandu N.; Taylor, Lawrence W., Jr.; Price, Douglas B.

    An adaptive guidance system incorporating a dynamic pressure constraint is studied for a single-stage to low Earth orbit (LEO) aero-assisted booster with thrust gimbal angle as the control variable. To derive an adaptive guidance law, cubic spline functions are used to represent the ascent profile. The booster flight to LEO is divided into initial and terminal phases. In the initial phase, the ascent profile is continuously updated to maximize the performance of the boost vehicle en route. A linear feedback control is used in the terminal phase to guide the aero-assisted booster onto the desired LEO. The computer simulation of the vehicle dynamics considers a rotating spherical Earth, an inverse-square (Newtonian) gravity field and an exponential model for the Earth's atmospheric density. This adaptive guidance algorithm is capable of handling large deviations in both atmospheric conditions and modeling uncertainties, while ensuring maximum booster performance.

  3. Qualification of CASMO5 / SIMULATE-3K against the SPERT-III E-core cold start-up experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grandi, G.; Moberg, L.

    SIMULATE-3K is a three-dimensional kinetic code applicable to LWR Reactivity Initiated Accidents. S3K has been used to calculate several internationally recognized benchmarks. However, the feedback models in the benchmark exercises are different from the feedback models that SIMULATE-3K uses for LWR reactors. For this reason, it is worth comparing the SIMULATE-3K capabilities for Reactivity Initiated Accidents against kinetic experiments. The Special Power Excursion Reactor Test III (SPERT III) was a pressurized-water, nuclear-research facility constructed to analyze reactor kinetic behavior under initial conditions similar to those of commercial LWRs. The SPERT III E-core resembles a PWR in terms of fuel type, moderator, coolant flow rate, and system pressure. The initial test conditions (power, core flow, system pressure, core inlet temperature) are representative of cold start-up, hot start-up, hot standby, and hot full power. The qualification of S3K against the SPERT III E-core measurements is ongoing work at Studsvik. In this paper, the results for the 30 cold start-up tests are presented. The results show good agreement with the experiments for the main reactivity-initiated-accident parameters: peak power, energy release and compensated reactivity. Predicted and measured peak powers differ by at most 13%. Measured and predicted reactivity compensations at the time of the peak power differ by less than 0.01 $. Predicted and measured energy releases differ by at most 13%. All differences are within the experimental uncertainty. (authors)

  4. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    NASA Astrophysics Data System (ADS)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, an uncertainty propagation method must be adopted that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, it generates consistent earthquake slip samples by means of a Karhunen-Loève (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We extend that methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: tsunamis generated at the site of the 2014 Chilean earthquake, with earthquake samples of expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates earthquake samples consistent with the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations.
We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for the 2014 Chilean earthquake. Results show that leading wave measurements fall within the tsunami sample space. At later times, however, there are mismatches between measured data and the simulated results, suggesting that other sources of uncertainty are as relevant as the uncertainty of the studied earthquake characteristics.
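The K-L-plus-translation construction can be sketched numerically. The sketch below is hypothetical: a one-dimensional fault, an exponential correlation model and lognormal marginal parameters are all assumed for illustration, and the K-L truncation slightly reduces the pointwise variance, which is the kind of accuracy criterion the study analyses.

```python
import numpy as np

# Hypothetical sketch of correlated slip sampling: truncated
# Karhunen-Loeve expansion of a Gaussian field, then a translation
# (here to lognormal marginals) without any post-sampling rescaling.

rng = np.random.default_rng(0)
x = np.linspace(0.0, 100.0, 50)              # fault coordinate, km (assumed)
corr_len = 20.0                              # correlation length (assumed)

# Covariance of the underlying Gaussian field and its K-L modes
C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigval, eigvec = np.linalg.eigh(C)
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

n_modes = 10                                 # truncation order
phi = eigvec[:, :n_modes] * np.sqrt(eigval[:n_modes])

def sample_slip(n_samples, mu_log=0.0, sigma_log=0.5):
    """Slip samples whose marginals are (approximately, because of the
    truncation) lognormal, via translation of the Gaussian K-L field."""
    xi = rng.standard_normal((n_samples, n_modes))   # K-L coefficients
    g = xi @ phi.T                                   # correlated Gaussian field
    return np.exp(mu_log + sigma_log * g)

slips = sample_slip(500)                     # 500 consistent slip samples
```

Each sample is a spatially correlated, everywhere-positive slip field; tsunami initial conditions would then be computed from each field in turn.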

  5. Influence of model reduction on uncertainty of flood inundation predictions

    NASA Astrophysics Data System (ADS)

    Romanowicz, R. J.; Kiczko, A.; Osuch, M.

    2012-04-01

    Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions, and the estimates of model parameters, which are usually identified via the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that the roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure; however, the price we pay is reduced model robustness. Apart from this relatively simple question regarding reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that these uncertainties have a substantial influence on flood risk assessment.
In the paper we present a simplified methodology allowing the influence of that uncertainty to be assessed. This work was supported by National Science Centre of Poland (grant 2011/01/B/ST10/06866).
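The pseudo-Bayesian idea can be illustrated with a GLUE-style sketch. Everything below is a stand-in: the "model", the roughness-like parameter and its prior range, and the likelihood scale are invented for illustration and are not taken from the HEC-RAS study.

```python
import numpy as np

# GLUE-style pseudo-Bayesian sketch. In the study the forward model is
# 1-D HEC-RAS and the parameters are channel roughness values; here a
# simple rating-type relation stands in for both.

rng = np.random.default_rng(11)

def model(theta, x):
    """Stand-in relation: stage as a function of a roughness-like
    parameter theta and a discharge-like input x."""
    return 10.0 / np.sqrt(theta) * x ** 0.6

x_obs = np.linspace(0.5, 2.0, 8)
obs = model(0.035, x_obs) + rng.normal(0.0, 0.5, x_obs.size)   # noisy data

thetas = rng.uniform(0.02, 0.06, 2000)            # samples from the prior
sse = np.array([np.sum((model(t, x_obs) - obs) ** 2) for t in thetas])
weights = np.exp(-0.5 * sse / 0.5**2)             # informal likelihood
weights /= weights.sum()

pred = model(thetas, 1.5)                         # prediction at a new input
order = np.argsort(pred)
cdf = np.cumsum(weights[order])
p05, p95 = pred[order][np.searchsorted(cdf, [0.05, 0.95])]  # 90% limits
```

The weighted prediction quantiles `p05`/`p95` are the kind of uncertainty band that would be drawn around a simulated inundation extent.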

  6. Experimental joint quantum measurements with minimum uncertainty.

    PubMed

    Ringbauer, Martin; Biggerstaff, Devon N; Broome, Matthew A; Fedrizzi, Alessandro; Branciard, Cyril; White, Andrew G

    2014-01-17

    Quantum physics constrains the accuracy of joint measurements of incompatible observables. Here we test tight measurement-uncertainty relations using single photons. We implement two independent, idealized uncertainty-estimation methods, the three-state method and the weak-measurement method, and adapt them to realistic experimental conditions. Exceptional quantum state fidelities of up to 0.999 98(6) allow us to verge upon the fundamental limits of measurement uncertainty.

  7. Estimation Of TMDLs And Margin Of Safety Under Conditions Of Uncertainty

    EPA Science Inventory

    In TMDL development, an adequate margin of safety (MOS) is required in the calculation process to provide a cushion needed because of uncertainties in the data and analysis. Current practices, however, rarely factor analysis' uncertainty in TMDL development and the MOS is largel...

  8. The Key Role of Experiential Uncertainty when Dealing with Risks: Its Relationships with Demand for Regulation and Institutional Trust.

    PubMed

    Poortvliet, P Marijn; Lokhorst, Anne Marike

    2016-08-01

    The results of a survey and an experiment show that experiential uncertainty (people's experience of uncertainty in risk contexts) plays a moderating role in individuals' risk-related demand for government regulation and trust in risk-managing government institutions. First, descriptions of risks were presented to respondents in a survey (N = 1,017) and their reactions to questions about experiential uncertainty, risk perception, and demand for government regulation were measured, as well as levels of risk-specific knowledge. When experiential uncertainty was high, risk perceptions had a positive relationship with demand for government regulation of risk; no such relationship showed under low experiential uncertainty. Conversely, when people experienced little experiential uncertainty, having more knowledge about the risk topic involved was associated with a weaker demand for government regulation of risk. For people experiencing uncertainty, this relationship between knowledge and demand for regulation did not emerge. Second, in an experiment (N = 120), experiential uncertainty and openness in risk communication were manipulated to investigate effects on trust. In the uncertainty condition, open versus nonopen government communication about Q-fever, a zoonosis, led to higher levels of trust in the government agency, but not in the control condition. Altogether, this research suggests that only when people experience relatively little uncertainty about the risk may knowledge provision preclude them from demanding government action. Also, only when persons experience uncertainty are stronger risk perceptions associated with a demand for government regulation, and only then are they affected by openness of risk communication in forming institutional trust. © 2016 Society for Risk Analysis.

  9. Accuracy of neutron self-activation method with iodine-containing scintillators for quantifying 128I generation using decay-fitting technique

    NASA Astrophysics Data System (ADS)

    Nohtomi, Akihiro; Wakabayashi, Genichiro

    2015-11-01

    We evaluated the accuracy of a self-activation method with iodine-containing scintillators in quantifying 128I generation in an activation detector; the self-activation method was recently proposed for photo-neutron on-line measurements around X-ray radiotherapy machines. Here, we consider the accuracy of determining the initial count rate R0, observed just after termination of neutron irradiation of the activation detector. The value R0 is directly related to the amount of activity generated by incident neutrons; the detection efficiency of radiation emitted from the activity should be taken into account for such an evaluation. Decay curves of 128I activity were numerically simulated by a computer program for various conditions including different initial count rates (R0) and background rates (RB), as well as counting statistical fluctuations. The data points sampled at minute intervals and integrated over the same period were fit by a non-linear least-squares fitting routine to obtain the value R0 as a fitting parameter with an associated uncertainty. The corresponding background rate RB was simultaneously calculated in the same fitting routine. Identical data sets were also evaluated by a well-known integration algorithm used for conventional activation methods and the results were compared with those of the proposed fitting method. When we fixed RB = 500 cpm, the relative uncertainty σR0 /R0 ≤ 0.02 was achieved for R0/RB ≥ 20 with 20 data points from 1 min to 20 min following the termination of neutron irradiation used in the fitting; σR0 /R0 ≤ 0.01 was achieved for R0/RB ≥ 50 with the same data points. Reasonable relative uncertainties to evaluate initial count rates were reached by the decay-fitting method using practically realistic sampling numbers. These results clarified the theoretical limits of the fitting method. 
The integration method was found to be potentially vulnerable to short-term variations in background levels, especially instantaneous contaminations by spike-like noise. The fitting method easily detects and removes such spike-like noise.
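The fitting procedure can be sketched as follows. This is an illustrative reconstruction, not the authors' program: the count rates and sampling mirror the setup described above (1-min points out to 20 min, R0/RB = 20), but the Poisson noise model is a simplification.

```python
import numpy as np
from scipy.optimize import curve_fit

# Decay-fitting sketch: recover the initial count rate R0 and the
# background RB from a simulated 128I decay curve by non-linear least
# squares, as in the proposed self-activation method.

HALF_LIFE_MIN = 24.99                        # 128I half-life in minutes
LAM = np.log(2.0) / HALF_LIFE_MIN

def decay_model(t, r0, rb):
    """Count rate: decaying 128I component plus constant background."""
    return r0 * np.exp(-LAM * t) + rb

rng = np.random.default_rng(1)
t = np.arange(1.0, 21.0)                     # minutes after irradiation ends
true_r0, true_rb = 10000.0, 500.0            # cpm; R0/RB = 20 as in the paper
counts = rng.poisson(decay_model(t, true_r0, true_rb)).astype(float)

popt, pcov = curve_fit(decay_model, t, counts, p0=[5000.0, 100.0])
r0_fit, rb_fit = popt
r0_sigma = np.sqrt(pcov[0, 0])               # fitted 1-sigma uncertainty on R0
```

Because RB is a free parameter of the fit rather than a pre-subtracted constant, a spike-like background excursion shows up as an outlier residual and can be flagged, which is the robustness advantage over the integration method.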

  10. Quantifying the Effects of Spatial Uncertainty in Fracture Permeability on CO2 Leakage through Columbia River Basalt Flow Interiors

    NASA Astrophysics Data System (ADS)

    Gierzynski, A.; Pollyea, R.

    2016-12-01

    Recent studies suggest that continental flood basalts may be suitable for geologic carbon sequestration, due to fluid-rock reactions that mineralize injected CO2 on relatively short time-scales. Flood basalts also possess a morphological structure conducive to injection, with alternating high-permeability (flow margin) and low-permeability (flow interior) layers. However, little information exists on the behavior of CO2 migration within field-scale fracture networks, particularly within flow interiors and at conditions near the critical point for CO2. In this study, numerical simulation is used to investigate the influence of fracture permeability uncertainty during gravity-driven CO2 migration within a jointed basalt flow interior as CO2 undergoes phase change from supercritical fluid to a subcritical phase. The model domain comprises a 2D fracture network mapped with terrestrial LiDAR scans of Columbia River Basalt acquired near Starbuck, WA. The model domain is 5 m × 5 m with bimodal heterogeneity (fracture and matrix), and initial conditions corresponding to a hydrostatic pressure gradient between 750 and 755 m depth. Under these conditions, the critical point for CO2 occurs 1.5 m above the bottom of the domain. For this model scenario, CO2 enters the base of the fracture network at 0.5 MPa overpressure, and matrix permeability is assumed constant. Fracture permeability follows a lognormal distribution on the basis of fracture aperture values from literature. In order to account for spatial uncertainty, the lognormal fracture permeability distribution is randomly located in the model domain and CO2 migration is simulated within the same fracture network for 50 equally probable realizations. Model results suggest that fracture connectivity, which is independent of permeability distribution, governs the path taken by buoyant CO2 as it rises through the flow interior; however, the permeability distribution strongly governs the CO2 flux magnitude. 
In particular, this research shows that even where fracture networks are sufficiently connected, CO2 flux is often inhibited by a cell of lower permeability, analogous to an obstruction or asperity in a natural fracture. This underscores the importance of considering spatial uncertainty in fracture apertures when modeling CO2 leakage through a caprock.
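A toy numerical sketch of why a single low-permeability cell throttles the flux: for flow in series along a connected path, the effective permeability is the harmonic mean of the cell values, which is dominated by the smallest ones. The distribution parameters below are assumed, and a 1-D path stands in for the mapped 2-D fracture network.

```python
import numpy as np

# The same connected fracture path receives 50 random lognormal
# permeability realizations, mimicking the study's equally probable
# spatial arrangements; the harmonic mean gives the series-flow
# effective permeability of each realization.

rng = np.random.default_rng(7)
n_cells, n_real = 100, 50
log_mu, log_sigma = -12.0, 1.0               # ln-permeability parameters

k_eff = np.empty(n_real)
for i in range(n_real):
    k = rng.lognormal(log_mu, log_sigma, n_cells)   # one realization
    k_eff[i] = n_cells / np.sum(1.0 / k)            # harmonic mean (series)

k_arith = np.exp(log_mu + 0.5 * log_sigma**2)       # arithmetic mean of k
# k_eff falls well below k_arith: the low-k cells govern the flux.
```

Connectivity fixes which cells lie on the path; the permeability draw fixes how strongly the weakest of them limits the CO2 flux, echoing the result above.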

  11. Bitwise efficiency in chaotic models

    PubMed Central

    Düben, Peter; Palmer, Tim

    2017-01-01

    Motivated by the increasing energy consumption of supercomputing for weather and climate simulations, we introduce a framework for investigating the bit-level information efficiency of chaotic models. In comparison with previous explorations of inexactness in climate modelling, the proposed and tested information metric has three specific advantages: (i) it requires only a single high-precision time series; (ii) information does not grow indefinitely for decreasing time step; and (iii) information is more sensitive to the dynamics and uncertainties of the model rather than to the implementation details. We demonstrate the notion of bit-level information efficiency in two of Edward Lorenz’s prototypical chaotic models: Lorenz 1963 (L63) and Lorenz 1996 (L96). Although L63 is typically integrated in 64-bit ‘double’ floating point precision, we show that only 16 bits have significant information content, given an initial condition uncertainty of approximately 1% of the size of the attractor. This result is sensitive to the size of the uncertainty but not to the time step of the model. We then apply the metric to the L96 model and find that a 16-bit scaled integer model would suffice given the uncertainty of the unresolved sub-grid-scale dynamics. We then show that, by dedicating computational resources to spatial resolution rather than numeric precision in a field programmable gate array (FPGA), we see up to 28.6% improvement in forecast accuracy, an approximately fivefold reduction in the number of logical computing elements required and an approximately 10-fold reduction in energy consumed by the FPGA, for the L96 model. PMID:28989303

  12. Bitwise efficiency in chaotic models

    NASA Astrophysics Data System (ADS)

    Jeffress, Stephen; Düben, Peter; Palmer, Tim

    2017-09-01

    Motivated by the increasing energy consumption of supercomputing for weather and climate simulations, we introduce a framework for investigating the bit-level information efficiency of chaotic models. In comparison with previous explorations of inexactness in climate modelling, the proposed and tested information metric has three specific advantages: (i) it requires only a single high-precision time series; (ii) information does not grow indefinitely for decreasing time step; and (iii) information is more sensitive to the dynamics and uncertainties of the model rather than to the implementation details. We demonstrate the notion of bit-level information efficiency in two of Edward Lorenz's prototypical chaotic models: Lorenz 1963 (L63) and Lorenz 1996 (L96). Although L63 is typically integrated in 64-bit `double' floating point precision, we show that only 16 bits have significant information content, given an initial condition uncertainty of approximately 1% of the size of the attractor. This result is sensitive to the size of the uncertainty but not to the time step of the model. We then apply the metric to the L96 model and find that a 16-bit scaled integer model would suffice given the uncertainty of the unresolved sub-grid-scale dynamics. We then show that, by dedicating computational resources to spatial resolution rather than numeric precision in a field programmable gate array (FPGA), we see up to 28.6% improvement in forecast accuracy, an approximately fivefold reduction in the number of logical computing elements required and an approximately 10-fold reduction in energy consumed by the FPGA, for the L96 model.
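A hedged sketch of the precision experiment: forward Euler and IEEE half precision stand in for the paper's model integration and 16-bit scaled integers. The rounding difference between the two precisions behaves like an initial-condition perturbation, growing at the chaotic error-growth rate until it saturates at the attractor scale.

```python
import numpy as np

# Integrate Lorenz 1963 in 64-bit and in 16-bit floating point and
# compare the trajectories; parameters are the standard L63 values.

def l63_step(state, dt, dtype):
    """One forward-Euler step of Lorenz 1963, rounded to `dtype`."""
    sigma, rho, beta = dtype(10.0), dtype(28.0), dtype(8.0 / 3.0)
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz], dtype=dtype)

def run(dtype, n_steps=2000, dt=0.01):
    state = np.array([1.0, 1.0, 1.0], dtype=dtype)
    traj = np.empty((n_steps, 3))
    for i in range(n_steps):
        state = l63_step(state, dtype(dt), dtype)
        traj[i] = state
    return traj

traj64 = run(np.float64)
traj16 = run(np.float16)
err = np.abs(traj64 - traj16).max(axis=1)    # small at first, then saturates
```

Once the initial-condition uncertainty exceeds the rounding error, the extra bits of double precision carry no additional forecast information, which is the intuition behind the 16-bit result in the abstract.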

  13. Probabilistic fatigue life prediction of metallic and composite materials

    NASA Astrophysics Data System (ADS)

    Xiang, Yibing

    Fatigue is one of the most common failure modes for engineering structures such as aircraft, rotorcraft and aviation transports. Both metallic and composite materials are widely used and affected by fatigue damage. Large uncertainties arise from material properties, measurement noise, imperfect models, future anticipated loads and environmental conditions. These uncertainties are critical issues for accurate remaining useful life (RUL) prediction for engineering structures in service. Probabilistic fatigue prognosis considering various uncertainties is of great importance for structural safety. The objective of this study is to develop probabilistic fatigue life prediction models for metallic and composite materials. A fatigue model based on crack growth analysis and the equivalent initial flaw size concept is proposed for metallic materials. The developed model is then extended to include structural geometry effects (notch effect), environmental effects (corroded specimens) and manufacturing effects (shot peening). Due to inhomogeneity and anisotropy, the fatigue model suitable for metallic materials cannot be directly applied to composite materials. A composite fatigue life prediction model is proposed based on a mixed-mode delamination growth model and a stiffness degradation law. After the development of deterministic fatigue models for metallic and composite materials, a general probabilistic life prediction methodology is developed. The proposed methodology employs an efficient Inverse First-Order Reliability Method (IFORM) for the uncertainty propagation in fatigue life prediction. An equivalent stress transformation has been developed to enhance the computational efficiency under realistic random amplitude loading. A systematic reliability-based maintenance optimization framework is proposed for fatigue risk management and mitigation of engineering structures.

  14. The Development and Assessment of Adaptation Pathways for Urban Pluvial Flooding

    NASA Astrophysics Data System (ADS)

    Babovic, F.; Mijic, A.; Madani, K.

    2017-12-01

    Around the globe, urban areas are growing in both size and importance. However, due to the prevalence of impermeable surfaces within the urban fabric, these areas have a high risk of pluvial flooding, and the convergence of population growth and climate change is increasing that risk. When designing solutions and adaptations to pluvial flood risk, urban planners and engineers encounter a great deal of uncertainty arising from model uncertainty, uncertainty within the data utilised, and uncertainty related to future climate and land use conditions. The interaction of these uncertainties leads to conditions of deep uncertainty, yet infrastructure systems must be designed and built in the face of it. An Adaptation Tipping Points (ATP) methodology was used to develop a strategy for adapting an urban drainage system in the North East of London under conditions of deep uncertainty. The ATP approach was used to assess the current drainage system and potential drainage system adaptations. These adaptations were assessed against potential changes in rainfall depth and peakedness, defined as the ratio of mean to peak rainfall. The solutions encompassed both traditional and blue-green solutions that the Local Authority is known to be considering. This resulted in a set of Adaptation Pathways. However, these pathways do not convey any information regarding the relative merits and demerits of the potential adaptation options presented. To address this, a cost-benefit metric was developed to reflect the solutions' costs and benefits under uncertainty. The resulting metric combines elements of the Benefits of SuDS Tool (BeST) with real options analysis in order to reflect the potential value of ecosystem services delivered by blue-green solutions under uncertainty.
Lastly, it is discussed how a local body can utilise the adaptation pathways; their relative costs and benefits; and a system of local data collection to help guide better decision making with respect to urban flood adaptation.

  15. Evaluation of the Uncertainty in JP-7 Kinetics Models Applied to Scramjets

    NASA Technical Reports Server (NTRS)

    Norris, A. T.

    2017-01-01

    One of the challenges of designing and flying a scramjet-powered vehicle is the difficulty of preflight testing. Ground tests at realistic flight conditions introduce several sources of uncertainty to the flow that must be addressed. For example, the scales of the available facilities limit the size of vehicles that can be tested, and so performance metrics for larger flight vehicles must be extrapolated from ground tests at smaller scales. To create the correct flow enthalpy for higher Mach number flows, most tunnels use a heater that introduces vitiates into the flow. At these conditions, the effects of the vitiates on the combustion process are of particular interest to the engine designer, where the ground test results must be extrapolated to flight conditions. In this paper, the uncertainty of the cracked JP-7 chemical kinetics used in the modeling of a hydrocarbon-fueled scramjet was investigated. The factors identified as contributing to uncertainty in the combustion process were the level of flow vitiation, the uncertainty of the kinetic model coefficients and the variation of flow properties between ground testing and flight. The method employed was to run simulations of small, unit problems and identify which variables were the principal sources of uncertainty for the mixture temperature. Then, using this resulting subset of all the variables, the effects of the uncertainty caused by the chemical kinetics on a representative scramjet flowpath for both vitiated (ground) and nonvitiated (flight) flows were investigated. The simulations showed that only a few of the kinetic rate equations contribute to the uncertainty in the unit problem results, and when applied to the representative scramjet flowpath, the resulting temperature variability was on the order of 100 K.
Both the vitiated and clean air results showed very similar levels of uncertainty, and the difference between the mean properties were generally within the range of uncertainty predicted.

  16. The Impact of Model Uncertainty on Spatial Compensation in Active Structural Acoustic Control

    NASA Technical Reports Server (NTRS)

    Cabell, Randolph H.; Gibbs, Gary P.; Sprofera, Joseph D.; Clark, Robert L.

    2004-01-01

    Turbulent boundary layer (TBL) noise is considered a primary factor in the interior noise experienced by passengers aboard commercial airliners. There have been numerous investigations of interior noise control devoted to aircraft panels; however, practical realization is a challenge since the physical boundary conditions are uncertain at best. In most prior studies, pinned or clamped boundary conditions have been assumed; however, realistic panels likely display a range of varying boundary conditions between these two limits. Uncertainty in boundary conditions is a challenge for control system designers, both in terms of the compensator implemented and the location of actuators and sensors required to achieve the desired control. The impact of model uncertainties, uncertain boundary conditions in particular, on the selection of actuator and sensor locations for structural acoustic control are considered herein. Results from this research effort indicate that it is possible to optimize the design of actuator and sensor location and aperture, which minimizes the impact of boundary conditions on the desired structural acoustic control.

  17. Assessment of Experimental Uncertainty for a Floating Wind Semisubmersible under Hydrodynamic Loading: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N; Wendt, Fabian F; Jonkman, Jason

    The objective of this paper is to assess the sources of experimental uncertainty in an offshore wind validation campaign focused on better understanding the nonlinear hydrodynamic response behavior of a floating semisubmersible. The test specimen and conditions were simplified compared to other floating wind test campaigns to reduce potential sources of uncertainty and better focus on the hydrodynamic load attributes. Repeat tests were used to understand the repeatability of the test conditions and to assess the level of random uncertainty in the measurements. Attention was also given to understanding bias in all components of the test. The end goal of this work is to set uncertainty bounds on the response metrics of interest, which will be used in future work to evaluate the success of modeling tools in accurately calculating hydrodynamic loads and the associated motion responses of the system.

  18. Evaluation of dose uncertainty in radiation processing using EPR spectroscopy and butylated hydroxytoluene rods as dosimetry system

    NASA Astrophysics Data System (ADS)

    Alkhorayef, M.; Mansour, A.; Sulieman, A.; Alnaaimi, M.; Alduaij, M.; Babikir, E.; Bradley, D. A.

    2017-12-01

    Butylated hydroxytoluene (BHT) rods represent a potential dosimeter for radiation processing, with readout via electron paramagnetic resonance (EPR) spectroscopy. Among the possible sources of uncertainty are those associated with the performance of the dosimetric medium and the conditions under which measurements are made, including sampling and environmental conditions. The present study estimates these uncertainties, investigating the physical response in different resonance regions. BHT, a white crystalline solid with a melting point between 70 and 73 °C, was investigated using 60Co gamma irradiation over the dose range 0.1-100 kGy. The intensity of the EPR signal increases linearly in the range 0.1-35 kGy, the uncertainty budget for high doses being 3.3% at the 2σ confidence level. The rod form represents an excellent alternative dosimeter for high-level dosimetry, with smaller uncertainty than the powder form.

  19. Non-local correlations via Wigner-Yanase skew information in two SC-qubit having mutual interaction under phase decoherence

    NASA Astrophysics Data System (ADS)

    Mohamed, Abdel-Baset A.

    2017-10-01

    An analytical solution of the master equation that describes a superconducting cavity containing two coupled superconducting charge qubits is obtained. Quantum-mechanical correlations based on Wigner-Yanase skew information, namely local quantum uncertainty and uncertainty-induced quantum non-locality, are compared to the concurrence under the effects of phase decoherence. Local quantum uncertainty exhibits sudden changes during its time evolution and revival process. Sudden death and sudden birth occur only for entanglement, depending on the initial state of the two coupled charge qubits, while the skew-information correlations do not vanish. The quantum correlations of skew information are found to be sensitive to the dephasing rate, the photon number in the cavity, the interaction strength between the two qubits, and the qubit distribution angle of the initial state. With a proper initial state, the stationary correlation of the skew information retains a non-zero value for a long time interval under phase decoherence, which may be useful in quantum information and computation processes.

  20. ISMIP6 - initMIP: Greenland ice sheet model initialisation experiments

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Payne, Tony; Larour, Eric; Abe Ouchi, Ayako; Gregory, Jonathan; Lipscomb, William; Seroussi, Helene; Shepherd, Andrew; Edwards, Tamsin

    2016-04-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. This intercomparison exercise (initMIP) aims at comparing, evaluating and improving the initialisation techniques used in the ice sheet modelling community and at estimating the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experiments are conceived for the large-scale Greenland ice sheet and are designed to allow intercomparison between participating models of (1) the initial present-day state of the ice sheet and (2) the response in two schematic forward experiments. The latter experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss first results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet's sea-level contribution.

  1. The dynamic correlation between policy uncertainty and stock market returns in China

    NASA Astrophysics Data System (ADS)

    Yang, Miao; Jiang, Zhi-Qiang

    2016-11-01

    We examine the dynamic correlation between government policy uncertainty and Chinese stock market returns over the period from January 1995 to December 2014. We find that the stock market is significantly correlated with policy uncertainty based on the results of Vector Auto Regression (VAR) and Structural Vector Auto Regression (SVAR) models. In contrast, the results of the Dynamic Conditional Correlation Generalized Multivariate Autoregressive Conditional Heteroscedasticity (DCC-MGARCH) model surprisingly show a low dynamic correlation coefficient between policy uncertainty and market returns, suggesting that the fluctuations of each variable are largely driven by their own values in the preceding period. Our analysis improves the understanding of the dynamic relationship between the stock market and fiscal and monetary policy.
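
    A VAR model of the kind used here relates each variable to lagged values of all variables; a minimal VAR(1) least-squares sketch on synthetic data (the series and coefficients are invented for illustration, not the paper's specification or data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stationary bivariate VAR(1): y_t = A @ y_{t-1} + noise
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.3]])
T = 5000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS estimate of A: regress y_t on y_{t-1}
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
print(np.round(A_hat, 2))  # close to A_true
```

    The structural (SVAR) and DCC-MGARCH extensions add, respectively, contemporaneous restrictions and time-varying conditional correlations on top of this basic lag structure.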

  2. Protein construct storage: Bayesian variable selection and prediction with mixtures.

    PubMed

    Clyde, M A; Parmigiani, G

    1998-07-01

    Determining optimal conditions for protein storage while maintaining a high level of protein activity is an important question in pharmaceutical research. A designed experiment based on a space-filling design was conducted to understand the effects of factors affecting protein storage and to establish optimal storage conditions. Different model-selection strategies to identify important factors may lead to very different answers about optimal conditions. Uncertainty about which factors are important, or model uncertainty, can be a critical issue in decision-making. We use Bayesian variable selection methods for linear models to identify important variables in the protein storage data, while accounting for model uncertainty. We also use the Bayesian framework to build predictions based on a large family of models, rather than an individual model, and to evaluate the probability that certain candidate storage conditions are optimal.
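
    Accounting for model uncertainty by averaging over many candidate models can be sketched with BIC-based approximate posterior weights over all predictor subsets (a simplified stand-in for the paper's full Bayesian mixture approach; the data and predictors below are synthetic):

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Synthetic data: only predictors x0 and x2 truly affect the response
n, p = 200, 4
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

def bic(subset):
    """BIC of an OLS fit using the given predictor columns (plus intercept)."""
    Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = np.sum((y - Xs @ beta) ** 2)
    return n * np.log(rss / n) + Xs.shape[1] * np.log(n)

# Enumerate all 2^p subsets, convert BICs to approximate posterior weights
models = [s for k in range(p + 1) for s in combinations(range(p), k)]
bics = np.array([bic(m) for m in models])
w = np.exp(-0.5 * (bics - bics.min()))
w /= w.sum()

# Posterior inclusion probability of each predictor
incl = np.array([sum(wi for wi, m in zip(w, models) if j in m) for j in range(p)])
print(np.round(incl, 2))  # x0 and x2 should have probabilities near 1
```

    Predictions built as weight-averaged combinations over this model family, rather than from a single selected model, carry the model uncertainty through to the final decision, which is the point of the abstract above.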

  3. VERIFI | Virtual Engine Research Institute and Fuels Initiative

    Science.gov Websites

    The Virtual Engine Research Institute and Fuels Initiative (VERIFI) at Argonne National Laboratory provides a resource for answering complex engine questions and verifying the associated uncertainties.

  4. CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrer, R.; Rhodes, J.; Smith, K.

    2012-07-01

    The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)
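
    Cross-section-induced uncertainty in a response such as the multiplication factor is typically propagated with the first-order "sandwich" rule, u^2 = S C S^T, where S is the vector of sensitivity coefficients and C the cross-section covariance matrix; a minimal numerical sketch with invented numbers (not TSUNAMI-3D output):

```python
import numpy as np

# Illustrative sensitivity coefficients of k-eff to three nuclear data
# parameters (percent change in k-eff per percent change in the parameter)
S = np.array([0.3, -0.1, 0.05])

# Illustrative relative covariance matrix of those parameters (%^2)
C = np.array([[4.0, 1.0, 0.0],
              [1.0, 9.0, 0.5],
              [0.0, 0.5, 1.0]])

# Sandwich rule: variance of k-eff attributable to the data uncertainty
var_k = S @ C @ S
print(f"relative k-eff uncertainty: {np.sqrt(var_k):.3f}%")
```

    Comparing sensitivity vectors of different assemblies against the same covariance data is also how similarity between systems is quantified in this kind of analysis.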

  5. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    NASA Astrophysics Data System (ADS)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and uncertainty within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times, with each run having slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles (hundreds of runs) for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date, within-GCM uncertainty has received little attention in the hydrologic climate change impact literature, and this analysis provides an approximation of the uncertainty in projected runoff and reservoir yield due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. (2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better-performing GCMs from CMIP3 for use in this paper. Here we present within- and between-GCM uncertainty results for mean annual precipitation (MAP), mean annual temperature (MAT), mean annual runoff (MAR), the standard deviation of annual precipitation (SDP), the standard deviation of runoff (SDR) and reservoir yield for five CMIP3 GCMs at 17 worldwide catchments. Based on 100 stochastic replicates of each GCM run at each catchment, within-GCM uncertainty was assessed in relative form as the standard deviation expressed as a percentage of the mean of the 100 replicate values of each variable. The average relative within-GCM uncertainties from the 17 catchments and 5 GCMs for 2015-2044 (A1B) were MAP 4.2%, SDP 14.2%, MAT 0.7%, MAR 10.1% and SDR 17.6%. The Gould-Dincer Gamma (G-DG) procedure was applied to each annual runoff time series for hypothetical reservoir capacities of 1 × MAR and 3 × MAR, and the average uncertainties in reservoir yield due to within-GCM uncertainty from the 17 catchments and 5 GCMs were 25.1% (1 × MAR) and 11.9% (3 × MAR). Our approximation of within-GCM uncertainty is expected to be an underestimate because it does not replicate the GCM trend. However, our results indicate that within-GCM uncertainty is important when interpreting climate change impact assessments. Approximately 95% of values of MAP, SDP, MAT, MAR, SDR and reservoir yield from 1 × MAR or 3 × MAR capacity reservoirs are expected to fall within twice their respective relative uncertainty (standard deviation/mean). Within-GCM uncertainty has significant implications for interpreting climate change impact assessments that report future changes within our range of uncertainty for a given variable - these projected changes may be due solely to within-GCM uncertainty. Since within-GCM variability is amplified from precipitation to runoff and then to reservoir yield, climate change impact assessments that do not take within-GCM uncertainty into account risk providing water resources management decision makers with an unjustified sense of certainty.
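
    The relative within-GCM uncertainty metric used here (standard deviation of replicate values expressed as a percentage of their mean) is straightforward to compute; a sketch with synthetic replicate values (the 100-replicate design follows the paper, but the numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in: 100 stochastic replicates of mean annual runoff (mm)
replicates = rng.normal(loc=400.0, scale=40.0, size=100)

def relative_uncertainty(values):
    """Sample standard deviation as a percentage of the mean."""
    return 100.0 * np.std(values, ddof=1) / np.mean(values)

print(f"within-GCM uncertainty: {relative_uncertainty(replicates):.1f}%")
```

    With the synthetic scale chosen here the result lands near 10%, i.e. roughly the order of the MAR figure reported in the abstract.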

  6. Balancing Certainty and Uncertainty in Clinical Practice

    ERIC Educational Resources Information Center

    Kamhi, Alan G.

    2011-01-01

    Purpose: In this epilogue, I respond to each of the five commentaries, discussing in some depth a central issue raised in each commentary. In the final section, I discuss how my thinking about certainty and uncertainty in clinical practice has evolved since I wrote the initial article. Method: Topics addressed include the similarities/differences…

  7. Aspect of ECMWF downscaled Regional Climate Modeling in simulating Indian summer monsoon rainfall and dependencies on lateral boundary conditions

    NASA Astrophysics Data System (ADS)

    Ghosh, Soumik; Bhatla, R.; Mall, R. K.; Srivastava, Prashant K.; Sahai, A. K.

    2018-03-01

    Climate models face considerable difficulties in simulating the rainfall characteristics of the southwest summer monsoon. In this study, dynamical downscaling of the European Centre for Medium-Range Weather Forecasts' (ECMWF's) ERA-Interim reanalysis (EIN15) has been utilized for the simulation of the Indian summer monsoon (ISM) with the Regional Climate Model version 4.3 (RegCM-4.3) over the South Asia Co-Ordinated Regional Climate Downscaling EXperiment (CORDEX) domain. The complexities of model simulation over a particular terrain are generally influenced by factors such as complex topography, coastal boundaries, and the lack of unbiased initial and lateral boundary conditions. In order to overcome some of these limitations, RegCM-4.3 is employed to simulate the rainfall characteristics over these complex topographical conditions. For reliable rainfall simulation, numerous lower boundary conditions are imposed in RegCM-4.3 at a horizontal grid resolution of 50 km over the South Asia CORDEX domain. The analysis considers 30 years of climatological simulation of rainfall, outgoing longwave radiation (OLR), mean sea level pressure (MSLP), and wind at different vertical levels over the specified region. The dependency of the model simulation on the forcing of EIN15 initial and lateral boundary conditions is used to understand the impact on simulated rainfall characteristics during different phases of the summer monsoon. The results are used to evaluate the role of the initial conditions of zonal wind circulation speed, which increases the uncertainty of regional model output over the region under investigation. Further, the results show that the EIN15 zonal wind circulation lacks sufficient speed over the specified region at particular times, a deficiency that is carried forward into the RegCM output and leads to a disrupted regional simulation in the climate model.

  8. A New Combined Stepwise-Based High-Order Decoupled Direct and Reduced-Form Method To Improve Uncertainty Analysis in PM2.5 Simulations.

    PubMed

    Huang, Zhijiong; Hu, Yongtao; Zheng, Junyu; Yuan, Zibing; Russell, Armistead G; Ou, Jiamin; Zhong, Zhuangmin

    2017-04-04

    The traditional reduced-form model (RFM), based on the high-order decoupled direct method (HDDM), is an efficient uncertainty analysis approach for air quality models, but it has large biases in uncertainty propagation due to the limitation of the HDDM in predicting nonlinear responses to large perturbations of model inputs. To overcome this limitation, a new stepwise-based RFM method that combines several sets of local sensitivity coefficients under different conditions is proposed. Evaluations reveal that the new RFM improves the prediction of nonlinear responses. The new method is applied to quantify uncertainties in simulated PM2.5 concentrations in the Pearl River Delta (PRD) region of China as a case study. Results show that the average uncertainty range of hourly PM2.5 concentrations is -28% to 57%, which covers approximately 70% of the observed PM2.5 concentrations, while the traditional RFM underestimates the upper bound of the uncertainty range by 1-6%. Using a variance-based method, the PM2.5 boundary conditions and primary PM2.5 emissions are found to be the two major uncertainty sources in PM2.5 simulations. The new RFM better quantifies the uncertainty range in model simulations and can be applied to improve applications that rely on uncertainty information.
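
    An HDDM-based reduced-form model approximates the concentration response to an emission perturbation with a local Taylor expansion in the first- and second-order sensitivity coefficients; a minimal sketch of the idea, with made-up base concentration and sensitivities (not values from the paper):

```python
def rfm_response(delta, c0, s1, s2):
    """Second-order HDDM reduced-form estimate of a concentration under a
    fractional input perturbation `delta`: c0 + s1*delta + (1/2)*s2*delta^2."""
    return c0 + s1 * delta + 0.5 * s2 * delta**2

# Illustrative base concentration and first/second-order sensitivities
c0, s1, s2 = 30.0, 12.0, -4.0

# A small perturbation is captured well by a single local expansion
print(f"{rfm_response(0.1, c0, s1, s2):.2f}")  # 31.18
```

    The stepwise method proposed in the abstract addresses the case of large perturbations by chaining several such expansions computed at intermediate base points instead of stretching one expansion across the whole range.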

  9. The Role of Type and Source of Uncertainty on the Processing of Climate Models Projections.

    PubMed

    Benjamin, Daniel M; Budescu, David V

    2018-01-01

    Scientists agree that the climate is changing due to human activities, but there is less agreement about the specific consequences and their timeline. Disagreement among climate projections is attributable to the complexity of climate models that differ in their structure, parameters, initial conditions, etc. We examine how different sources of uncertainty affect people's interpretation of, and reaction to, information about climate change by presenting participants with forecasts from multiple experts. Participants viewed three types of sets of sea-level rise projections: (1) precise, but conflicting; (2) imprecise, but agreeing; and (3) hybrid sets that were both conflicting and imprecise. They estimated the most likely sea-level rise, provided a range of possible values, and rated the sets on several features - ambiguity, credibility, completeness, etc. In Study 1, everyone saw the same hybrid set. We found that participants were sensitive to uncertainty between sources, but not to uncertainty about which model was used. The impacts of conflict and imprecision were combined for estimation tasks and compromised for feature ratings. Estimates were closer to the experts' original projections, and sets were rated more favorably, under imprecision. Estimates were least consistent with (narrower than) the experts in the hybrid condition, but participants rated the conflicting set least favorably. In Study 2, we investigated the hybrid case in more detail by creating several distinct interval sets that combine conflict and imprecision. Two factors drive perceptual differences: overlap - the structure of the forecast set (whether intersecting, nested, tangent, or disjoint) - and asymmetry - the balance of the set. Estimates were primarily driven by asymmetry, and preferences were primarily driven by overlap. Asymmetric sets were least consistent with the experts: estimated ranges were narrower, and estimates of the most likely value were shifted further below the set mean. Intersecting and nested sets were rated similarly to imprecise sets, and disjoint and tangent sets were rated like conflicting ones. Our goal was to determine which underlying factors of information sets drive perceptions of uncertainty in consistent, predictable ways. The two studies lead us to conclude that perceptions of agreement require intersection and balance, and that overly precise forecasts lead to greater perceptions of disagreement and a greater likelihood of the public discrediting and misinterpreting information.

  11. The structure of particle-laden jets and nonevaporating sprays

    NASA Technical Reports Server (NTRS)

    Shuen, J. S.; Solomon, A. S. P.; Zhang, Q. F.; Faeth, G. M.

    1983-01-01

    Mean and fluctuating gas velocities, liquid mass fluxes and drop sizes were measured in nonevaporating sprays. These results, as well as existing measurements in solid particle-laden jets, were used to evaluate models of these processes. The following models were considered: (1) a locally homogeneous flow (LHF) model, where slip between the phases was neglected; (2) a deterministic separated flow (DSF) model, where slip was considered but effects of particle dispersion by turbulence were ignored; and (3) a stochastic separated flow (SSF) model, where effects of interphase slip and turbulent dispersion were considered using random-walk computations for particle motion. The LHF and DSF models did not provide very satisfactory predictions over the present data base. In contrast, the SSF model performed reasonably well - including conditions in nonevaporating sprays where enhanced dispersion of particles by turbulence caused the spray to spread more rapidly than single-phase jets under comparable conditions. While these results are encouraging, uncertainties in initial conditions limit the reliability of the evaluation. Current work is seeking to eliminate this deficiency.
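
    The stochastic separated flow (SSF) treatment tracks particles through a succession of randomly sampled turbulent eddies; a minimal 1-D random-walk sketch of the eddy-interaction idea (all constants and the linear drag law below are illustrative, not the paper's model):

```python
import random
import math

random.seed(0)

def ssf_particle_velocity(n_eddies=2000, u_mean=10.0, u_rms=1.5,
                          tau_p=0.01, t_eddy=0.005):
    """March one particle through successive eddies; in each eddy the gas
    velocity fluctuation is drawn from a Gaussian, and the particle relaxes
    toward it with a linear-drag response time tau_p."""
    v = u_mean  # particle starts at the mean gas velocity
    samples = []
    for _ in range(n_eddies):
        u_gas = u_mean + random.gauss(0.0, u_rms)  # gas velocity in this eddy
        # analytic relaxation over one eddy lifetime under linear drag
        v = u_gas + (v - u_gas) * math.exp(-t_eddy / tau_p)
        samples.append(v)
    mean = sum(samples) / len(samples)
    var = sum((s - mean) ** 2 for s in samples) / len(samples)
    return mean, math.sqrt(var)

mean_v, rms_v = ssf_particle_velocity()
print(mean_v, rms_v)  # particle mean ~ gas mean; particle rms < gas rms
```

    Averaging many such trajectories yields the turbulent dispersion of the dispersed phase, which is the mechanism the DSF model omits.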

  12. Assessing Potential Climate Change Effects on Loblolly Pine Growth: A Probabilistic Regional Modeling Approach

    Treesearch

    Peter B. Woodbury; James E. Smith; David A. Weinstein; John A. Laurence

    1998-01-01

    Most models of the potential effects of climate change on forest growth have produced deterministic predictions. However, there are large uncertainties in data on regional forest condition, estimates of future climate, and quantitative relationships between environmental conditions and forest growth rate. We constructed a new model to analyze these uncertainties...

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun

    This research report summarizes the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk, and to update the risk estimates in real time based on ECA, will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on impacting active component O&M using real-time equipment condition information) are a step towards ERMs that, if integrated with AR supervisory plant control systems, can help control O&M costs and improve the affordability of advanced reactors.

  14. When autocratic leaders become an option--uncertainty and self-esteem predict implicit leadership preferences.

    PubMed

    Schoel, Christiane; Bluemke, Matthias; Mueller, Patrick; Stahlberg, Dagmar

    2011-09-01

    We investigated the impact of uncertainty on leadership preferences and propose that the conjunction of self-esteem level and stability is an important moderator in this regard. Self-threatening uncertainty is aversive and activates the motivation to regain control. People with high and stable self-esteem should be confident of achieving this goal by self-determined amelioration of the situation and should therefore show a stronger preference for democratic leadership under conditions of uncertainty. By contrast, people with low and unstable self-esteem should place their trust and hope in the abilities of powerful others, resulting in a preference for autocratic leadership. Studies 1a and 1b validate explicit and implicit leadership measures and demonstrate a general prodemocratic default attitude under conditions of certainty. Studies 2 and 3 reveal a democratic reaction for individuals with stable high self-esteem and a submissive reaction for individuals with unstable low self-esteem under conditions of uncertainty. In Study 4, this pattern is cancelled out when individuals evaluate leadership styles from a leader instead of a follower perspective. PsycINFO Database Record (c) 2011 APA, all rights reserved.

  15. Traceable measurements of the electrical parameters of solid-state lighting products

    NASA Astrophysics Data System (ADS)

    Zhao, D.; Rietveld, G.; Braun, J.-P.; Overney, F.; Lippert, T.; Christensen, A.

    2016-12-01

    In order to perform traceable measurements of the electrical parameters of solid-state lighting (SSL) products, it is necessary to define the measurement procedures in a technically adequate way and to identify the relevant uncertainty sources. The currently published written standard for SSL products specifies test conditions, but it lacks an explanation of how adequate these test conditions are. More specifically, both an identification of uncertainty sources and a quantitative uncertainty analysis are absent. This paper fills the related gap in the present written standard. New uncertainty sources relative to conventional lighting sources are determined and their effects are quantified. The analysis shows that for power measurements the main uncertainty sources are temperature deviation, power supply voltage distortion, and instability of the SSL product. For RMS current measurements, the influences of bandwidth, the shunt resistor, power supply source impedance and ac frequency flatness are significant as well. The measurement uncertainty depends not only on the test equipment but is also a function of the characteristics of the device under test (DUT), for example its current harmonics spectrum and input impedance. Therefore, an online calculation tool is provided to help non-electrical experts. Following our procedures, unrealistic uncertainty estimates, unnecessary procedures and expensive equipment can be avoided.

  16. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
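
    Propagating measurement uncertainties through a comfort-index formula can be done by Monte Carlo sampling of the input quantities, in line with current guidance on uncertainty evaluation; a sketch using a deliberately simplified stand-in index (the real PMV formula from ISO 7730 is far more involved, and the input uncertainties here are invented):

```python
import numpy as np

rng = np.random.default_rng(7)

def toy_index(t_air, t_rad, v_air):
    """Simplified stand-in for a comfort index; NOT the ISO 7730 PMV formula."""
    return 0.2 * (t_air - 22.0) + 0.1 * (t_rad - 22.0) - 1.5 * v_air

# Measured values with illustrative standard uncertainties
n = 100_000
t_air = rng.normal(24.0, 0.3, n)   # air temperature, u = 0.3 degC
t_rad = rng.normal(25.0, 0.5, n)   # mean radiant temperature, u = 0.5 degC
v_air = rng.normal(0.1, 0.02, n)   # air speed, u = 0.02 m/s

samples = toy_index(t_air, t_rad, v_air)
print(f"index = {samples.mean():.3f} +/- {samples.std(ddof=1):.3f}")
```

    For a near-linear formula the Monte Carlo standard deviation agrees with the analytic root-sum-of-squares of the sensitivity-weighted input uncertainties, which makes this a convenient cross-check on a law-of-propagation calculation.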

  17. Ensemble urban flood simulation in comparison with laboratory-scale experiments: Impact of interaction models for manhole, sewer pipe, and surface flow

    NASA Astrophysics Data System (ADS)

    Noh, Seong Jin; Lee, Seungsoo; An, Hyunuk; Kawaike, Kenji; Nakagawa, Hajime

    2016-11-01

    An urban flood is an integrated phenomenon that is affected by various uncertainty sources such as input forcing, model parameters, complex geometry, and exchanges of flow among different domains in surfaces and subsurfaces. Despite considerable advances in urban flood modeling techniques, limited knowledge is currently available with regard to the impact of dynamic interaction among different flow domains on urban floods. In this paper, an ensemble method for urban flood modeling is presented to consider the parameter uncertainty of interaction models among a manhole, a sewer pipe, and surface flow. Laboratory-scale experiments on urban flood and inundation are performed under various flow conditions to investigate the parameter uncertainty of interaction models. The results show that ensemble simulation using interaction models based on weir and orifice formulas reproduces experimental data with high accuracy and detects the identifiability of model parameters. Among interaction-related parameters, the parameters of the sewer-manhole interaction show lower uncertainty than those of the sewer-surface interaction. Experimental data obtained under unsteady-state conditions are more informative than those obtained under steady-state conditions to assess the parameter uncertainty of interaction models. Although the optimal parameters vary according to the flow conditions, the difference is marginal. Simulation results also confirm the capability of the interaction models and the potential of the ensemble-based approaches to facilitate urban flood simulation.
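
    The manhole-surface exchange in such interaction models is typically switched between a weir-type formula (free overflow) and an orifice-type formula (submerged exchange); a minimal sketch using standard textbook forms, where the discharge coefficients are exactly the calibration parameters whose uncertainty the ensemble explores (all values here are illustrative):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def weir_discharge(c_w, width, head):
    """Free-overflow (weir-type) exchange: Q = c_w * b * sqrt(2g) * h^(3/2)."""
    return c_w * width * math.sqrt(2.0 * G) * head ** 1.5

def orifice_discharge(c_o, area, head_diff):
    """Submerged (orifice-type) exchange: Q = c_o * A * sqrt(2g * dh)."""
    return c_o * area * math.sqrt(2.0 * G * head_diff)

# Illustrative manhole: 0.6 m opening width/area, 5 cm of head
print(f"weir:    {weir_discharge(0.6, 0.6, 0.05):.4f} m^3/s")
print(f"orifice: {orifice_discharge(0.6, 0.28, 0.05):.4f} m^3/s")
```

    An ensemble run then perturbs c_w and c_o (and the analogous sewer-manhole coefficients) across members, which is how the parameter uncertainty discussed above enters the simulation.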

  18. Entropy of hydrological systems under small samples: Uncertainty and variability

    NASA Astrophysics Data System (ADS)

    Liu, Dengfeng; Wang, Dong; Wang, Yuankun; Wu, Jichun; Singh, Vijay P.; Zeng, Xiankui; Wang, Lachun; Chen, Yuanfang; Chen, Xi; Zhang, Liyuan; Gu, Shenghua

    2016-01-01

    Entropy theory has been increasingly applied in hydrology in both descriptive and inferential ways. However, little attention has been given to the small-sample condition widespread in hydrological practice, where hydrological measurements are limited or even nonexistent. Accordingly, entropy estimated under this condition may incur considerable bias. In this study, the small-sample condition is considered and two innovative entropy estimators, the Chao-Shen (CS) estimator and the James-Stein-type shrinkage (JSS) estimator, are introduced. Simulation tests conducted with distributions common in hydrology identify the JSS estimator as the best performing. Then, multi-scale moving entropy-based hydrological analyses (MM-EHA) are applied to indicate the changing patterns of uncertainty of streamflow data collected from the Yangtze River and the Yellow River, China. To further investigate the intrinsic properties of entropy as applied in hydrological uncertainty analyses, correlations between entropy and other statistics at different time scales are also calculated, revealing connections between the concepts of uncertainty and variability.
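
    The James-Stein-type shrinkage (JSS) estimator shrinks the empirical cell frequencies toward the uniform distribution before computing entropy; a sketch following the standard Hausser-Strimmer form of the estimator (the binning and counts below are illustrative):

```python
import numpy as np

def jss_entropy(counts):
    """James-Stein-type shrinkage entropy estimate (in nats) from bin counts.

    Shrinks the ML frequencies p_ml toward the uniform target t_k = 1/K with
    a data-driven intensity lambda, then plugs the shrunk frequencies into
    the plug-in entropy formula.
    """
    counts = np.asarray(counts, dtype=float)
    n, K = counts.sum(), len(counts)
    p_ml = counts / n
    target = np.full(K, 1.0 / K)
    # data-driven shrinkage intensity, clipped to [0, 1]
    lam = (1.0 - np.sum(p_ml ** 2)) / ((n - 1.0) * np.sum((target - p_ml) ** 2))
    lam = min(1.0, max(0.0, lam))
    p_shrink = lam * target + (1.0 - lam) * p_ml
    p = p_shrink[p_shrink > 0]
    return float(-np.sum(p * np.log(p)))

# Small-sample illustration: 20 observations spread over 8 bins
counts = [6, 4, 3, 3, 2, 1, 1, 0]
print(f"JSS entropy: {jss_entropy(counts):.3f} nats")
```

    Because shrinkage pulls the frequencies toward uniform, the JSS estimate sits between the downward-biased plug-in estimate and the maximum log K, which is the behavior that makes it attractive under small samples.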

  19. Climate impacts on human livelihoods: where uncertainty matters in projections of water availability

    NASA Astrophysics Data System (ADS)

    Lissner, T. K.; Reusser, D. E.; Schewe, J.; Lakes, T.; Kropp, J. P.

    2014-03-01

    Climate change will have adverse impacts on many different sectors of society, with manifold consequences for human livelihoods and well-being. However, a systematic method to quantify human well-being and livelihoods across sectors is so far unavailable, making it difficult to determine the extent of such impacts. Climate impact analyses are often limited to individual sectors (e.g. food or water) and employ sector-specific target-measures, while systematic linkages to general livelihood conditions remain unexplored. Further, recent multi-model assessments have shown that uncertainties in projections of climate impacts deriving from climate and impact models as well as greenhouse gas scenarios are substantial, posing an additional challenge in linking climate impacts with livelihood conditions. This article first presents a methodology to consistently measure Adequate Human livelihood conditions for wEll-being And Development (AHEAD). Based on a transdisciplinary sample of influential concepts addressing human well-being, the approach measures the adequacy of conditions of 16 elements. We implement the method at global scale, using results from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) to show how changes in water availability affect the fulfilment of AHEAD at national resolution. In addition, AHEAD allows identifying and differentiating uncertainty of climate and impact model projections. We show how the approach can help to put the substantial inter-model spread into the context of country-specific livelihood conditions by differentiating where the uncertainty about water scarcity is relevant with regard to livelihood conditions - and where it is not. The results indicate that in many countries today, livelihood conditions are compromised by water scarcity. However, more often, AHEAD fulfilment is limited through other elements. 
Moreover, the analysis shows that for 44 out of 111 countries, the water-specific uncertainty ranges are outside the relevant thresholds for AHEAD and therefore do not contribute to the overall uncertainty about climate change impacts on livelihoods. The AHEAD method presented here, together with these first results, forms an important step towards making scientific results more applicable to policy decisions.

  20. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. 
Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
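    A minimal sketch of the response-surface Monte Carlo step described above. The quadratic surface, its coefficients, the input nominals, and the uncertainty magnitudes are all illustrative assumptions, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical quadratic response surface with single and two-factor
# interaction terms (coefficients are illustrative, not from the study):
#   r = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
b = dict(b0=1.0, b1=0.8, b2=0.5, b11=-0.2, b22=-0.1, b12=0.15)

def regression_rate(x1, x2):
    return (b["b0"] + b["b1"]*x1 + b["b2"]*x2
            + b["b11"]*x1**2 + b["b22"]*x2**2 + b["b12"]*x1*x2)

# Monte Carlo dispersion: perturb nominal mixture/operating inputs with
# assumed elemental uncertainties, then propagate through the surface.
n = 100_000
x1 = rng.normal(1.0, 0.05, n)   # e.g. normalized oxidizer fraction (nominal 1.0)
x2 = rng.normal(0.5, 0.05, n)   # e.g. normalized operating condition (nominal 0.5)
r = regression_rate(x1, x2)

lo, hi = np.percentile(r, [2.5, 97.5])
print(f"mean rate = {r.mean():.3f}, 95% interval = [{lo:.3f}, {hi:.3f}]")
```

Sampling more inputs, or constraining some of them, reproduces the "fully and partially constrained" case scenarios in miniature.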

  1. An application of a hydraulic model simulator in flood risk assessment under changing climatic conditions

    NASA Astrophysics Data System (ADS)

    Doroszkiewicz, J. M.; Romanowicz, R. J.

    2016-12-01

    The standard procedure of climate change impact assessment on future hydrological extremes consists of a chain of consecutive actions, starting from the choice of GCM driven by an assumed CO2 scenario, through downscaling of climatic forcing to a catchment scale, estimation of hydrological extreme indices using hydrological modelling tools and subsequent derivation of flood risk maps with the help of a hydraulic model. Among many possible sources of uncertainty, the main ones are the uncertainties related to future climate scenarios, climate models, downscaling techniques and hydrological and hydraulic models. Unfortunately, we cannot directly assess the impact of these different sources of uncertainty on future flood risk due to the lack of observations of future climate realizations. The aim of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the processes involved, an assessment of the total uncertainty of maps of inundation probability might be very computer time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for the chosen locations along the river reach. The transfer function model parameters are estimated based on the simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of a simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climatic conditions. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to different stages of the process of derivation of flood risk maps under changing climate conditions.
In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow-routing model, the propagation of that uncertainty through the hydraulic model, and, finally, the uncertainty related to the derivation of flood risk maps.
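    The simulator idea above can be sketched as a first-order transfer-function emulator fitted to hydraulic-model output at one cross-section. The synthetic "hydraulic model" below and its coefficients are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "hydraulic model" output at one cross-section: stage y responds
# to upstream discharge x through a slow first-order process (illustrative).
n = 500
x = np.abs(rng.normal(50, 10, n))            # upstream discharge series
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.9 * y[t-1] + 0.05 * x[t]        # "truth" to be emulated

# Emulator: first-order linear transfer function y_t ~ a*y_{t-1} + b*x_t,
# with (a, b) estimated by ordinary least squares on model simulations.
A = np.column_stack([y[:-1], x[1:]])
a_hat, b_hat = np.linalg.lstsq(A, y[1:], rcond=None)[0]
print(f"a = {a_hat:.3f}, b = {b_hat:.3f}")
```

Once fitted per cross-section, the cheap recursion replaces the full hydraulic model inside the uncertainty cascade, which is where the computational saving comes from.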

  2. Default risk modeling beyond the first-passage approximation: extended Black-Cox model.

    PubMed

    Katz, Yuri A; Shokhirev, Nikolai V

    2010-07-01

    We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even if the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with the radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for the derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The closed-form expressions obtained fit the historical data on global corporate defaults well and demonstrate the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves the valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.
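    A hedged Monte Carlo sketch of the contrast between the classical first-passage (absorbing) boundary and the extended model's radiation boundary with a finite default rate. All parameter values (drift, volatility, kappa, initial distance) are illustrative, and the simulation stands in for the paper's analytical Fokker-Planck solution:

```python
import numpy as np

rng = np.random.default_rng(2)

def default_prob(kappa, n_paths=20000, T=1.0, dt=1/250,
                 x0=0.5, mu=-0.1, sigma=0.3):
    """P(default by T) for log-distance-to-default X with barrier at 0."""
    x = np.full(n_paths, x0)
    alive = np.ones(n_paths, bool)
    for _ in range(int(T / dt)):
        x[alive] += mu*dt + sigma*np.sqrt(dt)*rng.standard_normal(alive.sum())
        at_barrier = alive & (x <= 0.0)
        if kappa == np.inf:                      # absorbing: instant default
            alive[at_barrier] = False
        else:                                    # radiation: default at finite rate
            u = rng.random(at_barrier.sum())
            defaulted = u < 1.0 - np.exp(-kappa*dt)
            idx = np.flatnonzero(at_barrier)
            alive[idx[defaulted]] = False
            x[idx[~defaulted]] = 1e-9            # survivors pushed back above barrier

    return 1.0 - alive.mean()

p_abs = default_prob(np.inf)     # first-passage (Black-Cox) limit
p_rad = default_prob(kappa=5.0)  # extended model, finite default rate
print(f"absorbing: {p_abs:.3f}, radiation: {p_rad:.3f}")
```

The radiation boundary yields a strictly lower default probability than the absorbing one, which is the mechanism behind the improved short-horizon spreads.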

  3. A Generalized Perturbation Theory Solver In Rattlesnake Based On PETSc With Application To TREAT Steady State Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schunert, Sebastian; Wang, Congjian; Wang, Yaqi

    Rattlesnake and MAMMOTH are the designated TREAT analysis tools currently being developed at the Idaho National Laboratory. Concurrent with the development of the multi-physics, multi-scale capabilities, sensitivity analysis and uncertainty quantification (SA/UQ) capabilities are required for predictive modeling of the TREAT reactor. For steady-state SA/UQ, which is essential for setting initial conditions for the transients, generalized perturbation theory (GPT) will be used. This work describes the implementation of a PETSc-based solver for the generalized adjoint equations that constitute an inhomogeneous, rank-deficient problem. The standard approach is to use an outer iteration strategy with repeated removal of the fundamental-mode contamination. The described GPT algorithm directly solves the GPT equations without the need for an outer iteration procedure by using Krylov subspaces that are orthogonal to the operator's nullspace. Three test problems are solved and provide sufficient verification of Rattlesnake's GPT capability. We conclude with a preliminary example evaluating the impact of the boron distribution in the TREAT reactor using perturbation theory.
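    The nullspace-orthogonal Krylov idea can be sketched with a projected conjugate gradient on a synthetic singular symmetric system; the actual GPT operator and the PETSc machinery are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sketch: solve a singular (rank-deficient) system A x = b by keeping the
# Krylov iterates orthogonal to the known nullspace, instead of repeatedly
# subtracting the fundamental-mode contamination in an outer iteration.
m = 50
Q, _ = np.linalg.qr(rng.standard_normal((m, m)))
eigs = np.concatenate([[0.0], rng.uniform(1.0, 2.0, m - 1)])  # one zero mode
A = Q @ np.diag(eigs) @ Q.T                                   # symmetric, singular
null = Q[:, 0]                                                # known nullspace vector

def project(v):                       # remove the nullspace component
    return v - null * (null @ v)

b = project(rng.standard_normal(m))   # consistent right-hand side
x = np.zeros(m)
r = project(b - A @ x)
p = r.copy()
for _ in range(200):                  # projected conjugate gradient
    Ap = project(A @ p)
    alpha = (r @ r) / (p @ Ap)
    x += alpha * p
    r_new = r - alpha * Ap
    if np.linalg.norm(r_new) < 1e-10:
        break
    p = r_new + ((r_new @ r_new) / (r @ r)) * p
    r = r_new

print("residual:", np.linalg.norm(A @ x - b))
```

Because every search direction is projected, the solution stays in the range of A and no fundamental-mode sweep is ever needed.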

  4. Resilient leadership and the organizational culture of resilience: construct validation.

    PubMed

    Everly, George S; Smith, Kenneth J; Lobo, Rachel

    2013-01-01

    Political, economic, and social unrest and uncertainty seem pervasive throughout the world. Within the United States, political vitriol and economic volatility have led to severe economic restrictions. Both government and private sector organizations are being asked to do more with less. The specter of dramatic changes in healthcare creates a condition of uncertainty affecting budget allocations and hiring practices. If ever there was a time when a "resilient culture" was needed, it is now. In this paper we shall discuss the application of "tipping point" theory (Gladwell, 2000) operationalized through a special form of leadership: "resilient leadership" (Everly, Strouse, & Everly, 2010). Resilient leadership is consistent with Gladwell's "Law of the Few" and strives to create an organizational culture of resilience by implementing an initial change within no more than 20% of an organization's workforce. It is expected that such a minority, if chosen correctly, will "tip" the rest of the organization toward enhanced resilience, ideally creating a self-sustaining culture of resilience. This paper reports on the empirical foundations and construct validation of "resilient leadership".

  5. Assimilating Remote Sensing Observations of Leaf Area Index and Soil Moisture for Wheat Yield Estimates: An Observing System Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Nearing, Grey S.; Crow, Wade T.; Thorp, Kelly R.; Moran, Mary S.; Reichle, Rolf H.; Gupta, Hoshin V.

    2012-01-01

    Observing system simulation experiments were used to investigate ensemble Bayesian state-updating data assimilation of observations of leaf area index (LAI) and soil moisture (theta) for the purpose of improving single-season wheat yield estimates with the Decision Support System for Agrotechnology Transfer (DSSAT) CropSim-Ceres model. Assimilation was conducted in an energy-limited environment and a water-limited environment. Modeling uncertainty was prescribed to weather inputs, soil parameters, initial conditions, and cultivar parameters, and through perturbations to model state transition equations. The ensemble Kalman filter and the sequential importance resampling filter were tested for the ability to attenuate the effects of these types of uncertainty on yield estimates. LAI and theta observations were synthesized according to characteristics of existing remote sensing data, and the effects of observation error were tested. Results indicate that the potential for assimilation to improve end-of-season yield estimates is low. Limitations are due to a lack of root zone soil moisture information, error in LAI observations, and a lack of correlation between leaf and grain growth.
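    A minimal sketch of the ensemble Kalman filter update for a single synthetic LAI observation (perturbed-observations form; all numbers are illustrative, not DSSAT output):

```python
import numpy as np

rng = np.random.default_rng(4)

# Forecast ensemble of a single state variable (LAI) and one observation.
n_ens = 100
lai_ens = rng.normal(3.0, 0.5, n_ens)        # prior (forecast) ensemble
obs, obs_err = 3.6, 0.3                      # synthetic observation, its std

# Kalman gain from the ensemble variance; perturbed-observations update.
P = lai_ens.var(ddof=1)
K = P / (P + obs_err**2)
obs_pert = obs + rng.normal(0.0, obs_err, n_ens)
lai_anal = lai_ens + K * (obs_pert - lai_ens)

print(f"prior mean {lai_ens.mean():.2f} -> analysis mean {lai_anal.mean():.2f}")
```

The analysis ensemble moves toward the observation and its spread contracts; in the experiment this update is applied jointly to LAI and soil moisture states each observation time.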

  6. Mathematics applied to the climate system: outstanding challenges and recent progress

    PubMed Central

    Williams, Paul D.; Cullen, Michael J. P.; Davey, Michael K.; Huthnance, John M.

    2013-01-01

    The societal need for reliable climate predictions and a proper assessment of their uncertainties is pressing. Uncertainties arise not only from initial conditions and forcing scenarios, but also from model formulation. Here, we identify and document three broad classes of problems, each representing what we regard to be an outstanding challenge in the area of mathematics applied to the climate system. First, there is the problem of the development and evaluation of simple physically based models of the global climate. Second, there is the problem of the development and evaluation of the components of complex models such as general circulation models. Third, there is the problem of the development and evaluation of appropriate statistical frameworks. We discuss these problems in turn, emphasizing the recent progress made by the papers presented in this Theme Issue. Many pressing challenges in climate science require closer collaboration between climate scientists, mathematicians and statisticians. We hope the papers contained in this Theme Issue will act as inspiration for such collaborations and for setting future research directions. PMID:23588054

  7. The Low-mass Population in the Young Cluster Stock 8: Stellar Properties and Initial Mass Function

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jose, Jessy; Herczeg, Gregory J.; Fang, Qiliang

    The evolution of H ii regions/supershells can trigger a new generation of stars/clusters at their peripheries, with environmental conditions that may affect the initial mass function, disk evolution, and star formation efficiency. In this paper we study the stellar content and star formation processes in the young cluster Stock 8, which itself is thought to have formed during the expansion of a supershell. We present deep optical photometry along with JHK and 3.6 and 4.5 μm photometry from UKIDSS and Spitzer-IRAC. We use multicolor criteria to identify the candidate young stellar objects in the region. Using evolutionary models, we obtain a median log(age) of ∼6.5 (∼3.0 Myr) with an observed age spread of ∼0.25 dex for the cluster. Monte Carlo simulations of the population of Stock 8, based on estimates for the photometric uncertainty, differential reddening, binarity, and variability, indicate that these uncertainties introduce an age spread of ∼0.15 dex. The intrinsic age spread in the cluster is ∼0.2 dex. The fraction of young stellar objects surrounded by disks is ∼35%. The K-band luminosity function of Stock 8 is similar to that of the Trapezium cluster. The initial mass function (IMF) of Stock 8 has a Salpeter-like slope at >0.5 M⊙ and flattens and peaks at ∼0.4 M⊙, below which it declines into the substellar regime. Although Stock 8 is surrounded by several massive stars, there seems to be no severe environmental effect on the form of the IMF due to the proximity of massive stars around the cluster.

  8. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    USGS Publications Warehouse

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk, viewing future metal concentrations as a limited range of possibilities based on a scientific evaluation of uncertainty.
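    The forward Monte Carlo approach mentioned above can be sketched with a toy first-order attenuation model; the model form, rate distribution, and concentration values are illustrative assumptions, not a reactive transport simulation:

```python
import numpy as np

rng = np.random.default_rng(5)

# Forward Monte Carlo: propagate uncertainty in a first-order attenuation
# rate into the predicted uranium concentration at a monitoring location.
c0 = 2.0                                   # post-mining concentration, mg/L
k = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=10_000)  # rate, 1/yr
t = 20.0                                   # years after restoration
c = c0 * np.exp(-k * t)                    # simple exponential attenuation

p5, p50, p95 = np.percentile(c, [5, 50, 95])
print(f"median {p50:.3f} mg/L, 90% limits [{p5:.3f}, {p95:.3f}] mg/L")
```

The resulting percentile envelope is exactly the kind of nonlinear uncertainty limit a decision maker would compare against a permit concentration.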

  9. Factors Associated with Parental Adaptation to Children with an Undiagnosed Medical Condition

    PubMed Central

    Yanes, Tatiane; Humphreys, Linda; McInerney-Leo, Aideen; Biesecker, Barbara

    2017-01-01

    Little is known about the adaptive process and experiences of parents raising a child with an undiagnosed medical condition. The present study aims to assess how uncertainty, hope, social support, and coping efficacy contribute to adaptation among parents of children with an undiagnosed medical condition. Sixty-two parents of children affected by an undiagnosed medical condition for at least two years completed an electronically self-administered survey. Descriptive analysis suggested parents in this population had significantly lower adaptation scores when compared to other parents of children with undiagnosed medical conditions, and parents of children with a diagnosed intellectual and/or physical disability. Similarly, parents in this population had significantly lower hope, perceived social support and coping efficacy when compared to parents of children with a diagnosed medical condition. Multiple linear regression was used to identify relationships between independent variables and domains of adaptation. Positive stress response was negatively associated with emotional support (B = −0.045, p ≤ 0.05), and positively associated with coping efficacy (B = 0.009, p ≤ 0.05). Adaptive self-esteem was negatively associated with uncertainty towards one's social support (B = −0.248, p ≤ 0.05), and positively associated with coping efficacy (B = 0.007, p ≤ 0.05). Adaptive social integration was negatively associated with uncertainty towards one's social support (B = −0.273, p ≤ 0.05), and positively associated with uncertainty towards the child's health (B = 0.323, p ≤ 0.001), and affectionate support (B = 0.110, p ≤ 0.001). Finally, adaptive spiritual wellbeing was negatively associated with uncertainty towards one's family (B = −0.221, p ≤ 0.05). Findings from this study highlight the areas where parents believed additional support was required, and provide insight into factors that contribute to parental adaptation. PMID:28039658

  10. Potential sources of variability in mesocosm experiments on the response of phytoplankton to ocean acidification

    NASA Astrophysics Data System (ADS)

    Moreno de Castro, Maria; Schartau, Markus; Wirtz, Kai

    2017-04-01

    Mesocosm experiments on phytoplankton dynamics under high CO2 concentrations mimic the response of marine primary producers to future ocean acidification. However, potential acidification effects can be hindered by the high standard deviation typically found in the replicates of the same CO2 treatment level. In experiments with multiple unresolved factors and a sub-optimal number of replicates, post-processing statistical inference tools might fail to detect an effect that is present. We propose that in such cases, data-based model analyses might be suitable tools to unearth potential responses to the treatment and identify the uncertainties that could produce the observed variability. As test cases, we used data from two independent mesocosm experiments. Both experiments showed high standard deviations and, according to statistical inference tools, biomass appeared insensitive to changing CO2 conditions. Conversely, our simulations showed earlier and more intense phytoplankton blooms in modeled replicates at high CO2 concentrations and suggested that uncertainties in average cell size, phytoplankton biomass losses, and initial nutrient concentration potentially outweigh acidification effects by triggering strong variability during the bloom phase. We also estimated the thresholds below which uncertainties do not escalate to high variability. This information might help in designing future mesocosm experiments and interpreting controversial results on the effect of acidification or other pressures on ecosystem functions.

  11. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis of the conservation-of-mass and energy-balance relations. A new methodology for the assessment of the uncertainty associated with linear regressions is presented.
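    The regression-uncertainty methodology itself is not given in the abstract; as a hedged illustration, a standard 95% prediction interval for an ordinary least-squares fit (synthetic data, illustrative noise level) looks like this:

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic calibration data: linear response with measurement noise.
x = np.linspace(0, 10, 30)
y = 2.0 + 0.5 * x + rng.normal(0, 0.3, x.size)

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
s2 = (resid @ resid) / (x.size - 2)        # residual variance, df = n - 2
XtX_inv = np.linalg.inv(X.T @ X)

x_new = 5.0
v = np.array([1.0, x_new])
y_hat = v @ beta
# Prediction variance = s^2 * (1 + v' (X'X)^{-1} v); t(0.975, 28) = 2.048.
half = 2.048 * np.sqrt(s2 * (1.0 + v @ XtX_inv @ v))
print(f"y({x_new}) = {y_hat:.2f} +/- {half:.2f}")
```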

  12. Characterization of XR-RV3 GafChromic® films in standard laboratory and in clinical conditions and means to evaluate uncertainties and reduce errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C.

    2015-07-15

    Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology while addressing the means to reduce uncertainties in dose assessment. Methods: XR-Type R GafChromic films have been shown to represent the most efficient and suitable solution to determine patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainties include scanner, film, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models has shown that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainties. These could induce errors of up to 7% on the film readings unless regularly checked and corrected. Typically, scan uniformity correction matrices and reading normalization to the scanner-specific and daily background reading should be applied. In addition, the analysis of multiple film batches has shown that XR-RV3 films generally have good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and respond roughly independently of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically.
In addition, yellow-side film irradiations should preferentially be used since they showed a lower dependence on beam parameters compared to white-side film irradiations. Finally, among the six different fit equations tested in this work, the typically used third-order polynomials and simpler rational equations, of the form dose inversely proportional to pixel value, were both found to provide satisfactory results. Fitting-related uncertainty was clearly identified as a major contributor to the overall film dosimetry uncertainty, with up to 40% error on the dose estimate. Conclusions: The overall uncertainty associated with the use of XR-RV3 films to determine skin dose in the interventional environment can realistically be estimated to be around 20% (k = 1). This uncertainty can be reduced to within 5% by carefully monitoring scanner, film, and fitting-related errors, or it can easily increase to over 40% if minimal care is not taken. This work demonstrates the importance of appropriate calibration, reading, fitting, and other film-related and scan-related processes, which will help improve the accuracy of skin dose measurements in interventional procedures.
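    The simple rational fit form mentioned in the conclusions (dose inversely proportional to pixel value) is linear in its coefficients, so it can be fitted by ordinary least squares. The pixel values and calibration coefficients below are invented for illustration, not measured film data:

```python
import numpy as np

# Rational calibration of the form dose = a/PV + b: pixel values PV fall
# as dose rises, and the model is linear in (a, b).
pv = np.array([42000., 35000., 30000., 26000., 23000., 21000.])
dose_true = 5.0e5 / pv - 8.0                       # assumed "true" curve, Gy
rng = np.random.default_rng(7)
dose = dose_true + rng.normal(0, 0.05, pv.size)    # measured with noise

A = np.column_stack([1.0 / pv, np.ones_like(pv)])
a, b = np.linalg.lstsq(A, dose, rcond=None)[0]
print(f"dose = {a:.3g}/PV + {b:.2f}")
```

A third-order polynomial in PV would need nonlinear or higher-order design matrices; the rational form keeps the fit, and hence the fitting-related uncertainty, easier to control.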

  13. Effects of nonspatial selective and divided visual attention on fMRI BOLD responses.

    PubMed

    Weerda, Riklef; Vallines, Ignacio; Thomas, James P; Rutschmann, Roland M; Greenlee, Mark W

    2006-09-01

    Using an uncertainty paradigm and functional magnetic resonance imaging (fMRI) we studied the effect of nonspatial selective and divided visual attention on the activity of specific areas of human extrastriate visual cortex. The stimuli were single ovals that differed from an implicit standard oval in either colour or width. The subjects' task was to classify the current stimulus as one of two possible alternatives per stimulus dimension. Three different experimental conditions were conducted: "colour-certainty", "shape-certainty" and "uncertainty". In all experimental conditions, the stimulus differed in only one stimulus dimension per trial. In the two certainty conditions, the subjects knew in advance which dimension this would be. During the uncertainty condition they had no such previous knowledge and had to monitor both dimensions simultaneously. Statistical analysis of the fMRI data (with SPM2) revealed a modest effect of the attended stimulus dimension on the neural activity in colour sensitive area V4 (more activity during attention to colour) and in shape sensitive area LOC (more activity during attention to shape). Furthermore, cortical areas known to be related to attention and working memory processes (e.g., lateral prefrontal and posterior parietal cortex) exhibit higher activity during the condition of divided attention ("uncertainty") than during that of selective attention ("certainty").

  14. Characterization and Uncertainty Analysis of a Reference Pressure Measurement System for Wind Tunnels

    NASA Technical Reports Server (NTRS)

    Amer, Tahani; Tripp, John; Tcheng, Ping; Burkett, Cecil; Sealey, Bradley

    2004-01-01

    This paper presents the calibration results and uncertainty analysis of a high-precision reference pressure measurement system currently used in wind tunnels at the NASA Langley Research Center (LaRC). Sensors, calibration standards, and measurement instruments are subject to errors due to aging, drift with time, environment effects, transportation, the mathematical model, the calibration experimental design, and other factors. Errors occur at every link in the chain of measurements and data reduction from the sensor to the final computed results. At each link of the chain, bias and precision uncertainties must be separately estimated for facility use, and are combined to produce overall calibration and prediction confidence intervals for the instrument, typically at a 95% confidence level. The uncertainty analysis and calibration experimental designs used herein, based on techniques developed at LaRC, employ replicated experimental designs for efficiency, separate estimation of bias and precision uncertainties, and detection of significant parameter drift with time. Final results are presented, including calibration confidence intervals and prediction intervals given as functions of the applied inputs rather than as a fixed percentage of the full-scale value. System uncertainties are propagated beginning with the initial reference pressure standard, to the calibrated instrument as a working standard in the facility. Among the several parameters that can affect the overall results are operating temperature, atmospheric pressure, humidity, and facility vibration. Effects of factors such as initial zeroing and temperature are investigated. The effects of the identified parameters on system performance and accuracy are discussed.
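    The separate-then-combine treatment of bias and precision uncertainties can be sketched with the usual root-sum-square combination and a coverage factor; the numbers below are illustrative, not LaRC values:

```python
import math

# Bias (systematic) and precision (random) standard uncertainties are
# estimated separately at each link of the chain, combined in root-sum-
# square, then expanded with a coverage factor k ~ 2 for ~95% confidence.
bias = 0.020          # systematic standard uncertainty, psi (illustrative)
precision = 0.015     # random standard uncertainty, psi (illustrative)
k = 2.0               # coverage factor for ~95% confidence

expanded = k * math.sqrt(bias**2 + precision**2)
print(f"U95 = {expanded:.3f} psi")   # -> 0.050 psi for these inputs
```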

  15. A socio-technical model to explore urban water systems scenarios.

    PubMed

    de Haan, Fjalar J; Ferguson, Briony C; Deletic, Ana; Brown, Rebekah R

    2013-01-01

    This article reports on the ongoing work and research involved in the development of a socio-technical model of urban water systems. Socio-technical means the model is not so much concerned with the technical or biophysical aspects of urban water systems, but rather with the social and institutional implications of the urban water infrastructure and vice versa. A socio-technical model, in the view put forward in this article, produces scenarios of different urban water servicing solutions gaining or losing influence in meeting water-related societal needs, like potable water, drainage, environmental health and amenity. The urban water system is parameterised with vectors of the relative influence of each servicing solution. The model is a software implementation of the Multi-Pattern Approach, a theory on societal systems, like urban water systems, and how these develop and go through transitions under various internal and external conditions. Acknowledging that social dynamics comes with severe and non-reducible uncertainties, the model is set up to be exploratory, meaning that for any initial condition several possible future scenarios are produced. This article gives a concise overview of the necessary theoretical background, the model architecture and some initial test results using a drainage example.

  16. SHM-Based Probabilistic Fatigue Life Prediction for Bridges Based on FE Model Updating

    PubMed Central

    Lee, Young-Joo; Cho, Soojin

    2016-01-01

    Fatigue life prediction for a bridge should be based on the current condition of the bridge, and various sources of uncertainty, such as material properties, anticipated vehicle loads and environmental conditions, make the prediction very challenging. This paper presents a new approach for probabilistic fatigue life prediction for bridges using finite element (FE) model updating based on structural health monitoring (SHM) data. Recently, various types of SHM systems have been used to monitor and evaluate the long-term structural performance of bridges. For example, SHM data can be used to estimate the degradation of an in-service bridge, which makes it possible to update the initial FE model. The proposed method consists of three steps: (1) identifying the modal properties of a bridge, such as mode shapes and natural frequencies, based on the ambient vibration under passing vehicles; (2) updating the structural parameters of an initial FE model using the identified modal properties; and (3) predicting the probabilistic fatigue life using the updated FE model. The proposed method is demonstrated by application to a numerical model of a bridge, and the impact of FE model updating on the bridge fatigue life is discussed. PMID:26950125

  17. Geomorphological control on variably saturated hillslope hydrology and slope instability

    USGS Publications Warehouse

    Formetta, Giuseppe; Simoni, Silvia; Godt, Jonathan W.; Lu, Ning; Rigon, Riccardo

    2016-01-01

    In steep topography, the processes governing variably saturated subsurface hydrologic response and the interparticle stresses leading to shallow landslide initiation are physically linked. However, these processes are usually analyzed separately. Here, we take a combined approach, simultaneously analyzing the influence of topography on both hillslope hydrology and the effective stress fields within the hillslope itself. Clearly, runoff and saturated groundwater flow are dominated by gravity and, ultimately, by topography. Less clear is how landscape morphology influences flows in the vadose zone, where transient fluxes are usually taken to be vertical. We aim to assess and quantify the impact of topography on both saturated and unsaturated hillslope hydrology and its effects on shallow slope stability. Three real hillslope morphologies (concave, convex, and planar) are analyzed using a 3-D, physically based, distributed model coupled with a module for computation of the probability of failure, based on the infinite slope assumption. The results of the analyses, which included a parameter uncertainty analysis, show that convex and planar slopes are more stable than concave slopes. Specifically, under the same initial, boundary, and infiltration conditions, the percentage of unstable area ranges from 1.3% for the planar hillslope and 21% for the convex, to a maximum of 33% for the concave morphology. The results are supported by a sensitivity analysis carried out to examine the effect of initial conditions and rainfall intensity.
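    The probability-of-failure module based on the infinite slope assumption can be sketched as a Monte Carlo over soil parameters; the factor-of-safety form is the standard one, but the parameter distributions below are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(8)

# Infinite-slope factor of safety:
#   FS = (c' + (gamma*z*cos^2(theta) - u) * tan(phi')) / (gamma*z*sin(theta)*cos(theta))
# Probability of failure = fraction of Monte Carlo samples with FS < 1.
n = 100_000
theta = np.radians(35.0)                     # slope angle
z, gamma = 1.5, 18.0                         # failure depth (m), unit weight (kN/m3)
c = rng.normal(4.0, 1.0, n).clip(min=0)      # effective cohesion, kPa
phi = np.radians(rng.normal(32.0, 2.0, n))   # effective friction angle
u = rng.normal(5.0, 2.0, n)                  # pore-water pressure, kPa

tau = gamma * z * np.sin(theta) * np.cos(theta)
fs = (c + (gamma * z * np.cos(theta)**2 - u) * np.tan(phi)) / tau
p_fail = (fs < 1.0).mean()
print(f"probability of failure = {p_fail:.3f}")
```

In the coupled model, u comes from the variably saturated hydrologic simulation at each cell, which is how morphology feeds into the stability map.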

  18. Climate Twins - a tool to explore future climate impacts by assessing real world conditions: Exploration principles, underlying data, similarity conditions and uncertainty ranges

    NASA Astrophysics Data System (ADS)

    Loibl, Wolfgang; Peters-Anders, Jan; Züger, Johann

    2010-05-01

    To achieve public awareness and a thorough understanding of expected climate changes and their future implications, ways have to be found to communicate model outputs to the public in a scientifically sound and easily understandable way. The newly developed Climate Twins tool tries to fulfil these requirements via an intuitively usable web application, which compares spatial patterns of current climate with future climate patterns derived from regional climate model results. To get a picture of the implications of future climate in an area of interest, users may click on a certain location within an interactive map with underlying future climate information. A second map depicts the matching Climate Twin areas according to current climate conditions. In this way scientific output can be communicated to the public, allowing climate change to be experienced through comparison with well-known real-world conditions. Identifying climatic coincidence seems a simple exercise, but the accuracy and applicability of the similarity identification depend very much on the selection of climate indicators, similarity conditions and uncertainty ranges. Too many indicators representing various climate characteristics and too narrow uncertainty ranges will identify little or no area as having similar climate, while too few indicators and too wide uncertainty ranges will flag overly large regions as climatically similar, which may not be correct. Similarity cannot be explored just by comparing mean values or by calculating correlation coefficients. As climate change triggers an alteration of various indicators, like maxima, minima, variation magnitude, frequency of extreme events etc., the identification of appropriate similarity conditions is a crucial question to be solved.
For Climate Twins identification, it is necessary to find the right balance of indicators, similarity conditions and uncertainty ranges; otherwise the results will be too vague to support a useful search for Climate Twin regions. The Climate Twins tool currently compares future climate conditions of a source area in the Greater Alpine Region with current climate conditions of entire Europe and the neighbouring southern and south-eastern areas as target regions. A next version will integrate web-crawling features to search for information about climate-related local adaptations observed today in the target regions, which may turn out to be appropriate solutions for the source region under future climate conditions. The contribution will present the current tool functionality and discuss which indicator sets, similarity conditions and uncertainty ranges work best to deliver scientifically sound climate comparisons and distinct mapping results.
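    The trade-off between indicator count and uncertainty ranges described in this record can be illustrated with a toy matching rule: a target cell is a "twin" of the source if every indicator lies within a tolerance band around the source value. All indicator names, values and tolerances below are invented for illustration; this is not the actual tool's indicator set.

```python
# Toy sketch of the Climate Twins matching idea: a target cell counts as a
# "twin" if every climate indicator falls inside an uncertainty band around
# the source value. Indicators and tolerances are hypothetical.

def is_climate_twin(source, target, tolerances):
    """source/target: dicts indicator -> value; tolerances: indicator -> half-width."""
    return all(abs(source[k] - target[k]) <= tolerances[k] for k in source)

source = {"t_mean": 14.2, "t_max": 29.0, "precip_annual": 650.0}
cells = [
    {"t_mean": 14.5, "t_max": 28.4, "precip_annual": 700.0},  # close match
    {"t_mean": 18.9, "t_max": 33.1, "precip_annual": 400.0},  # warmer and drier
]

narrow = {"t_mean": 0.5, "t_max": 1.0, "precip_annual": 50.0}
wide   = {"t_mean": 5.0, "t_max": 5.0, "precip_annual": 300.0}

# Narrow ranges admit only the first cell; wide ranges admit both,
# illustrating how the tolerance choice controls the matched area.
print([is_climate_twin(source, c, narrow) for c in cells])  # [True, False]
print([is_climate_twin(source, c, wide) for c in cells])    # [True, True]
```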

  19. Life-cycle assessment of municipal solid waste management alternatives with consideration of uncertainty: SIWMS development and application.

    PubMed

    Hanandeh, Ali El; El-Zein, Abbas

    2010-05-01

    This paper describes the development and application of the Stochastic Integrated Waste Management Simulator (SIWMS) model. SIWMS provides a detailed view of the environmental impacts and associated costs of municipal solid waste (MSW) management alternatives under conditions of uncertainty. The model follows a life-cycle inventory approach extended with compensatory systems to provide more equitable bases for comparing different alternatives. Economic performance is measured by the net present value. The model is verified against four publicly available models under deterministic conditions and then used to study the impact of uncertainty on Sydney's MSW management 'best practices'. Uncertainty has a significant effect on all impact categories. The greatest effect is observed in the global warming category where a reversal of impact direction is predicted. The reliability of the system is most sensitive to uncertainties in the waste processing and disposal. The results highlight the importance of incorporating uncertainty at all stages to better understand the behaviour of the MSW system. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  20. Uncertainty Quantification in Multi-Scale Coronary Simulations Using Multi-resolution Expansion

    NASA Astrophysics Data System (ADS)

    Tran, Justin; Schiavazzi, Daniele; Ramachandra, Abhay; Kahn, Andrew; Marsden, Alison

    2016-11-01

    Computational simulations of coronary flow can provide non-invasive information on hemodynamics that can aid in surgical planning and research on disease propagation. In this study, patient-specific geometries of the aorta and coronary arteries are constructed from CT imaging data and finite element flow simulations are carried out using the open source software SimVascular. Lumped parameter networks (LPN), consisting of circuit representations of vascular hemodynamics and coronary physiology, are used as coupled boundary conditions for the solver. The outputs of these simulations depend on a set of clinically-derived input parameters that define the geometry and boundary conditions; however, their values are subject to uncertainty. We quantify the effects of uncertainty from two sources: uncertainty in the material properties of the vessel wall and uncertainty in the lumped parameter models whose values are estimated by assimilating patient-specific clinical and literature data. We use a generalized multi-resolution chaos approach to propagate the uncertainty. The advantages of this approach lie in its ability to support inputs sampled from arbitrary distributions and its built-in adaptivity that efficiently approximates stochastic responses characterized by steep gradients.

  1. Dealing with Uncertainties in Initial Orbit Determination

    NASA Technical Reports Server (NTRS)

    Armellin, Roberto; Di Lizia, Pierluigi; Zanetti, Renato

    2015-01-01

    A method to deal with uncertainties in initial orbit determination (IOD) is presented. This is based on the use of Taylor differential algebra (DA) to nonlinearly map the observation uncertainties from the observation space to the state space. When a minimum set of observations is available, DA is used to expand the solution of the IOD problem in Taylor series with respect to measurement errors. When more observations are available, high order inversion tools are exploited to obtain full state pseudo-observations at a common epoch. The mean and covariance of these pseudo-observations are nonlinearly computed by evaluating the expectation of high order Taylor polynomials. Finally, a linear scheme is employed to update the current knowledge of the orbit. Angles-only observations are considered, and simplified Keplerian dynamics are adopted to ease the explanation. Three test cases of orbit determination of artificial satellites in different orbital regimes are presented to discuss the features and performance of the proposed methodology.
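    The core idea of evaluating the expectation of a Taylor polynomial in the measurement error can be illustrated in one dimension. This is a hedged sketch, not the paper's DA implementation: for a state expanded to second order in a Gaussian error d ~ N(0, s²), the mean and variance follow from Gaussian moments (E[d²] = s², E[d⁴] = 3s⁴), and a Monte Carlo check confirms them. The coefficients are arbitrary.

```python
import numpy as np

# 1-D illustration of Taylor-series uncertainty mapping: if the IOD
# solution is a polynomial in the measurement error d ~ N(0, s^2), its
# mean and variance follow from Gaussian moments rather than from
# linearization alone. Coefficients are invented for illustration.
x0, a1, a2, s = 7000.0, 3.0, 0.8, 0.5

def x(d):
    return x0 + a1 * d + a2 * d**2       # 2nd-order Taylor expansion

# Analytical moments using E[d]=0, E[d^2]=s^2, E[d^3]=0, E[d^4]=3 s^4
mean_taylor = x0 + a2 * s**2
var_taylor = a1**2 * s**2 + 2.0 * a2**2 * s**4

# Monte Carlo check of the polynomial moment evaluation
rng = np.random.default_rng(0)
samples = x(rng.normal(0.0, s, 200_000))
print(mean_taylor, samples.mean())   # the two means agree closely
print(var_taylor, samples.var())     # the two variances agree closely
```

Note the second-order term shifts the mean by a2·s², a bias a purely linear mapping would miss; this is the kind of nonlinear effect the DA expansion captures.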

  2. Robust stability of fractional order polynomials with complicated uncertainty structure

    PubMed Central

    Şenol, Bilal; Pekař, Libor

    2017-01-01

    The main aim of this article is to present a graphical approach to robust stability analysis for families of fractional order (quasi-)polynomials with complicated uncertainty structure. More specifically, the work emphasizes the multilinear, polynomial and general structures of uncertainty and, moreover, the retarded quasi-polynomials with parametric uncertainty are studied. Since the families with these complex uncertainty structures suffer from the lack of analytical tools, their robust stability is investigated by numerical calculation and depiction of the value sets and subsequent application of the zero exclusion condition. PMID:28662173
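    The numerical value-set approach described above can be sketched for a simple invented family with multilinear uncertainty structure: sample the uncertain parameters, evaluate the value set p(jω, q) on a frequency grid, and check that the origin is excluded. This is a grid-sampling approximation for illustration, not the article's exact procedure.

```python
import numpy as np

# Sketch of the zero-exclusion test: sample the uncertain parameters,
# evaluate the value set p(jw, q) on a frequency grid, and verify the
# origin stays outside it. The family below (multilinear in q1, q2) is
# invented for illustration.
def value_set(w, q1_range, q2_range, n=40):
    q1 = np.linspace(*q1_range, n)
    q2 = np.linspace(*q2_range, n)
    Q1, Q2 = np.meshgrid(q1, q2)
    s = 1j * w
    # multilinear dependence on (q1, q2)
    return s**2 + (Q1 + Q2) * s + (1.0 + Q1 * Q2)

def zero_excluded(freqs, q1_range, q2_range, tol=1e-6):
    return all(np.min(np.abs(value_set(w, q1_range, q2_range))) > tol
               for w in freqs)

freqs = np.linspace(0.01, 10.0, 400)
# One member (q1 = q2 = 1) is stable: s^2 + 2s + 2 has roots -1 +/- j,
# so zero exclusion over the grid indicates robust stability.
print(zero_excluded(freqs, (1.0, 2.0), (1.0, 3.0)))  # True
```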

  3. Results of the Greenland ice sheet model initialisation experiments: ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

    Goelzer, Heiko; Nowicki, Sophie; Edwards, Tamsin; Beckley, Matthew

    2017-04-01

    Ice sheet model initialisation has a large effect on projected future sea-level contributions and gives rise to important uncertainties. The goal of this intercomparison exercise for the continental-scale Greenland ice sheet is therefore to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community. The initMIP-Greenland project is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). The experimental set-up has been designed to allow comparison of the initial present-day state of the Greenland ice sheet between participating models and against observations. Furthermore, the initial states are tested with two schematic forward experiments to evaluate the initialisation in terms of model drift (forward run without any forcing) and response to a large perturbation (prescribed surface mass balance anomaly). We present and discuss results that highlight the wide diversity of data sets, boundary conditions and initialisation techniques used in the community to generate initial states of the Greenland ice sheet.

  4. ELUCID - Exploring the Local Universe with ReConstructed Initial Density Field III: Constrained Simulation in the SDSS Volume

    NASA Astrophysics Data System (ADS)

    Wang, Huiyuan; Mo, H. J.; Yang, Xiaohu; Zhang, Youcai; Shi, JingJing; Jing, Y. P.; Liu, Chengze; Li, Shijie; Kang, Xi; Gao, Yang

    2016-11-01

    A method we developed recently for the reconstruction of the initial density field in the nearby universe is applied to the Sloan Digital Sky Survey Data Release 7. A high-resolution N-body constrained simulation (CS) of the reconstructed initial conditions, with 3072³ particles evolved in a 500 h⁻¹ Mpc box, is carried out and analyzed in terms of the statistical properties of the final density field and its relation with the distribution of Sloan Digital Sky Survey galaxies. We find that the statistical properties of the cosmic web and the halo populations are accurately reproduced in the CS. The galaxy density field is strongly correlated with the CS density field, with a bias that depends on both galaxy luminosity and color. Our further investigations show that the CS provides robust quantities describing the environments within which the observed galaxies and galaxy systems reside. Cosmic variance is greatly reduced in the CS so that the statistical uncertainties can be controlled effectively, even for samples of small volumes.

  5. Instrumental record of debris flow initiation during natural rainfall: Implications for modeling slope stability

    USGS Publications Warehouse

    Montgomery, D.R.; Schmidt, K.M.; Dietrich, W.E.; McKean, J.

    2009-01-01

    The middle of a hillslope hollow in the Oregon Coast Range failed and mobilized as a debris flow during heavy rainfall in November 1996. Automated pressure transducers recorded high spatial variability of pore water pressure within the area that mobilized as a debris flow, which initiated where local upward flow from bedrock developed into overlying colluvium. Postfailure observations of the bedrock surface exposed in the debris flow scar reveal a strong spatial correspondence between elevated piezometric response and water discharging from bedrock fractures. Measurements of apparent root cohesion on the basal (Cb) and lateral (Cl) scarp demonstrate substantial local variability, with areally weighted values of Cb = 0.1 and Cl = 4.6 kPa. Using measured soil properties and basal root strength, the widely used infinite slope model, employed assuming slope-parallel groundwater flow, provides a poor prediction of hydrologic conditions at failure. In contrast, a model including lateral root strength (but neglecting lateral frictional strength) gave a predicted critical value of relative soil saturation that fell within the range defined by the arithmetic and geometric mean values at the time of failure. The 3-D slope stability model CLARA-W, used with locally observed pore water pressure, predicted small areas with lower factors of safety within the overall slide mass at sites consistent with field observations of where the failure initiated. This highly variable and localized nature of small areas of high pore pressure that can trigger slope failure means, however, that substantial uncertainty appears inevitable for estimating hydrologic conditions within incipient debris flows under natural conditions. Copyright 2009 by the American Geophysical Union.
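    The infinite slope model referenced in this record reduces to a short factor-of-safety calculation. The sketch below uses the standard slope-parallel-seepage form with an added root cohesion term; all parameter values are illustrative, not the study's measurements.

```python
import math

# Infinite slope stability with slope-parallel groundwater flow and a
# root cohesion term. FS < 1 indicates predicted failure. Values are
# illustrative only.
def factor_of_safety(c_root, c_soil, gamma_s, gamma_w, z, h, theta_deg, phi_deg):
    """c_*: cohesion [kPa]; gamma_*: unit weights [kN/m^3];
    z: soil depth [m]; h: water table height above the failure plane [m];
    theta: slope angle [deg]; phi: friction angle [deg]."""
    theta = math.radians(theta_deg)
    phi = math.radians(phi_deg)
    resisting = (c_root + c_soil
                 + (gamma_s * z - gamma_w * h) * math.cos(theta)**2 * math.tan(phi))
    driving = gamma_s * z * math.sin(theta) * math.cos(theta)
    return resisting / driving

# Dry colluvium on a steep hollow: FS comfortably above 1
print(factor_of_safety(4.6, 0.1, 17.0, 9.81, 1.0, 0.0, 40.0, 38.0))
# Nearly saturated: pore pressure drives FS down toward 1 (failure)
print(factor_of_safety(4.6, 0.1, 17.0, 9.81, 1.0, 0.9, 40.0, 38.0))
```

The comparison of the two calls shows why pore pressure variability matters: the same slope moves from stable to marginal as the water table rises.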

  6. H I versus H α - comparing the kinematic tracers in modelling the initial conditions of the Mice

    NASA Astrophysics Data System (ADS)

    Mortazavi, S. Alireza; Lotz, Jennifer M.; Barnes, Joshua E.; Privon, George C.; Snyder, Gregory F.

    2018-03-01

    We explore the effect of using different kinematic tracers (H I and H α) on reconstructing the encounter parameters of the Mice major galaxy merger (NGC 4676A/B). We observed the Mice using the SparsePak Integral Field Unit (IFU) on the WIYN telescope, and compared the H α velocity map with VLA H I observations. The relatively high spectral resolution of our data (R ≈ 5000) allows us to resolve more than one kinematic component in the emission lines of some fibres. We separate the H α-[N II] emission of the star-forming regions from shocks using their [N II]/H α line ratio and velocity dispersion. We show that the velocity of star-forming regions agree with that of the cold gas (H I), particularly, in the tidal tails of the system. We reconstruct the morphology and kinematics of these tidal tails utilizing an automated modelling method based on the IDENTIKIT software package. We quantify the goodness of fit and the uncertainties of the derived encounter parameters. Most of the initial conditions reconstructed using H α and H I are consistent with each other, and qualitatively agree with the results of previous works. For example, we find 210 (+50/-40) Myr and 180 (+50/-40) Myr for the time since pericentre, when modelling H α and H I kinematics, respectively. This confirms that in some cases, H α kinematics can be used instead of H I kinematics for reconstructing the initial conditions of galaxy mergers, and our automated modelling method is applicable to some merging systems.

  7. Communicating uncertainty in circulation aspects of climate change

    NASA Astrophysics Data System (ADS)

    Shepherd, Ted

    2017-04-01

    The usual way of representing uncertainty in climate change is to define a likelihood range of possible futures, conditioned on a particular pathway of greenhouse gas concentrations (RCPs). Typically these likelihood ranges are derived from multi-model ensembles. However, there is no obvious basis for treating such ensembles as probability distributions. Moreover, for aspects of climate related to atmospheric circulation, such an approach generally leads to large uncertainty and low confidence in projections. Yet this does not mean that the associated climate risks are small. We therefore need to develop suitable ways of communicating climate risk whilst acknowledging the uncertainties. This talk will outline an approach based on conditioning the purely thermodynamic aspects of climate change, concerning which there is comparatively high confidence, on circulation-related aspects, and treating the latter through non-probabilistic storylines.

  8. The visualization of spatial uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srivastava, R.M.

    1994-12-31

    Geostatistical conditional simulation is gaining acceptance as a numerical modeling tool in the petroleum industry. Unfortunately, many of the new users of conditional simulation work with only one outcome or "realization" and ignore the many other outcomes that could be produced by their conditional simulation tools. 3-D visualization tools allow them to present very realistic images of this single outcome as reality. There are many methods currently available for presenting the uncertainty information from a family of possible outcomes; most of these, however, use static displays and many present uncertainty in a format that is not intuitive. This paper explores the visualization of uncertainty through dynamic displays that exploit the intuitive link between uncertainty and change by presenting the user with a constantly evolving model. The key technical challenge to such a dynamic presentation is the ability to create numerical models that honor the available well data and geophysical information and yet are incrementally different, so that successive frames can be viewed rapidly as an animated cartoon. An example of volumetric uncertainty from a Gulf Coast reservoir is used to demonstrate that such animation is possible and to show that such dynamic displays can be an effective tool in risk analysis for the petroleum industry.

  9. Polarization Angle Calibration and B-Mode Characterization with the BICEP and Keck Array CMB Telescopes

    NASA Astrophysics Data System (ADS)

    Bullock, Eric

    Since its discovery in 1964, the Cosmic Microwave Background (CMB) has led to widespread acceptance of the Big Bang cosmological paradigm as an explanation for the evolution of the Universe. However, this paradigm does not explain the origin of the initial conditions, leading to such issues as the "horizon problem" and "flatness problem." In the early 1980's, the inflationary paradigm was introduced as a possible source for the initial conditions. This theory postulates that the Universe underwent a period of exponential expansion within a tiny fraction of a second after the beginning. Such an expansion is predicted to inject a stochastic background of gravitational waves that could imprint a detectable B-mode (curl-like) signal in the polarization of the CMB. It is this signal that the family of telescopes used by the BICEP1, BICEP2, and Keck Array collaborations were designed to detect. These telescopes are small aperture, on-axis, refracting telescopes. We have used the data from these telescopes, particularly BICEP2 and the Keck Array, to place the tightest constraints, as of March 2016, on the tensor-to-scalar ratio of the CMB of r_0.05 < 0.07. In this dissertation, we provide an overview of the Keck Array telescopes and analysis of the data. We also investigate, as the main focus of this dissertation, a device we call the Dielectric Sheet Calibrator (DSC) that is used to measure the polarization angles of our detectors as projected on the sky. With these measurements, we gain the potential to separate the polarization rotation effects of parity-violating physics, such as cosmic birefringence, from a systematic uncertainty on our detectors' polarization angles. Current calibration techniques for polarization sensitive CMB detectors claim an accuracy of +/-0.5°, which sets a limit for determining the usefulness of the DSC. 
Through a series of consistency tests on a single Keck Array receiver, we demonstrate a statistical uncertainty on the DSC measurements of +/-0.03° and estimate a systematic uncertainty of +/-0.2°, which meets the minimum goal. We also conclude that there is no conflict between the DSC-derived polarization angles of this single receiver and the rotation derived from that receiver's CMB data under the hypothesis of no cosmic birefringence.

  10. Transient flow conditions change how we should think about WHPA delineation: a joint frequency and probability analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez Pretelin, Abelardo; Nowak, Wolfgang

    2017-04-01

    Wellhead protection areas (WHPAs) are frequently used as safety measures for drinking water wells, protecting them from pollution by restricting land-use activities in their proximity. Two sources of uncertainty are involved in delineation: 1) uncertainty in aquifer parameters and 2) time-varying groundwater flow scenarios and their own inherent uncertainties. The former has been studied by Enzenhoefer et al. (2012 [1], 2014 [2]) as a probabilistic risk version of WHPA delineation. The latter is frequently neglected and replaced by steady-state assumptions, thereby ignoring time-variant flow conditions triggered either by anthropogenic causes or by climatic conditions. In this study we analyze the influence of transient flow conditions on WHPA delineation, following an annual seasonal cycle, with transiency represented by four time-varying drivers: (I) regional groundwater flow direction, (II) strength of the regional hydraulic gradient, (III) natural recharge to the groundwater and (IV) pumping rate. Addressing WHPA delineation under transient flow scenarios is computationally expensive. Thus, we develop an efficient method using a dynamic superposition of steady-state flow solutions coupled with a reversed formulation of advective-dispersive transport based on Lagrangian particle tracking with continuous injection. This analysis results in a time-frequency map of pixel-wise membership in the well catchment. In addition to transient flow conditions, we recognize two sources of uncertainty: inexact knowledge of the transient drivers and of the aquifer parameters. These uncertainties are accommodated through Monte Carlo simulation. With the help of a global sensitivity analysis, we investigate the impact of transiency on WHPA solutions. In particular, we evaluate: (1) among all considered transient drivers, which are the most influential; and (2) how influential transience-related uncertainty is in WHPA delineation compared to aquifer parameter uncertainty. 
References: [1] R. Enzenhoefer, W. Nowak, and R. Helmig. Probabilistic exposure risk assessment with advective-dispersive well vulnerability criteria. Advances in Water Resources, 36:121-132, 2012. [2] R. Enzenhoefer, T. Bunk, and W. Nowak. Nine steps to risk-informed wellhead protection and management: a case study. Ground Water, 52:161-174, 2014.
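    The reversed advective-dispersive particle tracking described in this record can be caricatured in a few lines: particles released at the well are advected upstream through a time-varying flow field with random-walk dispersion, and the binned end positions give a pixel-wise frequency of catchment membership. All numbers and the flow field are invented; the study's actual superposition of steady-state solutions is not reproduced here.

```python
import numpy as np

# Minimal sketch of reversed (backward-in-time) advective-dispersive
# particle tracking under a seasonally rotating regional flow direction,
# yielding a pixel-wise catchment-membership frequency. Illustrative only.
rng = np.random.default_rng(1)
n_particles, n_steps, dt = 2000, 50, 1.0
v_mag, disp = 1.0, 0.15          # regional velocity magnitude, dispersion scale

# Seasonal cycle of the flow direction (one of the transient drivers)
angles = np.deg2rad(20.0 * np.sin(2 * np.pi * np.arange(n_steps) / n_steps))

pos = np.zeros((n_particles, 2))            # particles start at the well (0, 0)
for a in angles:
    v = v_mag * np.array([np.cos(a), np.sin(a)])
    # reversed advection (upstream) plus random-walk dispersion
    pos += -v * dt + disp * np.sqrt(dt) * rng.standard_normal((n_particles, 2))

# Pixel-wise catchment-membership frequency on a coarse grid
H, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                         bins=20, range=[[-60, 0], [-15, 15]])
freq = H / n_particles
print(freq.sum())   # fraction of particles inside the mapped window
```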

  11. Observations and simulations of the western United States' hydroclimate

    NASA Astrophysics Data System (ADS)

    Guirguis, Kristen

    While very important from an economic and societal point of view, estimating precipitation in the western United States remains an unsolved and challenging problem. This is due to difficulties in observing and modeling precipitation in complex terrain. This research examines this issue by (i) providing a systematic evaluation of precipitation observations to quantify data uncertainty; and (ii) investigating the ability of the Ocean-Land-Atmosphere Model (OLAM) to simulate the winter hydroclimate in this region. This state-of-the-art, non-hydrostatic model has the capability of simulating all scales of motion simultaneously at various resolutions. This research intercompares nine precipitation datasets commonly used in hydrometeorological research in two ways. First, using principal component analysis, a precipitation climatology is constructed for the western U.S. from which five unique precipitation climates are identified. From this analysis, data uncertainty is shown to be primarily due to differences in (i) precipitation over the Rocky Mountains, (ii) the eastward wet-to-dry precipitation gradient during the cold season, (iii) the North American Monsoon signal, and (iv) precipitation in the desert southwest during spring and summer. The second intercomparison uses these five precipitation regions to provide location-specific assessments of uncertainty, which is shown to depend on season and location. Long-range weather forecasts on the order of a season are important for water-scarce regions such as the western U.S. The modeling component of this research looks at the ability of the OLAM to simulate the hydroclimate in the western U.S. during the winter of 1999. Six global simulations are run, each with a different spatial resolution over the western U.S. (360 km down to 11 km). For this study, OLAM is configured as for a long-range seasonal hindcast but with observed sea surface temperatures. 
OLAM precipitation compares well against observations, and is generally within the range of data uncertainty. Observed and simulated synoptic meteorological conditions are examined during the wettest and driest events. OLAM is shown to reproduce the appropriate anomaly fields, which is encouraging since it demonstrates the capability of a global climate model, driven only by SSTs and initial conditions, to represent meteorological features associated with daily precipitation variability.

  12. Hydrologic drought prediction under climate change: Uncertainty modeling with Dempster-Shafer and Bayesian approaches

    NASA Astrophysics Data System (ADS)

    Raje, Deepashree; Mujumdar, P. P.

    2010-09-01

    Representation and quantification of uncertainty in climate change impact studies are a difficult task. Several sources of uncertainty arise in studies of hydrologic impacts of climate change, such as those due to choice of general circulation models (GCMs), scenarios and downscaling methods. Recently, much work has focused on uncertainty quantification and modeling in regional climate change impacts. In this paper, an uncertainty modeling framework is evaluated, which uses a generalized uncertainty measure to combine GCM, scenario and downscaling uncertainties. The Dempster-Shafer (D-S) evidence theory is used for representing and combining uncertainty from various sources. A significant advantage of the D-S framework over the traditional probabilistic approach is that it allows for the allocation of a probability mass to sets or intervals, and can hence handle both aleatory or stochastic uncertainty, and epistemic or subjective uncertainty. This paper shows how the D-S theory can be used to represent beliefs in some hypotheses such as hydrologic drought or wet conditions, describe uncertainty and ignorance in the system, and give a quantitative measurement of belief and plausibility in results. The D-S approach has been used in this work for information synthesis using various evidence combination rules having different conflict modeling approaches. A case study is presented for hydrologic drought prediction using downscaled streamflow in the Mahanadi River at Hirakud in Orissa, India. Projections of n most likely monsoon streamflow sequences are obtained from a conditional random field (CRF) downscaling model, using an ensemble of three GCMs for three scenarios, which are converted to monsoon standardized streamflow index (SSFI-4) series. This range is used to specify the basic probability assignment (bpa) for a Dempster-Shafer structure, which represents uncertainty associated with each of the SSFI-4 classifications. 
These uncertainties are then combined across GCMs and scenarios using various evidence combination rules given by the D-S theory. A Bayesian approach is also presented for this case study, which models the uncertainty in projected frequencies of SSFI-4 classifications by deriving a posterior distribution for the frequency of each classification, using an ensemble of GCMs and scenarios. Results from the D-S and Bayesian approaches are compared, and relative merits of each approach are discussed. Both approaches show an increasing probability of extreme, severe and moderate droughts and decreasing probability of normal and wet conditions in Orissa as a result of climate change.
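    Dempster's rule, the basic evidence-combination rule of the D-S theory applied in this record, is compact enough to sketch directly: masses on sets of hypotheses are multiplied pairwise, intersecting sets accumulate mass, and conflicting (disjoint) pairs are discarded and renormalized. The masses over the drought classes below are invented, not the study's basic probability assignments.

```python
from itertools import product

# Dempster's rule of combination for two basic probability assignments
# (bpa) over sets of hypotheses, e.g. streamflow classes from two GCMs.
# Masses are illustrative only.
def dempster_combine(m1, m2):
    """m1, m2: dicts mapping frozenset of hypotheses -> mass (summing to 1)."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass assigned to disjoint sets
    k = 1.0 - conflict                   # normalization constant
    return {s: w / k for s, w in combined.items()}, conflict

D, N, W = "drought", "normal", "wet"
m_gcm1 = {frozenset({D}): 0.6, frozenset({D, N}): 0.3, frozenset({D, N, W}): 0.1}
m_gcm2 = {frozenset({D}): 0.5, frozenset({N, W}): 0.2, frozenset({D, N, W}): 0.3}

m, conflict = dempster_combine(m_gcm1, m_gcm2)
print(round(m[frozenset({D})], 3), round(conflict, 3))  # 0.773 0.12
```

Note how mass can sit on the set {D, N} rather than on a single class, which is exactly the allocation to intervals that distinguishes the D-S framework from a purely probabilistic one.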

  13. Assessing effects of variation in global climate data sets on spatial predictions from climate envelope models

    USGS Publications Warehouse

    Romañach, Stephanie; Watling, James I.; Fletcher, Robert J.; Speroterra, Carolina; Bucklin, David N.; Brandt, Laura A.; Pearlstine, Leonard G.; Escribano, Yesenia; Mazzotti, Frank J.

    2014-01-01

    Climate change poses new challenges for natural resource managers. Predictive modeling of species–environment relationships using climate envelope models can enhance our understanding of climate change effects on biodiversity, assist in assessment of invasion risk by exotic organisms, and inform life-history understanding of individual species. While increasing interest has focused on the role of uncertainty in future conditions on model predictions, models also may be sensitive to the initial conditions on which they are trained. Although climate envelope models are usually trained using data on contemporary climate, we lack systematic comparisons of model performance and predictions across alternative climate data sets available for model training. Here, we seek to fill that gap by comparing variability in predictions between two contemporary climate data sets to variability in spatial predictions among three alternative projections of future climate. Overall, correlations between monthly temperature and precipitation variables were very high for both contemporary and future data. Model performance varied across algorithms, but not between two alternative contemporary climate data sets. Spatial predictions varied more among alternative general-circulation models describing future climate conditions than between contemporary climate data sets. However, we did find that climate envelope models with low Cohen's kappa scores made more discrepant spatial predictions between climate data sets for the contemporary period than did models with high Cohen's kappa scores. We suggest conservation planners evaluate multiple performance metrics and be aware of the importance of differences in initial conditions for spatial predictions from climate envelope models.
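    Cohen's kappa, the agreement score this record uses to flag models with discrepant spatial predictions, is a one-liner over a confusion matrix of presence/absence predictions. The matrix below is invented for illustration.

```python
import numpy as np

# Cohen's kappa: agreement between two classifications corrected for
# chance agreement. The confusion matrix is illustrative.
def cohens_kappa(confusion):
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    p_o = np.trace(c) / n                                  # observed agreement
    p_e = (c.sum(axis=0) * c.sum(axis=1)).sum() / n**2     # chance agreement
    return (p_o - p_e) / (1.0 - p_e)

# e.g. presence/absence predictions from two contemporary climate data sets
conf = [[40, 10],
        [ 5, 45]]
print(round(cohens_kappa(conf), 3))  # 0.7
```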

  14. Improvement of the ephemerides of Phoebe, 9th satellite of Saturn, from new observations made from 1995 to 2000

    NASA Astrophysics Data System (ADS)

    Arlot, J.-E.; Bec-Borsenberger, A.; Fienga, A.; Baron, N.

    2003-11-01

    In order to improve the model used for the ephemerides of Phoebe, the 9th satellite of Saturn, we started observations in 1998. We made 135 observations in 1998 and 39 observations in 1999 using the 120 cm telescope of the Observatoire de Haute-Provence, France. We used a numerical integration in order to calculate new initial conditions and to be able to build new ephemerides. We also used some precise observations made from 1995 to 2000 together with old observations for that purpose. The result is a decrease in the uncertainties on Phoebe's orbit. Based in part on observations made at the Observatoire de Haute-Provence (CNRS), France.

  15. Advances and Challenges In Uncertainty Quantification with Application to Climate Prediction, ICF design and Science Stockpile Stewardship

    NASA Astrophysics Data System (ADS)

    Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.

    2012-12-01

    Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification, focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) the use of laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. 
To make advancements in several of these UQ grand challenges, I will focus this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).

  16. Extracting fuzzy rules under uncertainty and measuring definability using rough sets

    NASA Technical Reports Server (NTRS)

    Culas, Donald E.

    1991-01-01

    Although computers have come a long way since their invention, they are basically able to handle only crisp values at the hardware level. Unfortunately, the world we live in consists of problems which fail to fall into this category: uncertainty is all too common. Here, a problem involving uncertainty is examined; specifically, the attributes dealt with are fuzzy sets. Under this condition, knowledge is acquired by looking at examples. In each example, a condition as well as a decision is made available. Based on the examples given, two sets of rules are extracted: certain and possible. Furthermore, measures are constructed of how much these rules are believed in, and finally, the decisions are defined as a function of the terms used in the conditions.
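    The split into certain and possible rules follows the rough-set notions of lower and upper approximation: examples indiscernible by their condition attributes form blocks; a decision's lower approximation (certain rules) contains only blocks lying entirely inside it, while its upper approximation (possible rules) contains any block that touches it. The toy examples below are invented; the crisp-attribute version is shown for simplicity, whereas the thesis works with fuzzy attributes.

```python
# Rough-set lower/upper approximations behind "certain" and "possible"
# rules, on invented crisp examples of (condition attributes, decision).
examples = [
    ({"temp": "high", "humidity": "low"},  "fail"),
    ({"temp": "high", "humidity": "low"},  "ok"),    # conflicts with the above
    ({"temp": "low",  "humidity": "low"},  "ok"),
    ({"temp": "low",  "humidity": "high"}, "fail"),
]

def approximations(examples, decision):
    blocks = {}
    for cond, dec in examples:
        blocks.setdefault(tuple(sorted(cond.items())), []).append(dec)
    # lower approximation: blocks whose examples ALL carry the decision
    lower = [b for b, decs in blocks.items() if all(d == decision for d in decs)]
    # upper approximation: blocks where ANY example carries the decision
    upper = [b for b, decs in blocks.items() if any(d == decision for d in decs)]
    return lower, upper

lower, upper = approximations(examples, "fail")
print(len(lower), len(upper))  # 1 certain block, 2 possible blocks
```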

  17. Uncertainty forecasts improve weather-related decisions and attenuate the effects of forecast error.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2012-03-01

    Although uncertainty is inherent in weather forecasts, explicit numeric uncertainty estimates are rarely included in public forecasts for fear that they will be misunderstood. Of particular concern are situations in which precautionary action is required at low probabilities, often the case with severe events. At present, a categorical weather warning system is used. The work reported here tested the relative benefits of several forecast formats, comparing decisions made with and without uncertainty forecasts. In three experiments, participants assumed the role of a manager of a road maintenance company in charge of deciding whether to pay to salt the roads and avoid a potential penalty associated with icy conditions. Participants used overnight low temperature forecasts accompanied in some conditions by uncertainty estimates and in others by decision advice comparable to categorical warnings. Results suggested that uncertainty information improved decision quality overall and increased trust in the forecast. Participants with uncertainty forecasts took appropriate precautionary action and withheld unnecessary action more often than did participants using deterministic forecasts. When error in the forecast increased, participants with conventional forecasts were reluctant to act. However, this effect was attenuated by uncertainty forecasts. Providing categorical decision advice alone did not improve decisions. However, combining decision advice with uncertainty estimates resulted in the best performance overall. The results reported here have important implications for the development of forecast formats to increase compliance with severe weather warnings as well as other domains in which one must act in the face of uncertainty. PsycINFO Database Record (c) 2012 APA, all rights reserved.
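The road-salting task in these experiments is essentially a cost-loss decision. A minimal sketch (all costs and probabilities below are hypothetical, not values from the study) shows the threshold reasoning that an explicit freeze probability enables and a deterministic forecast does not:

```python
# Cost-loss decision: salt the roads when the expected penalty from icing
# exceeds the cost of salting. All figures are hypothetical.
SALT_COST = 1000.0    # cost of salting the roads once
ICE_PENALTY = 6000.0  # penalty if roads ice over without salting

def should_salt(p_freeze: float) -> bool:
    """Act when the expected loss from inaction exceeds the cost of action."""
    return p_freeze * ICE_PENALTY > SALT_COST

# With the probability made explicit, the rational action threshold is
# cost/loss = 1000/6000, so even a modest freeze risk justifies salting.
for p in (0.05, 0.20, 0.60):
    print(f"P(freeze) = {p:.2f}: salt = {should_salt(p)}")
```

The cost/loss ratio sets the probability threshold, which is exactly the quantity a categorical warning hides from the decision maker.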

  18. Reducing Spatial Uncertainty Through Attentional Cueing Improves Contrast Sensitivity in Regions of the Visual Field With Glaucomatous Defects

    PubMed Central

    Phu, Jack; Kalloniatis, Michael; Khuu, Sieu K.

    2018-01-01

    Purpose Current clinical perimetric test paradigms present stimuli randomly to various locations across the visual field (VF), inherently introducing spatial uncertainty, which reduces contrast sensitivity. In the present study, we determined the extent to which spatial uncertainty affects contrast sensitivity in glaucoma patients by minimizing spatial uncertainty through attentional cueing. Methods Six patients with open-angle glaucoma and six healthy subjects underwent laboratory-based psychophysical testing to measure contrast sensitivity at preselected locations at two eccentricities (9.5° and 17.5°) with two stimulus sizes (Goldmann sizes III and V) under different cueing conditions: 1, 2, 4, or 8 points verbally cued. Method of Constant Stimuli and a single-interval forced-choice procedure were used to generate frequency of seeing (FOS) curves at locations with and without VF defects. Results At locations with VF defects, cueing minimizes spatial uncertainty and improves sensitivity under all conditions. The effect of cueing was maximal when one point was cued, and rapidly diminished when more points were cued (no change to baseline with 8 points cued). The slope of the FOS curve steepened with reduced spatial uncertainty. Locations with normal sensitivity in glaucomatous eyes had similar performance to that of healthy subjects. There was a systematic increase in uncertainty with the depth of VF loss. Conclusions Sensitivity measurements across the VF are negatively affected by spatial uncertainty, which increases with greater VF loss. Minimizing uncertainty can improve sensitivity at locations of deficit. Translational Relevance Current perimetric techniques introduce spatial uncertainty and may therefore underestimate sensitivity in regions of VF loss. PMID:29600116
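A frequency-of-seeing curve is a psychometric function relating detection probability to stimulus contrast. A sketch (simulated responses and illustrative parameters, not the study's data) of estimating its threshold and slope by grid-search maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)

def fos(contrast, threshold, slope):
    """Logistic frequency-of-seeing curve: P(seen) vs stimulus contrast."""
    return 1.0 / (1.0 + np.exp(-slope * (contrast - threshold)))

# Simulate a Method of Constant Stimuli run: fixed contrast levels,
# 40 trials each; true threshold 0.5 and slope 10 (arbitrary units).
levels = np.linspace(0.1, 0.9, 9)
n_trials = 40
seen = rng.binomial(n_trials, fos(levels, 0.5, 10.0))

# Grid-search maximum likelihood over (threshold, slope).
best, best_ll = None, -np.inf
for th in np.linspace(0.2, 0.8, 61):
    for sl in np.linspace(2.0, 20.0, 37):
        p = np.clip(fos(levels, th, sl), 1e-9, 1 - 1e-9)
        ll = np.sum(seen * np.log(p) + (n_trials - seen) * np.log(1 - p))
        if ll > best_ll:
            best, best_ll = (th, sl), ll

print("estimated (threshold, slope):", best)
```

A steeper fitted slope corresponds to the steepened FOS curve the authors report under reduced spatial uncertainty.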

  19. Price-cap Regulation, Uncertainty and the Price Evolution of New Pharmaceuticals.

    PubMed

    Shajarizadeh, Ali; Hollis, Aidan

    2015-08-01

    This paper examines the effect of the regulations restricting price increases on the evolution of pharmaceutical prices. A novel theoretical model shows that this policy leads firms to price new drugs with uncertain demand above the expected value initially. Price decreases after drug launch are more likely, the higher the uncertainty. We empirically test the model's predictions using data from the Canadian pharmaceutical market. The level of uncertainty is shown to play a crucial role in drug pricing strategies. © 2014 The Authors. Health Economics Published by John Wiley & Sons Ltd.

  20. Educating Amid Uncertainty: The Organizational Supports Teachers Need to Serve Students in High-Poverty, Urban Schools

    ERIC Educational Resources Information Center

    Kraft, Matthew A.; Papay, John P.; Johnson, Susan Moore; Charner-Laird, Megin; Ng, Monica; Reinhorn, Stefanie

    2015-01-01

    Purpose: We examine how uncertainty, both about students and the context in which they are taught, remains a persistent condition of teachers' work in high-poverty, urban schools. We describe six schools' organizational responses to these uncertainties, analyze how these responses reflect open- versus closed-system approaches, and examine how this…

  1. The Vita Activa as Compass: Navigating Uncertainty in Teaching with Hannah Arendt

    ERIC Educational Resources Information Center

    Rogers, Carrie Ann Barnes

    2010-01-01

    This dissertation is an exploration of stories of uncertainty in the lives of elementary teachers and the value that the ideas of Hannah Arendt lend to the discussion around uncertainty. In "The Human Condition" (1958) Hannah Arendt theorizes the life of action, the "vita activa". Arendtian action is inherently uncertain because to be "capable of…

  2. Current net ecosystem exchange of CO2 in a young mixed forest: any heritage from the previous ecosystem?

    NASA Astrophysics Data System (ADS)

    Violette, Aurélie; Heinesch, Bernard; Erpicum, Michel; Carnol, Monique; Aubinet, Marc; François, Louis

    2013-04-01

    For 15 years, networks of flux towers have been developed to determine accurate carbon balances with the eddy-covariance method and to determine whether forests are sinks or sources of carbon. However, for prediction of the evolution of the carbon cycle and climate, major uncertainties remain in the ecosystem respiration (Reco, which includes the respiration of the above-ground parts of trees, root respiration and mineralization of the soil organic matter), the gross primary productivity (GPP) and their difference, the net ecosystem exchange (NEE) of forests. These uncertainties are consequences of spatial and inter-annual variability, driven by previous and current climatic conditions, as well as by the particular history of the site (management, diseases, etc.). In this study we focus on the carbon cycle in two mixed forests in the Belgian Ardennes. The first site, Vielsalm, is a mature stand mostly composed of beech (Fagus sylvatica) and Douglas fir (Pseudotsuga menziesii) trees 80 to 100 years old. The second site, La Robinette, was covered before 1995 with spruces. After an important windfall and a clear cutting, the site was replanted, between 1995 and 2000, with spruces (Picea abies) and deciduous species (mostly Betula pendula, Alnus glutinosa and Salix aurita). The challenge here is to highlight how initial conditions can influence the current behavior of the carbon cycle in a growing stand compared to a mature one, where initial conditions are supposed to be forgotten. A modelling approach is particularly well suited to sensitivity tests and to estimating the temporal lag between an event and the ecosystem response. We use the forest ecosystem model ASPECTS (Rasse et al., Ecological Modelling 141, 35-52, 2001). This model predicts long-term forest growth by calculating, over time, hourly NEE. It was developed and already validated on the Vielsalm forest. Modelling results are compared with eddy-covariance data at both sites from 2006 to 2011.
The main difference between the two sites appears to lie in soil respiration, which is probably in part a legacy of the previous ecosystem at the young forest site.

  3. Development of robust building energy demand-side control strategy under uncertainty

    NASA Astrophysics Data System (ADS)

    Kim, Sean Hay

    The potential of carbon emission regulations applied to individual buildings will encourage building owners to purchase utility-provided green power or to employ onsite renewable energy generation. As both cases rely on intermittent renewable energy sources, demand-side control is a fundamental precondition for maximizing the effectiveness of using renewable energy. Such control reduces peak demand and/or energy demand variability, and this flattening of the demand profile eventually enhances the efficiency of an erratic supply of renewable energy. The combined operation of active thermal energy storage and passive building thermal mass has shown substantial improvement in demand-side control performance compared to current state-of-the-art demand-side control measures. Specifically, "model-based" optimal control of this operation has the potential to significantly increase performance and bring economic advantages. However, due to uncertainty in field operating conditions, its control effectiveness can be diminished or seriously degraded, resulting in poor performance. This dissertation pursues improvements of current demand-side controls under uncertainty by proposing a robust supervisory demand-side control strategy that is designed to be immune to uncertainty and to perform consistently under uncertain conditions. The uniqueness and superiority of the proposed robust demand-side controls are as follows: a. They are developed from fundamental studies of uncertainty and a systematic approach to uncertainty analysis. b. They reduce variability of performance under varied conditions, and thus avoid the worst-case scenario. c. They react to critical "discrepancies" caused by the unpredictable variability that scenario uncertainty typically imposes, and thus increase control efficiency.
This is achieved by means of (i) multi-source composition of weather forecasts, combining both historical archives and online sources, and (ii) adaptive multiple-model-based control (MMC) to mitigate the detrimental impacts of varying scenario uncertainties. The proposed robust demand-side control strategy demonstrates outstanding performance under varied and unfamiliar conditions compared with existing control strategies, including deterministic optimal control. This result re-emphasizes the importance of demand-side control for buildings in the global carbon economy. It also demonstrates the risk-management capability of the proposed robust demand-side controls in highly uncertain situations, which ultimately yields the maximum benefit from both theoretical and practical perspectives.

  4. Priority setting partnership to identify the top 10 research priorities for the management of Parkinson's disease.

    PubMed

    Deane, Katherine H O; Flaherty, Helen; Daley, David J; Pascoe, Roland; Penhale, Bridget; Clarke, Carl E; Sackley, Catherine; Storey, Stacey

    2014-12-14

    This priority setting partnership was commissioned by Parkinson's UK to encourage people with direct and personal experience of the condition to work together to identify and prioritise the top 10 evidential uncertainties that impact on everyday clinical practice for the management of Parkinson's disease (PD). The UK. Anyone with experience of PD including: people with Parkinson's (PwP), carers, family and friends, healthcare and social care professionals. Non-clinical researchers and employees of pharmaceutical or medical devices companies were excluded. 1000 participants (60% PwP) provided ideas on research uncertainties, 475 (72% PwP) initially prioritised them and 27 (37% PwP) stakeholders agreed a final top 10. Using a modified nominal group technique, participants were surveyed to identify what issues for the management of PD needed research. Unique research questions unanswered by current evidence were identified and participants were asked to identify their top 10 research priorities from this list. The top 26 uncertainties were presented to a consensus meeting with key stakeholders to agree the top 10 research priorities. 1000 participants provided 4100 responses, which contained 94 unique unanswered research questions that were initially prioritised by 475 participants. A consensus meeting with 27 stakeholders agreed the top 10 research priorities. The overarching research aspiration was an effective cure for PD. The top 10 research priorities for PD management included the need to address motor symptoms (balance and falls, and fine motor control), non-motor symptoms (sleep and urinary dysfunction), mental health issues (stress and anxiety, dementia and mild cognitive impairments), side effects of medications (dyskinesia) and the need to develop interventions specific to the phenotypes of PD and better monitoring methods. 
These research priorities identify crucial gaps in the existing evidence to address everyday practicalities in the management of the complexities of PD. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  5. Priority setting partnership to identify the top 10 research priorities for the management of Parkinson's disease

    PubMed Central

    Deane, Katherine H O; Flaherty, Helen; Daley, David J; Pascoe, Roland; Penhale, Bridget; Clarke, Carl E; Sackley, Catherine; Storey, Stacey

    2014-01-01

    Objectives This priority setting partnership was commissioned by Parkinson's UK to encourage people with direct and personal experience of the condition to work together to identify and prioritise the top 10 evidential uncertainties that impact on everyday clinical practice for the management of Parkinson's disease (PD). Setting The UK. Participants Anyone with experience of PD including: people with Parkinson's (PwP), carers, family and friends, healthcare and social care professionals. Non-clinical researchers and employees of pharmaceutical or medical devices companies were excluded. 1000 participants (60% PwP) provided ideas on research uncertainties, 475 (72% PwP) initially prioritised them and 27 (37% PwP) stakeholders agreed a final top 10. Methods Using a modified nominal group technique, participants were surveyed to identify what issues for the management of PD needed research. Unique research questions unanswered by current evidence were identified and participants were asked to identify their top 10 research priorities from this list. The top 26 uncertainties were presented to a consensus meeting with key stakeholders to agree the top 10 research priorities. Results 1000 participants provided 4100 responses, which contained 94 unique unanswered research questions that were initially prioritised by 475 participants. A consensus meeting with 27 stakeholders agreed the top 10 research priorities. The overarching research aspiration was an effective cure for PD. The top 10 research priorities for PD management included the need to address motor symptoms (balance and falls, and fine motor control), non-motor symptoms (sleep and urinary dysfunction), mental health issues (stress and anxiety, dementia and mild cognitive impairments), side effects of medications (dyskinesia) and the need to develop interventions specific to the phenotypes of PD and better monitoring methods. 
Conclusions These research priorities identify crucial gaps in the existing evidence to address everyday practicalities in the management of the complexities of PD. PMID:25500772

  6. Insights into the deterministic skill of air quality ensembles ...

    EPA Pesticide Factsheets

    Simulations from chemical weather models are subject to uncertainties in the input data (e.g. emission inventory, initial and boundary conditions) as well as those intrinsic to the model (e.g. physical parameterization, chemical mechanism). Multi-model ensembles can improve the forecast skill, provided that certain mathematical conditions are fulfilled. In this work, four ensemble methods were applied to two different datasets, and their performance was compared for ozone (O3), nitrogen dioxide (NO2) and particulate matter (PM10). Apart from the unconditional ensemble average, the approach behind the other three methods relies on adding optimum weights to members or constraining the ensemble to those members that meet certain conditions in time or frequency domain. The two different datasets were created for the first and second phase of the Air Quality Model Evaluation International Initiative (AQMEII). The methods are evaluated against ground level observations collected from the EMEP (European Monitoring and Evaluation Programme) and AirBase databases. The goal of the study is to quantify to what extent we can extract predictable signals from an ensemble with superior skill over the single models and the ensemble mean. Verification statistics show that the deterministic models simulate better O3 than NO2 and PM10, linked to different levels of complexity in the represented processes. The unconditional ensemble mean achieves higher skill compared to each stati
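The "optimum weights" idea mentioned above can be sketched as a least-squares combination of ensemble members trained against observations and compared to the unconditional ensemble mean; the data below are synthetic, not AQMEII values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "truth" (e.g. an hourly ozone series) and four biased,
# noisy model members with different error characteristics.
truth = 40 + 10 * np.sin(np.linspace(0, 6 * np.pi, 200))
members = np.stack([
    truth + bias + rng.normal(0, sd, truth.size)
    for bias, sd in [(5, 4), (-8, 6), (2, 3), (0, 8)]
])

# Unconditional ensemble mean: equal weights, no training.
ens_mean = members.mean(axis=0)

# Weighted ensemble: least-squares weights (plus an intercept to absorb
# common bias) fitted on a training half, evaluated on the other half.
half = truth.size // 2
A = np.vstack([members[:, :half], np.ones(half)]).T
w, *_ = np.linalg.lstsq(A, truth[:half], rcond=None)
weighted = w[:-1] @ members[:, half:] + w[-1]

rmse = lambda f, o: float(np.sqrt(np.mean((f - o) ** 2)))
print("ensemble mean RMSE:", rmse(ens_mean[half:], truth[half:]))
print("weighted RMSE:    ", rmse(weighted, truth[half:]))
```

With enough training data the weighted combination down-weights biased and noisy members, which is the mathematical condition under which a multi-model ensemble can beat its own mean.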

  7. Differential effects of uncertainty on LPP responses to emotional events during explicit and implicit anticipation.

    PubMed

    Lin, Huiyan; Liang, Jiafeng; Jin, Hua; Zhao, Dongmei

    2018-07-01

    Previous studies have investigated whether uncertainty influences neural responses to emotional events. The findings of such studies, particularly with respect to event-related potentials (ERPs), have been controversial due to several factors, such as the stimuli that serve as cues and the emotional content of the events. However, it is still unknown whether the effects of uncertainty on ERP responses to emotional events are influenced by the anticipation pattern (e.g., explicit or implicit anticipation). To address this issue, participants in the present study were presented with anticipatory cues and then emotional (negative and neutral) pictures. The cues either did or did not signify the emotional content of the upcoming picture. In the inter-stimulus intervals between cues and pictures, participants in the explicit anticipation condition were asked to estimate, on a scale, the probability that the subsequent picture would belong to a specific emotional category, while in the implicit condition, participants were asked to indicate, using a number on a scale, which color was different from the others. The results revealed that in the explicit condition, uncertainty increased late positive potential (LPP) responses, particularly for negative pictures, whereas LPP responses were larger for certain negative pictures than for uncertain negative pictures in the implicit condition. These findings suggest that the anticipation pattern influences the effects of uncertainty during the evaluation of negative events. Copyright © 2018 Elsevier B.V. All rights reserved.

  8. Uncertainty analysis of accident notification time and emergency medical service response time in work zone traffic accidents.

    PubMed

    Meng, Qiang; Weng, Jinxian

    2013-01-01

    Taking into account the uncertainty caused by exogenous factors, the accident notification time (ANT) and emergency medical service (EMS) response time were modeled as 2 random variables following the lognormal distribution. Their mean values and standard deviations were respectively formulated as the functions of environmental variables including crash time, road type, weekend, holiday, light condition, weather, and work zone type. Work zone traffic accident data from the Fatality Analysis Report System between 2002 and 2009 were utilized to determine the distributions of the ANT and the EMS arrival time in the United States. A mixed logistic regression model, taking into account the uncertainty associated with the ANT and the EMS response time, was developed to estimate the risk of death. The results showed that the uncertainty of the ANT was primarily influenced by crash time and road type, whereas the uncertainty of EMS response time is greatly affected by road type, weather, and light conditions. In addition, work zone accidents occurring during a holiday and in poor light conditions were found to be statistically associated with a longer mean ANT and longer EMS response time. The results also show that shortening the ANT was a more effective approach in reducing the risk of death than the EMS response time in work zones. To shorten the ANT and the EMS response time, work zone activities are suggested to be undertaken during non-holidays, during the daytime, and in good weather and light conditions.
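The lognormal ANT model with covariate-dependent parameters can be sketched as follows; the log-scale shifts for night-time and rural crashes are hypothetical illustrations, not the fitted values from the study:

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_ant(n, night=False, rural=False):
    """Accident notification time (minutes) as a lognormal variable whose
    log-scale mean shifts with crash time and road type (hypothetical shifts)."""
    mu = 1.5 + (0.4 if night else 0.0) + (0.3 if rural else 0.0)
    sigma = 0.6
    return rng.lognormal(mean=mu, sigma=sigma, size=n)

day_urban = sample_ant(100_000)
night_rural = sample_ant(100_000, night=True, rural=True)

# Longer notification times push more probability mass past any
# critical-response window, raising the estimated risk of death.
window = 15.0  # minutes, illustrative
print("P(ANT > 15 min), day/urban:  ", float(np.mean(day_urban > window)))
print("P(ANT > 15 min), night/rural:", float(np.mean(night_rural > window)))
```

Formulating the mean and standard deviation as functions of environmental covariates, as the authors do, lets exactly this kind of tail probability feed into the downstream risk-of-death model.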

  9. Results of the Greenland Ice Sheet Model Initialisation Experiments ISMIP6 - initMIP-Greenland

    NASA Astrophysics Data System (ADS)

    Goelzer, H.; Nowicki, S.; Edwards, T.; Beckley, M.; Abe-Ouchi, A.; Aschwanden, A.; Calov, R.; Gagliardini, O.; Gillet-Chaulet, F.; Golledge, N. R.; Gregory, J. M.; Greve, R.; Humbert, A.; Huybrechts, P.; Larour, E. Y.; Lipscomb, W. H.; Le clec'h, S.; Lee, V.; Kennedy, J. H.; Pattyn, F.; Payne, A. J.; Rodehacke, C. B.; Rückamp, M.; Saito, F.; Schlegel, N.; Seroussi, H. L.; Shepherd, A.; Sun, S.; van de Wal, R.; Ziemen, F. A.

    2016-12-01

    Earlier large-scale Greenland ice sheet sea-level projections, e.g. those run during the ice2sea and SeaRISE initiatives, have shown that ice sheet initialisation can have a large effect on the projections and gives rise to important uncertainties. The goal of this intercomparison exercise (initMIP-Greenland) is to compare, evaluate and improve the initialisation techniques used in the ice sheet modelling community and to estimate the associated uncertainties. It is the first in a series of ice sheet model intercomparison activities within ISMIP6 (Ice Sheet Model Intercomparison Project for CMIP6). Two experiments for the large-scale Greenland ice sheet have been designed to allow intercomparison between participating models of 1) the initial present-day state of the ice sheet and 2) the response in two schematic forward experiments. The forward experiments serve to evaluate the initialisation in terms of model drift (a forward run without any forcing) and the response to a large perturbation (a prescribed surface mass balance anomaly). We present and discuss final results of the intercomparison and highlight important uncertainties with respect to projections of the Greenland ice sheet's sea-level contribution.

  10. Feasibility, acceptability and preliminary psychological benefits of mindfulness meditation training in a sample of men diagnosed with prostate cancer on active surveillance: results from a randomized controlled pilot trial.

    PubMed

    Victorson, David; Hankin, Vered; Burns, James; Weiland, Rebecca; Maletich, Carly; Sufrin, Nathaniel; Schuette, Stephanie; Gutierrez, Bruriah; Brendler, Charles

    2017-08-01

    In a pilot randomized controlled trial, we examined the feasibility and preliminary efficacy of an 8-week mindfulness training program (Mindfulness Based Stress Reduction) in a sample of men on active surveillance, on important psychological outcomes including prostate cancer anxiety, uncertainty intolerance and posttraumatic growth. Men were randomized to either mindfulness (n = 24) or an attention control arm (n = 19) and completed self-reported measures of prostate cancer anxiety, uncertainty intolerance, global quality of life, mindfulness and posttraumatic growth at baseline, 8 weeks, 6 months and 12 months. Participants in the mindfulness arm demonstrated significant decreases in prostate cancer anxiety and uncertainty intolerance, and significant increases in mindfulness, global mental health and posttraumatic growth. Participants in the control condition also demonstrated significant increases in mindfulness over time. Longitudinal increases in posttraumatic growth were significantly larger in the mindfulness arm than they were in the control arm. While mindfulness training was found to be generally feasible and acceptable among participants who enrolled in the 8-week intervention as determined by completion rates and open-ended survey responses, the response rate between initial enrollment and the total number of men approached was lower than desired (47%). While larger sample sizes are necessary to examine the efficacy of mindfulness training on important psychological outcomes, in this pilot study posttraumatic growth was shown to significantly increase over time for men in the treatment group. Mindfulness training has the potential to help men cope more effectively with some of the stressors and uncertainties associated with active surveillance. Copyright © 2016 John Wiley & Sons, Ltd.

  11. How good a clock is rotation? The stellar rotation-mass-age relationship for old field stars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Epstein, Courtney R.; Pinsonneault, Marc H., E-mail: epstein@astronomy.ohio-state.edu, E-mail: pinsono@astronomy.ohio-state.edu

    2014-01-10

    The rotation-mass-age relationship offers a promising avenue for measuring the ages of field stars, assuming the attendant uncertainties to this technique can be well characterized. We model stellar angular momentum evolution starting with a rotation distribution from open cluster M37. Our predicted rotation-mass-age relationship shows significant zero-point offsets compared to an alternative angular momentum loss law and published gyrochronology relations. Systematic errors at the 30% level are permitted by current data, highlighting the need for empirical guidance. We identify two fundamental sources of uncertainty that limit the precision of rotation-based ages and quantify their impact. Stars are born with a range of rotation rates, which leads to an age range at fixed rotation period. We find that the inherent ambiguity from the initial conditions is important for all young stars, and remains large for old stars below 0.6 M☉. Latitudinal surface differential rotation also introduces a minimum uncertainty into rotation period measurements and, by extension, rotation-based ages. Both models and the data from binary star systems 61 Cyg and α Cen demonstrate that latitudinal differential rotation is the limiting factor for rotation-based age precision among old field stars, inducing uncertainties at the ∼2 Gyr level. We also examine the relationship between variability amplitude, rotation period, and age. Existing ground-based surveys can detect field populations with ages as old as 1-2 Gyr, while space missions can detect stars as old as the Galactic disk. In comparison with other techniques for measuring the ages of lower main sequence stars, including geometric parallax and asteroseismology, rotation-based ages have the potential to be the most precise chronometer for 0.6-1.0 M☉ stars.
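The link between a period uncertainty (such as the differential-rotation floor) and the resulting age uncertainty can be illustrated with a simple Skumanich-style spin-down law, t ∝ P²; this is a sketch for orientation, not the paper's calibrated angular momentum loss model:

```python
# Skumanich-style gyrochronology sketch: P = P_sun * (t / t_sun)**0.5,
# so t = t_sun * (P / P_sun)**2 and, by linear error propagation, the
# relative age error is twice the relative period error:
#   sigma_t / t = 2 * sigma_P / P
T_SUN = 4.57   # Gyr, solar age (calibration point)
P_SUN = 25.4   # days, solar rotation period

def age_from_period(p_days: float) -> float:
    return T_SUN * (p_days / P_SUN) ** 2

def age_uncertainty(p_days: float, sigma_p: float) -> float:
    """Age error induced by a period uncertainty, e.g. the ambiguity
    from latitudinal differential rotation (illustrative values)."""
    return 2.0 * (sigma_p / p_days) * age_from_period(p_days)

# An old field star: P ~ 35 d with ~3 d of differential-rotation ambiguity.
p, dp = 35.0, 3.0
print(f"age ~ {age_from_period(p):.1f} Gyr +/- {age_uncertainty(p, dp):.1f} Gyr")
```

A few days of period ambiguity on an old, slowly rotating star already maps to an age error of order a gigayear, consistent in spirit with the ∼2 Gyr floor the abstract reports.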

  12. Value based pricing, research and development, and patient access schemes. Will the United Kingdom get it right or wrong?

    PubMed Central

    Towse, Adrian

    2010-01-01

    The National Health Service (NHS) should reward innovation it values. This will enable the NHS and the United Kingdom (UK) economy to benefit and impact positively on the Research and Development (R&D) decision making of companies. The National Institute for Health and Clinical Excellence (NICE) currently seeks to do this on behalf of the NHS. Yet the Office of Fair Trading proposals for Value Based Pricing add price setting powers – initially for the Department of Health (DH) and then for NICE. This introduces an additional substantial uncertainty that will impact on R&D and, conditional on R&D proceeding, on launch (or not) in the UK. Instead of adding to uncertainty the institutional arrangements for assessing value should seek to be predictable and science based, building on NICE's current arrangements. The real challenge is to increase understanding of the underlying cost-effectiveness of the technology itself by collecting evidence alongside use. The 2009 Pharmaceutical Price Regulation Scheme sought to help do this with Flexible Pricing (FP) and Patient Access Schemes (PASs). The PASs to date have increased access to medicines, but no schemes proposed to date have yet helped to tackle outcomes uncertainty. The 2010 Innovation Pass can also be seen as a form of ‘coverage with evidence development.’ The NHS is understandably concerned about the costs of running such evidence collection schemes. Enabling the NHS to deliver on such schemes will impact favourably on R&D decisions. Increasing the uncertainty in the UK NHS market through government price setting will reduce incentives for R&D and for early UK launch. PMID:20716236

  13. Value based pricing, research and development, and patient access schemes. Will the United Kingdom get it right or wrong?

    PubMed

    Towse, Adrian

    2010-09-01

    The National Health Service (NHS) should reward innovation it values. This will enable the NHS and the United Kingdom (UK) economy to benefit and impact positively on the Research and Development (R&D) decision making of companies. The National Institute for Health and Clinical Excellence (NICE) currently seeks to do this on behalf of the NHS. Yet the Office of Fair Trading proposals for Value Based Pricing add price setting powers--initially for the Department of Health (DH) and then for NICE. This introduces an additional substantial uncertainty that will impact on R&D and, conditional on R&D proceeding, on launch (or not) in the UK. Instead of adding to uncertainty the institutional arrangements for assessing value should seek to be predictable and science based, building on NICE's current arrangements. The real challenge is to increase understanding of the underlying cost-effectiveness of the technology itself by collecting evidence alongside use. The 2009 Pharmaceutical Price Regulation Scheme sought to help do this with Flexible Pricing (FP) and Patient Access Schemes (PASs). The PASs to date have increased access to medicines, but no schemes proposed to date have yet helped to tackle outcomes uncertainty. The 2010 Innovation Pass can also be seen as a form of 'coverage with evidence development.' The NHS is understandably concerned about the costs of running such evidence collection schemes. Enabling the NHS to deliver on such schemes will impact favourably on R&D decisions. Increasing the uncertainty in the UK NHS market through government price setting will reduce incentives for R&D and for early UK launch.

  14. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    NASA Technical Reports Server (NTRS)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.
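The propagation technique described, from uncertain input variables to a distribution over the power output, can be sketched with a Monte Carlo run through a toy array-power model; all distributions and parameter values below are illustrative assumptions, not ISS EPS values:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 200_000  # Monte Carlo sample size

# Toy array-power model: P = efficiency * area * insolation * cos(pointing error).
# Each input (random variable) carries an assumed, illustrative uncertainty.
eff = rng.normal(0.14, 0.01, N)              # cell efficiency
area = rng.normal(300.0, 2.0, N)             # array area, m^2
insolation = rng.normal(1361.0, 15.0, N)     # solar flux, W/m^2
theta = rng.normal(0.0, np.deg2rad(3.0), N)  # sun-tracking pointing error, rad

# Response variable: power output distribution induced by input uncertainties.
power_kw = eff * area * insolation * np.cos(theta) / 1000.0

print(f"mean power: {power_kw.mean():.1f} kW, std: {power_kw.std():.1f} kW")
```

The spread of `power_kw` quantifies the probabilistic variation in the response variable, and re-running with one input held fixed shows how much of that variation each random variable contributes.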

  15. USGS Polar Temperature Logging System, Description and Measurement Uncertainties

    USGS Publications Warehouse

    Clow, Gary D.

    2008-01-01

This paper provides an updated technical description of the USGS Polar Temperature Logging System (PTLS) and a complete assessment of the measurement uncertainties. This measurement system is used to acquire subsurface temperature data for climate-change detection in the polar regions and for reconstructing past climate changes using the 'borehole paleothermometry' inverse method. Specifically designed for polar conditions, the PTLS can measure temperatures as low as -60 degrees Celsius with a sensitivity ranging from 0.02 to 0.19 millikelvin (mK). A modular design allows the PTLS to reach depths as great as 4.5 kilometers with a skid-mounted winch unit or 650 meters with a small helicopter-transportable unit. The standard uncertainty (uT) of the ITS-90 temperature measurements obtained with the current PTLS ranges from 3.0 mK at -60 degrees Celsius to 3.3 mK at 0 degrees Celsius. Relative temperature measurements used for borehole paleothermometry have a standard uncertainty (urT) whose upper limit ranges from 1.6 mK at -60 degrees Celsius to 2.0 mK at 0 degrees Celsius. The uncertainty of a temperature sensor's depth during a log depends on specific borehole conditions and the temperature near the winch and thus must be treated on a case-by-case basis. However, recent experience indicates that when logging conditions are favorable, the 4.5-kilometer system is capable of producing depths with a standard uncertainty (uZ) on the order of 200-250 parts per million.

  16. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    NASA Astrophysics Data System (ADS)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise under an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits a non-monotonic dynamical behavior: the amount of the uncertainty first inflates, and subsequently decreases, with the growth of the decoherence strength in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initially shared state. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty, upon tuning the operation strength. Our investigations might thereby offer an insight into the dynamics and steering of entropic uncertainty in open systems.
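The bound whose dynamics are being observed here is the standard quantum-memory-assisted entropic uncertainty relation (due to Berta et al.). For two incompatible measurements X and Z on qubit A, with quantum memory B, it reads:

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2 ,
```

where $\{|\psi_x\rangle\}$ and $\{|\phi_z\rangle\}$ are the eigenbases of X and Z, $c$ quantifies their complementarity, and the conditional entropy $S(A|B)$ can be negative for entangled states, which is what allows the memory to reduce the measurement uncertainty below the memoryless bound.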

  17. Averse to Initiative: Risk Management’s Effect on Mission Command

    DTIC Science & Technology

    2017-05-25

military decision making process (MDMP). Other changes to structure reveal administrative and safety risk information (i.e. personal operated vehicle... decision making, it requires commanders to have the capacity to make an informed, intuitive decision. Uncertainty...analysis. His situation required him to embrace uncertainty, and exercise an informed intuition to make a risk decision to create opportunity

  18. Decision Making Under Uncertainty

    DTIC Science & Technology

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those...often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the...which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  19. Modeling uncertainty in requirements engineering decision support

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements, and to deal with the combination of such information from multiple stakeholders.

  20. Uncertainties in energy reconstruction of cosmic rays for ANITA III caused by differences in models of radio emission in atmospheric showers

    NASA Astrophysics Data System (ADS)

    Bugaev, Viatcheslav; Rauch, Brian; Schoorlemmer, Harm; Lam, Joe; Urdaneta, David; Wissel, Stephanie; Belov, Konstantin; Romero-Wolf, Andrew; Anita Collaboration

    2015-04-01

The third flight of the high-altitude balloon-borne Antarctic Impulsive Transient Antenna (ANITA III) was launched from McMurdo, Antarctica on December 17th, 2014 and flew for 22 days. It was optimized for the measurement of impulsive radio signals from the charged component of extensive air showers initiated by ultra-high energy cosmic rays in the frequency range ~ 180 - 1200 MHz. In addition, it was designed to detect radio impulses initiated by high-energy neutrinos interacting in the Antarctic ice, which was the primary objective of the first two ANITA flights. Based on an extensive set of Monte Carlo simulations of radio emissions from cosmic rays (CR) with the ZHAireS and CoREAS simulation packages, we estimate uncertainties in the electric fields at the payload due to the different models used in the two packages. The uncertainties in the emission are then propagated through an algorithm for energy reconstruction of individual CR showers to assess uncertainties in the energy reconstruction. We also discuss optimization of this algorithm. This research is supported by NASA under Grant # NNX11AC49G.

  1. Testing a multi-malaria-model ensemble against 30 years of data in the Kenyan highlands

    PubMed Central

    2014-01-01

Background Multi-model ensembles could overcome challenges resulting from uncertainties in models’ initial conditions, parameterization and structural imperfections. They could also quantify in a probabilistic way uncertainties in future climatic conditions and their impacts. Methods A four-malaria-model ensemble was implemented to assess the impact of long-term changes in climatic conditions on Plasmodium falciparum malaria morbidity observed in Kericho, in the highlands of Western Kenya, over the period 1979–2009. Input data included quality controlled temperature and rainfall records gathered at a nearby weather station over the historical periods 1979–2009 and 1980–2009, respectively. Simulations included models’ sensitivities to changes in sets of parameters and analysis of non-linear changes in the mean duration of host’s infectivity to vectors due to increased resistance to anti-malarial drugs. Results The ensemble explained from 32 to 38% of the variance of the observed P. falciparum malaria incidence. The R2 values obtained were higher than those achieved with any individual model's simulation output. Up to 18.6% of the variance of malaria incidence could be attributed to the +0.19 to +0.25°C per decade significant long-term linear trend in near-surface air temperatures. On top of this 18.6%, at least 6% of the variance of malaria incidence could be related to the increased resistance to anti-malarial drugs. Ensemble simulations also suggest that climatic conditions have likely been less favourable to malaria transmission in Kericho in recent years. Conclusions Long-term changes in climatic conditions and non-linear changes in the mean duration of host’s infectivity are synergistically driving the increasing incidence of P. falciparum malaria in the Kenyan highlands. User-friendly, online-downloadable, open source mathematical tools, such as the one presented here, could improve decision-making processes of local and regional health authorities. PMID:24885824
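The headline result, that the ensemble mean explains more variance (higher R2) than any single model, is a generic property of averaging models with partly independent errors. A toy sketch on synthetic data (not the Kericho series):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(360)                               # 30 years of monthly steps
truth = np.sin(2 * np.pi * t / 12) + 0.01 * t    # seasonal cycle plus trend

# Four hypothetical model simulations: truth plus independent errors
models = [truth + rng.normal(0.0, 1.0, t.size) for _ in range(4)]
ensemble_mean = np.mean(models, axis=0)

def r2(obs, sim):
    # Fraction of observed variance explained by the simulation
    ss_res = np.sum((obs - sim) ** 2)
    ss_tot = np.sum((obs - obs.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

individual = [r2(truth, m) for m in models]
print(f"best single model R2: {max(individual):.2f}")
print(f"ensemble mean R2:     {r2(truth, ensemble_mean):.2f}")
```

Averaging the four members shrinks the independent error variance by roughly a factor of four, which is why the ensemble-mean R2 exceeds every individual one.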

  2. Impact of Martian atmosphere parameter uncertainties on entry vehicles aerodynamic for hypersonic rarefied conditions

    NASA Astrophysics Data System (ADS)

    Fei, Huang; Xu-hong, Jin; Jun-ming, Lv; Xiao-li, Cheng

    2016-11-01

An attempt has been made to analyze the impact of Martian atmosphere parameter uncertainties on entry vehicle aerodynamics for hypersonic rarefied conditions with a DSMC code. The code has been validated by comparing Viking vehicle flight data with present computational results. Then, by simulating flows around the Mars Science Laboratory, the impact of free stream parameter uncertainties on aerodynamics is investigated. The validation results show that the present numerical approach is in good agreement with the Viking flight data. The physical and chemical properties of CO2 have a strong impact on the aerodynamics of Mars entry vehicles, so it is necessary to make proper corrections to data obtained with an air model in hypersonic rarefied conditions, which is consistent with the conclusions drawn in the continuum regime. Uncertainties in free stream density and velocity weakly influence aerodynamics and pitching moment. However, aerodynamics appears to be little influenced by free stream temperature, the maximum error of which is below 0.5%. The center of pressure position is not sensitive to free stream parameters.

  3. Composite Multilinearity, Epistemic Uncertainty and Risk Achievement Worth

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    E. Borgonovo; C. L. Smith

    2012-10-01

Risk Achievement Worth is one of the most widely utilized importance measures. RAW is defined as the ratio of the risk metric value attained when a component has failed over the base case value of the risk metric. Traditionally, both the numerator and denominator are point estimates. Relevant literature has shown that inclusion of epistemic uncertainty i) induces notable variability in the point estimate ranking and ii) causes the expected value of the risk metric to differ from its nominal value. We obtain the conditions under which equality holds between the nominal and expected values of a reliability risk metric. Among these conditions, separability and state-of-knowledge independence emerge. We then study how the presence of epistemic uncertainty affects RAW and the associated ranking. We propose an extension of RAW (called ERAW) which allows one to obtain a ranking robust to epistemic uncertainty. We discuss the properties of ERAW and the conditions under which it coincides with RAW. We apply our findings to a probabilistic risk assessment model developed for the safety analysis of NASA lunar space missions.
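The RAW definition above is directly computable. The sketch below uses a toy risk model and, for the epistemic part, one plausible reading of an ERAW-style quantity (ratio of expected risk metrics); both the risk model and that averaging choice are illustrative assumptions, not the authors' formulation:

```python
import numpy as np

def risk_metric(p):
    # Toy risk model: the system fails if component 0 fails AND at least one
    # of components 1, 2 fails (failure probabilities p0, p1, p2).
    p0, p1, p2 = p
    return p0 * (1.0 - (1.0 - p1) * (1.0 - p2))

def raw(p, i):
    # Risk Achievement Worth: risk metric with component i assumed failed
    # (p_i = 1) divided by the base-case risk metric.
    p_failed = list(p)
    p_failed[i] = 1.0
    return risk_metric(p_failed) / risk_metric(p)

nominal = [0.01, 0.02, 0.03]
print([round(raw(nominal, i), 1) for i in range(3)])

# With epistemic uncertainty, replace point estimates by distributions and
# average the risk metric before taking the ratio (one possible ERAW reading).
rng = np.random.default_rng(2)
samples = rng.uniform(0.5, 1.5, (10_000, 3)) * nominal
base = np.mean([risk_metric(s) for s in samples])
failed0 = np.mean([risk_metric([1.0, s[1], s[2]]) for s in samples])
print(f"ERAW-style ratio for component 0: {failed0 / base:.1f}")
```

With point estimates, the numerator and denominator are single evaluations; under epistemic uncertainty the two expectations need not preserve the point-estimate ranking, which is the variability the abstract refers to.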

  4. Exploiting active subspaces to quantify uncertainty in the numerical simulation of the HyShot II scramjet

    NASA Astrophysics Data System (ADS)

    Constantine, P. G.; Emory, M.; Larsson, J.; Iaccarino, G.

    2015-12-01

    We present a computational analysis of the reactive flow in a hypersonic scramjet engine with focus on effects of uncertainties in the operating conditions. We employ a novel methodology based on active subspaces to characterize the effects of the input uncertainty on the scramjet performance. The active subspace identifies one-dimensional structure in the map from simulation inputs to quantity of interest that allows us to reparameterize the operating conditions; instead of seven physical parameters, we can use a single derived active variable. This dimension reduction enables otherwise infeasible uncertainty quantification, considering the simulation cost of roughly 9500 CPU-hours per run. For two values of the fuel injection rate, we use a total of 68 simulations to (i) identify the parameters that contribute the most to the variation in the output quantity of interest, (ii) estimate upper and lower bounds on the quantity of interest, (iii) classify sets of operating conditions as safe or unsafe corresponding to a threshold on the output quantity of interest, and (iv) estimate a cumulative distribution function for the quantity of interest.
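The active subspace computation itself is compact: estimate the average outer product of the output gradient over the input distribution and take its dominant eigenvector as the derived active variable's direction. A sketch on a synthetic seven-parameter function (a stand-in for the scramjet model, not the actual simulation):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic model in 7 "operating condition" parameters whose output varies
# mostly along one direction w_true, mimicking a 1-D active subspace.
w_true = np.array([3.0, 1.0, 0.5, 0.1, 0.1, 0.05, 0.05])
w_true /= np.linalg.norm(w_true)

def grad_f(x):
    # Gradient of f(x) = (w_true . x)^2 plus a weak dependence on all inputs
    s = w_true @ x
    return 2.0 * s * w_true + 0.01 * x

X = rng.normal(size=(200, 7))                 # samples of the inputs
grads = np.array([grad_f(x) for x in X])

# C = E[grad f  grad f^T]; its dominant eigenvector spans the active subspace
C = grads.T @ grads / len(grads)
eigvals, eigvecs = np.linalg.eigh(C)          # ascending eigenvalues
w_hat = eigvecs[:, -1]                        # dominant eigenvector

print(f"alignment with true direction: {abs(w_hat @ w_true):.3f}")
```

A large gap between the first and second eigenvalues is what justifies replacing the seven physical parameters by the single active variable w_hat . x.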

  5. When, not if: The inescapability of an uncertain future

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Ballard, T.

    2014-12-01

Uncertainty is an inherent feature of most scientific endeavours, and many political decisions must be made in the presence of scientific uncertainty. In the case of climate change, there is evidence that greater scientific uncertainty increases the risk associated with the impact of climate change. Scientific uncertainty thus provides an impetus for cutting emissions rather than delaying action. In contrast to those normative considerations, uncertainty is frequently cited in political and public discourse as a reason to delay mitigation. We examine ways in which this gap between public and scientific understanding of uncertainty can be bridged. In particular, we sought ways to communicate uncertainty in a way that better calibrates people's risk perceptions with the projected impact of climate change. We report two behavioural experiments in which uncertainty about the future was expressed either as outcome uncertainty or temporal uncertainty. The conventional presentation of uncertainty involves uncertainty about an outcome at a given time—for example, the range of possible sea level rise (say 50cm +/- 20cm) by a certain date. An alternative presentation of the same situation presents a certain outcome ("sea levels will rise by 50cm") but places the uncertainty into the time of arrival ("this may occur as early as 2040 or as late as 2080"). We presented participants with a series of statements and graphs indicating projected increases in temperature, sea levels, ocean acidification, and a decrease in Arctic sea ice. In the uncertain magnitude condition, the statements and graphs reported the upper and lower confidence bounds of the projected magnitude and the mean projected time of arrival. In the uncertain time of arrival condition, they reported the upper and lower confidence bounds of the projected time of arrival and the mean projected magnitude.
The results show that when uncertainty was presented as uncertain time of arrival rather than an uncertain outcome, people expressed greater concern about the projected outcomes. In a further experiment involving repeated "games" with a simulated economy, we similarly showed that people allocate more resources to mitigation if there is uncertainty about the timing of an adverse event rather than about the magnitude of its impact.

  6. Disentangling the uncertainty of hydrologic drought characteristics in a multi-model century-long experiment in continental river basins

    NASA Astrophysics Data System (ADS)

    Samaniego, Luis; Kumar, Rohini; Pechlivanidis, Illias; Breuer, Lutz; Wortmann, Michel; Vetter, Tobias; Flörke, Martina; Chamorro, Alejandro; Schäfer, David; Shah, Harsh; Zeng, Xiaofan

    2016-04-01

The quantification of predictive uncertainty in hydrologic models and its attribution to the main sources is of particular interest in climate change studies. In recent years, a number of studies have been aimed at assessing the ability of hydrologic models (HMs) to reproduce extreme hydrologic events. Disentangling the overall uncertainty of streamflow (including its derived low-flow characteristics) into individual contributions, stemming from forcings and model structure, has also been studied. Based on recent literature, there is controversy over which source is the largest (e.g., Teng, et al. 2012, Bosshard et al. 2013, Prudhomme et al. 2014). Very little has been done, however, to estimate the relative impact of the parametric uncertainty of the HMs with respect to the overall uncertainty of low-flow characteristics. The ISI-MIP2 project provides a unique opportunity to understand the propagation of forcing and model structure uncertainties into century-long time series of drought characteristics. This project defines a consistent framework to deal with compatible initial conditions for the HMs and a set of standardized historical and future forcings. Moreover, the ensemble of hydrologic model predictions varies across a broad range of climate scenarios and regions. To achieve this goal, we use six preconditioned hydrologic models (HYPE or HBV, mHM, SWIM, VIC, and WaterGAP3) set up in seven large continental river basins: Amazon, Blue Nile, Ganges, Niger, Mississippi, Rhine, Yellow. These models are forced with bias-corrected outputs of five CMIP5 general circulation models (GCM) under four extreme representative concentration pathway (RCP) scenarios (i.e. 2.6, 4.5, 6.0, and 8.5 W m^-2) for the period 1971-2099. Simulated streamflow is transformed into a monthly runoff index (RI) to analyze the attribution of the GCM and HM uncertainty into drought magnitude and duration over time. 
Uncertainty contributions are investigated during three periods: 1) 2006-2035, 2) 2036-2065 and 3) 2070-2099. Results presented in Samaniego et al. 2015 (submitted) indicate that GCM uncertainty mostly dominates over HM uncertainty for predictions of runoff drought characteristics, irrespective of the selected RCP and region. For the mHM model, in particular, GCM uncertainty always dominates over parametric uncertainty. In general, the overall uncertainty increases with time. The larger the radiative forcing of the RCP, the larger the uncertainty in drought characteristics; however, the propagation of the GCM uncertainty onto a drought characteristic depends largely upon the hydro-climatic regime. While our study emphasizes the need for multi-model ensembles for the assessment of future drought projections, the agreement between GCM forcings is still too weak to draw conclusive recommendations. References: L. Samaniego, R. Kumar, I. G. Pechlivanidis, L. Breuer, M. Wortmann, T. Vetter, M. Flörke, A. Chamorro, D. Schäfer, H. Shah, X. Zeng: Propagation of forcing and model uncertainty into hydrological drought characteristics in a multi-model century-long experiment in continental river basins. Submitted to Climatic Change on Dec 2015. Bosshard, et al. 2013. doi:10.1029/2011WR011533. Prudhomme et al. 2014, doi:10.1073/pnas.1222473110. Teng, et al. 2012, doi:10.1175/JHM-D-11-058.1.
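A monthly runoff index and the run-based extraction of drought duration and magnitude can be sketched as follows. The month-by-month standardization and the -1 threshold are illustrative choices, not necessarily the study's exact RI definition:

```python
import numpy as np

def runoff_index(q):
    # Standardize monthly runoff month by month (anomaly / standard deviation),
    # a simple stand-in for a monthly runoff index (RI). Expects a series whose
    # length is a whole number of years.
    q = np.asarray(q, dtype=float).reshape(-1, 12)   # rows = years
    ri = (q - q.mean(axis=0)) / q.std(axis=0, ddof=1)
    return ri.ravel()

def drought_events(ri, threshold=-1.0):
    # A drought event is a run of consecutive months with RI below the
    # threshold; duration is the run length, magnitude the accumulated deficit.
    events, dur, mag = [], 0, 0.0
    for v in ri:
        if v < threshold:
            dur += 1
            mag += threshold - v
        elif dur:
            events.append((dur, mag))
            dur, mag = 0, 0.0
    if dur:
        events.append((dur, mag))
    return events

print(drought_events([0, -1.5, -2.0, 0.5, -1.2]))
```

Running the extraction on RI series from each GCM-HM pair yields distributions of duration and magnitude whose spread across GCMs vs. across HMs is exactly the attribution discussed above.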

  7. Feedback Blunting: Total Sleep Deprivation Impairs Decision Making that Requires Updating Based on Feedback.

    PubMed

    Whitney, Paul; Hinson, John M; Jackson, Melinda L; Van Dongen, Hans P A

    2015-05-01

To better understand the sometimes catastrophic effects of sleep loss on naturalistic decision making, we investigated effects of sleep deprivation on decision making in a reversal learning paradigm requiring acquisition and updating of information based on outcome feedback. Subjects were randomized to a sleep deprivation or control condition, with performance testing at baseline, after 2 nights of total sleep deprivation (or rested control), and following 2 nights of recovery sleep. Subjects performed a decision task involving initial learning of go and no go response sets followed by unannounced reversal of contingencies, requiring use of outcome feedback for decisions. A working memory scanning task and psychomotor vigilance test were also administered. The study comprised six consecutive days and nights in a controlled laboratory environment with continuous behavioral monitoring. Twenty-six subjects (22-40 y of age; 10 women) participated; thirteen were randomized to a 62-h total sleep deprivation condition, and the others were controls. Unlike controls, sleep deprived subjects had difficulty with initial learning of go and no go stimuli sets and had profound impairment adapting to reversal. Skin conductance responses to outcome feedback were diminished, indicating blunted affective reactions to feedback accompanying sleep deprivation. Working memory scanning performance was not significantly affected by sleep deprivation. And although sleep deprived subjects showed expected attentional lapses, these could not account for impairments in reversal learning decision making. Sleep deprivation is particularly problematic for decision making involving uncertainty and unexpected change. Blunted reactions to feedback while sleep deprived underlie failures to adapt to uncertainty and changing contingencies. Thus, an error may register, but with diminished effect because of reduced affective valence of the feedback or because the feedback is not cognitively bound with the choice. 
This has important implications for understanding and managing sleep loss-induced cognitive impairment in emergency response, disaster management, military operations, and other dynamic real-world settings with uncertain outcomes and imperfect information. © 2015 Associated Professional Sleep Societies, LLC.

  8. Reducing streamflow forecast uncertainty: Application and qualitative assessment of the upper klamath river Basin, Oregon

    USGS Publications Warehouse

    Hay, L.E.; McCabe, G.J.; Clark, M.P.; Risley, J.C.

    2009-01-01

The accuracy of streamflow forecasts depends on the uncertainty associated with future weather and the accuracy of the hydrologic model that is used to produce the forecasts. We present a method for streamflow forecasting where hydrologic model parameters are selected based on the climate state. Parameter sets for a hydrologic model are conditioned on an atmospheric pressure index defined using mean November through February (NDJF) 700-hectoPascal geopotential heights over northwestern North America [Pressure Index from Geopotential heights (PIG)]. The hydrologic model is applied in the Sprague River basin (SRB), a snowmelt-dominated basin located in the Upper Klamath basin in Oregon. In the SRB, the majority of streamflow occurs during March through May (MAM). Water years (WYs) 1980-2004 were divided into three groups based on their respective PIG values (high, medium, and low PIG). Low (high) PIG years tend to have higher (lower) than average MAM streamflow. Four parameter sets were calibrated for the SRB, each using a different set of WYs. The initial set used WYs 1995-2004 and the remaining three used WYs defined as high-, medium-, and low-PIG years. Two sets of March, April, and May streamflow volume forecasts were made using Ensemble Streamflow Prediction (ESP). The first set of ESP simulations used the initial parameter set. Because the PIG is defined using NDJF pressure heights, forecasts starting in March can be made using the PIG parameter set that corresponds with the year being forecasted. The second set of ESP simulations used the parameter set associated with the given PIG year. Comparison of the ESP sets indicates that more accuracy and less variability in volume forecasts may be possible when the ESP is conditioned using the PIG. This is especially true during the high-PIG years (low-flow years). © 2009 American Water Resources Association.
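The conditioning step, classifying water years into PIG terciles and picking the matching calibrated parameter set at forecast time, can be sketched as below. The index values, class cut method, and parameter names are all hypothetical:

```python
import numpy as np

def classify_by_index(index_values):
    # Split water years into low/medium/high terciles of a PIG-like index
    lo, hi = np.quantile(index_values, [1 / 3, 2 / 3])
    return np.where(index_values <= lo, "low",
                    np.where(index_values >= hi, "high", "medium"))

# Hypothetical NDJF index for 25 water years, and per-class parameter sets
rng = np.random.default_rng(4)
pig = rng.normal(size=25)
classes = classify_by_index(pig)
param_sets = {"low": {"melt_rate": 2.5},      # illustrative parameter values
              "medium": {"melt_rate": 2.0},
              "high": {"melt_rate": 1.5}}

# A March forecast uses the parameter set matching the current year's PIG class
this_year_class = classes[-1]
params = param_sets[this_year_class]
print(this_year_class, params)
```

Because the index is fully observed by the end of February, the class, and hence the parameter set, is known before the MAM forecast is issued.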

  9. Adaptive management: a paradigm for remediation of public facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janecky, David R; Whicker, Jeffrey J; Doerr, Ted B

    2009-01-01

Public facility restoration planning traditionally focused on response to natural disasters and hazardous materials accidental releases. These plans now need to integrate response to terrorist actions. Therefore, plans must address a wide range of potential vulnerabilities. Similar types of broad remediation planning are needed for restoration of waste and hazardous material handling areas and facilities. There are strong similarities in damage results and remediation activities between unintentional and terrorist actions; however, the uncertainties associated with terrorist actions result in a re-evaluation of approaches to planning. Restoration of public facilities following a release of a hazardous material is inherently far more complex than in confined industrial settings and has many unique technical, economic, social, and political challenges. Therefore, they arguably involve a superset of drivers, concerns and public agencies compared to other restoration efforts. This superset of conditions increases complexity of interactions, reduces our knowledge of the initial conditions, and even condenses the timeline for restoration response. Therefore, evaluations of alternative restoration management approaches developed for responding to terrorist actions provide useful knowledge for large, complex waste management projects. Whereas present planning documents have substantial linearity in their organization, the 'adaptive management' paradigm provides a constructive parallel operations paradigm for restoration of facilities that anticipates and plans for uncertainty, multiple/simultaneous public agency actions, and stakeholder participation. Adaptive management grew out of the need to manage and restore natural resources in highly complex and changing environments with limited knowledge about causal relationships and responses to restoration actions. 
Similarities between natural resource management and restoration of a facility and surrounding area(s) after a disruptive event suggest numerous advantages over preset linearly-structured plans, by incorporating the flexibility and overlap of processes inherent in effective facility restoration. We discuss three restoration case studies (the Hart Senate Office Building anthrax restoration, Rocky Flats actinide remediation, and hurricane destruction restoration) that implement aspects of adaptive management but not a formal approach. We propose that more formal adoption of adaptive management principles could be a basis for more flexible standards to improve site-specific remediation plans under conditions of high uncertainty.

  10. A Bayesian-Based Novel Methodology to Generate Reliable Site Response Mapping Sensitive to Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Chakraborty, A.; Goto, H.

    2017-12-01

The 2011 off the Pacific coast of Tohoku earthquake caused severe damage in many areas further inside the mainland because of site amplification. Furukawa district in Miyagi Prefecture, Japan recorded significant spatial differences in ground motion even at sub-kilometer scales. The site responses in the damage zone far exceeded the levels in the hazard maps. One reason for the mismatch is that maps follow only the mean value at the measurement locations, with no regard to the data uncertainties, and thus are not always reliable. Our research objective is to develop a methodology to incorporate data uncertainties in mapping and propose a reliable map. The methodology is based on a hierarchical Bayesian model of normally-distributed site responses in space, where the mean (μ), site-specific variance (σ2) and between-sites variance (s2) parameters are treated as unknowns with a prior distribution. The observation data are artificially created site responses with varying means and variances for 150 seismic events across 50 locations in one-dimensional space. Spatially auto-correlated random effects were added to the mean (μ) using a conditionally autoregressive (CAR) prior. Inferences on the unknown parameters are drawn from the posterior distribution using Markov Chain Monte Carlo methods. The goal is to find reliable estimates of μ that are sensitive to uncertainties. During initial trials, we observed that the tau (=1/s2) parameter of the CAR prior controls the μ estimation. Using a constraint, s = 1/(k×σ), five spatial models with varying k-values were created. We define reliability to be measured by the model likelihood and propose the maximum likelihood model to be highly reliable. The model with maximum likelihood was selected using a 5-fold cross-validation technique. The results show that the maximum likelihood model (μ*) follows the site-specific mean at low uncertainties and converges to the model-mean at higher uncertainties (Fig.1). 
This result is highly significant as it successfully incorporates the effect of data uncertainties in mapping. This novel approach can be applied to any research field using mapping techniques. The methodology is now being applied to real records from a very dense seismic network in Furukawa district, Miyagi Prefecture, Japan to generate a reliable map of the site responses.
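The CAR prior on the 50-location chain can be written down directly through its precision matrix; a sketch (the proper-CAR form, chain-graph adjacency, and parameter values are illustrative assumptions, not the study's exact specification):

```python
import numpy as np

def car_precision(n, tau, rho=0.99):
    # Precision matrix of a proper CAR prior on a 1-D chain of n sites:
    # Q = tau * (D - rho * W), with W the 0/1 adjacency matrix of neighbouring
    # sites and D = diag(number of neighbours). rho < 1 keeps Q positive
    # definite; tau (= 1/s^2) controls how strongly mu is smoothed in space.
    W = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

Q = car_precision(50, tau=4.0)
cov = np.linalg.inv(Q)
print(f"marginal sd at the central site: {np.sqrt(cov[25, 25]):.2f}")
```

Under this prior, the full conditional mean of each interior site is rho times the average of its two neighbours, which is the mechanism by which the tau parameter pulls the estimated μ toward a spatially smooth field.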

  11. Metrology applied to ultrasound characterization of trabecular bones using the AIB parameter

    NASA Astrophysics Data System (ADS)

    Braz, D. S.; Silva, C. E.; Alvarenga, A. V.; Junior, D. S.; Costa-Félix, R. P. B.

    2016-07-01

Apparent Integrated Backscatter (AIB) is derived from the Apparent Backscatter Transfer Function evaluated over the transducer bandwidth. Replicas of trabecular bones (cubes of 20 mm side length) created by a 3D printing technique were characterized using AIB with a 2.25 MHz center frequency transducer. A mechanical scanning system was used to acquire multiple backscatter signals. A measurement uncertainty model was proposed based on the Guide to the Expression of Uncertainty in Measurement. Initial AIB results are not metrologically reliable, presenting high measurement uncertainties (sample: 5_0.2032/AIB: -15.1 dB ± 13.9 dB). It is noteworthy that the proposed uncertainty model contributes in an unprecedented way to the metrological assessment of trabecular bone characterization using AIB.
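A common way to compute AIB is to average the apparent backscatter transfer function, the sample backscatter power relative to a reference in dB, over the transducer bandwidth. A sketch under that reading (the spectra, band edges, and averaging choice below are synthetic assumptions, not the paper's processing chain):

```python
import numpy as np

def aib(sample_spectrum, reference_spectrum, freqs, band):
    # Apparent Integrated Backscatter: apparent backscatter transfer function
    # (sample power relative to a reference, in dB) averaged over the band.
    f_lo, f_hi = band
    in_band = (freqs >= f_lo) & (freqs <= f_hi)
    abtf_db = 10.0 * np.log10(np.abs(sample_spectrum[in_band]) ** 2
                              / np.abs(reference_spectrum[in_band]) ** 2)
    return abtf_db.mean()

freqs = np.linspace(0.5e6, 4.0e6, 100)       # Hz
reference = np.ones_like(freqs)              # flat reference spectrum
sample = 0.5 * reference                     # echoes at half the amplitude
print(f"AIB = {aib(sample, reference, freqs, (1.5e6, 3.0e6)):.2f} dB")
```

Halving the echo amplitude quarters the power, so the sketch returns 10·log10(0.25) ≈ -6.02 dB; in practice the reference spectrum comes from a planar reflector and each scan position contributes one AIB value, whose spread feeds the uncertainty budget.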

  12. Uncertainty in Climate Change Research: An Integrated Approach

    NASA Astrophysics Data System (ADS)

    Mearns, L.

    2017-12-01

Uncertainty has been a major theme in climate change research from virtually the very beginning, and appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But of course there are important uncertainties in all aspects of climate change research, not just the climate system and emissions. And as climate change research has become more important and of pragmatic concern, as possible solutions to the climate change problem are addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great as or greater than that in the climate system. Still, other areas of uncertainty remain underexplored and/or undervalued, including uncertainty in vulnerability and governance. Without more thoroughly exploring these last uncertainties, we will likely underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. As part of this, I will discuss how the uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy; rather, false certainty is the true danger.

  13. Azimuthal anisotropy distributions in high-energy collisions

    NASA Astrophysics Data System (ADS)

    Yan, Li; Ollitrault, Jean-Yves; Poskanzer, Arthur M.

    2015-03-01

Elliptic flow in ultrarelativistic heavy-ion collisions results from the hydrodynamic response to the spatial anisotropy of the initial density profile. A long-standing problem in the interpretation of flow data is that uncertainties in the initial anisotropy are mingled with uncertainties in the response. We argue that the non-Gaussianity of flow fluctuations in small systems with large fluctuations can be used to disentangle the initial state from the response. We apply this method to recent measurements of anisotropic flow in Pb+Pb and p+Pb collisions at the LHC, assuming linear response to the initial anisotropy. The response coefficient is found to decrease as the system becomes smaller and is consistent with a low value of the ratio of shear viscosity to entropy density, η/s ≃ 0.19. Deviations from linear response are studied. While they significantly change the value of the response coefficient, they do not change the rate of decrease with centrality. Thus, we argue that the estimate of η/s is robust against non-linear effects.

  14. Uncertainty associated with the gravimetric measurement of particulate matter concentration in ambient air.

    PubMed

    Lacey, Ronald E; Faulkner, William Brock

    2015-07-01

This work applied a propagation of uncertainty method to typical total suspended particulate (TSP) sampling apparatus in order to estimate the overall measurement uncertainty. The objectives of this study were to estimate the uncertainty for three TSP samplers, develop an uncertainty budget, and determine the sensitivity of the total uncertainty to environmental parameters. The samplers evaluated were the TAMU High Volume TSP Sampler at a nominal volumetric flow rate of 1.42 m^3 min^-1 (50 CFM), the TAMU Low Volume TSP Sampler at a nominal volumetric flow rate of 17 L min^-1 (0.6 CFM), and the EPA TSP Sampler at nominal volumetric flow rates of 1.1 and 1.7 m^3 min^-1 (39 and 60 CFM). Under nominal operating conditions the overall measurement uncertainty was found to vary from 6.1 × 10^-6 g m^-3 to 18.0 × 10^-6 g m^-3, which represented an uncertainty of 1.7% to 5.2% of the measurement. Analysis of the uncertainty budget determined that three of the instrument parameters contributed significantly to the overall uncertainty: the uncertainty in the pressure drop measurement across the orifice meter during both calibration and testing, and the uncertainty of the airflow standard used during calibration of the orifice meter. Five environmental parameters occurring during field measurements were considered for their effect on overall uncertainty: ambient TSP concentration, volumetric airflow rate, ambient temperature, ambient pressure, and ambient relative humidity. Of these, only ambient TSP concentration and volumetric airflow rate were found to have a strong effect on the overall uncertainty. This work addresses the measurement uncertainty of TSP samplers used in ambient conditions.
Estimation of uncertainty in gravimetric measurements is of particular interest, since as ambient particulate matter (PM) concentrations approach regulatory limits, the uncertainty of the measurement is essential in determining the sample size and the probability of type II errors in hypothesis testing. This is an important factor in determining whether ambient PM concentrations exceed regulatory limits. The technique described in this paper can be applied to other measurement systems and is especially useful where no methods are available to generate these values empirically.
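Because the concentration is a pure product/quotient, C = Δm/(Q·t), the propagation-of-uncertainty step can be sketched in a few lines. The input values below are illustrative placeholders, not the paper's measured budget:

```python
import math

def tsp_concentration_uncertainty(dm, u_dm, q, u_q, t, u_t):
    """Propagate standard uncertainties through C = dm / (q * t).

    dm : particulate mass gain on the filter (g)
    q  : volumetric airflow rate (m^3 min^-1)
    t  : sampling duration (min)
    u_*: standard uncertainty of each input
    """
    c = dm / (q * t)
    # For a pure product/quotient, relative uncertainties add in quadrature.
    rel = math.sqrt((u_dm / dm) ** 2 + (u_q / q) ** 2 + (u_t / t) ** 2)
    return c, c * rel

# Illustrative inputs: a 24 h sample on the 1.42 m^3 min^-1 high-volume sampler.
c, u_c = tsp_concentration_uncertainty(
    dm=0.010, u_dm=0.0001,  # 10 mg mass gain, 0.1 mg balance uncertainty (assumed)
    q=1.42, u_q=0.03,       # nominal flow; flow uncertainty is assumed
    t=1440, u_t=1.0)        # 24 h in minutes
print(f"C = {c:.2e} g/m^3, relative uncertainty = {100 * u_c / c:.1f}%")
```

With these placeholder inputs the flow-rate term dominates the budget, which mirrors the paper's finding that airflow-related parameters are among the significant contributors.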

  15. Adaptive Management and the Value of Information: Learning Via Intervention in Epidemiology

    PubMed Central

    Shea, Katriona; Tildesley, Michael J.; Runge, Michael C.; Fonnesbeck, Christopher J.; Ferrari, Matthew J.

    2014-01-01

    Optimal intervention for disease outbreaks is often impeded by severe scientific uncertainty. Adaptive management (AM), long-used in natural resource management, is a structured decision-making approach to solving dynamic problems that accounts for the value of resolving uncertainty via real-time evaluation of alternative models. We propose an AM approach to design and evaluate intervention strategies in epidemiology, using real-time surveillance to resolve model uncertainty as management proceeds, with foot-and-mouth disease (FMD) culling and measles vaccination as case studies. We use simulations of alternative intervention strategies under competing models to quantify the effect of model uncertainty on decision making, in terms of the value of information, and quantify the benefit of adaptive versus static intervention strategies. Culling decisions during the 2001 UK FMD outbreak were contentious due to uncertainty about the spatial scale of transmission. The expected benefit of resolving this uncertainty prior to a new outbreak on a UK-like landscape would be £45–£60 million relative to the strategy that minimizes livestock losses averaged over alternate transmission models. AM during the outbreak would be expected to recover up to £20.1 million of this expected benefit. AM would also recommend a more conservative initial approach (culling of infected premises and dangerous contact farms) than would a fixed strategy (which would additionally require culling of contiguous premises). For optimal targeting of measles vaccination, based on an outbreak in Malawi in 2010, AM allows better distribution of resources across the affected region; its utility depends on uncertainty about both the at-risk population and logistical capacity. 
When daily vaccination rates are highly constrained, the optimal initial strategy is to conduct a small, quick campaign; a reduction in expected burden of approximately 10,000 cases could result if campaign targets can be updated on the basis of the true susceptible population. Formal incorporation of a policy to update future management actions in response to information gained in the course of an outbreak can change the optimal initial response and result in significant cost savings. AM provides a framework for using multiple models to facilitate public-health decision making and an objective basis for updating management actions in response to improved scientific understanding. PMID:25333371

  16. Adaptive management and the value of information: learning via intervention in epidemiology

    USGS Publications Warehouse

    Shea, Katriona; Tildesley, Michael J.; Runge, Michael C.; Fonnesbeck, Christopher J.; Ferrari, Matthew J.

    2014-01-01

    Optimal intervention for disease outbreaks is often impeded by severe scientific uncertainty. Adaptive management (AM), long-used in natural resource management, is a structured decision-making approach to solving dynamic problems that accounts for the value of resolving uncertainty via real-time evaluation of alternative models. We propose an AM approach to design and evaluate intervention strategies in epidemiology, using real-time surveillance to resolve model uncertainty as management proceeds, with foot-and-mouth disease (FMD) culling and measles vaccination as case studies. We use simulations of alternative intervention strategies under competing models to quantify the effect of model uncertainty on decision making, in terms of the value of information, and quantify the benefit of adaptive versus static intervention strategies. Culling decisions during the 2001 UK FMD outbreak were contentious due to uncertainty about the spatial scale of transmission. The expected benefit of resolving this uncertainty prior to a new outbreak on a UK-like landscape would be £45–£60 million relative to the strategy that minimizes livestock losses averaged over alternate transmission models. AM during the outbreak would be expected to recover up to £20.1 million of this expected benefit. AM would also recommend a more conservative initial approach (culling of infected premises and dangerous contact farms) than would a fixed strategy (which would additionally require culling of contiguous premises). For optimal targeting of measles vaccination, based on an outbreak in Malawi in 2010, AM allows better distribution of resources across the affected region; its utility depends on uncertainty about both the at-risk population and logistical capacity. 
When daily vaccination rates are highly constrained, the optimal initial strategy is to conduct a small, quick campaign; a reduction in expected burden of approximately 10,000 cases could result if campaign targets can be updated on the basis of the true susceptible population. Formal incorporation of a policy to update future management actions in response to information gained in the course of an outbreak can change the optimal initial response and result in significant cost savings. AM provides a framework for using multiple models to facilitate public-health decision making and an objective basis for updating management actions in response to improved scientific understanding.

  17. Uncertainties in Predicting Rice Yield by Current Crop Models Under a Wide Range of Climatic Conditions

    NASA Technical Reports Server (NTRS)

    Li, Tao; Hasegawa, Toshihiro; Yin, Xinyou; Zhu, Yan; Boote, Kenneth; Adam, Myriam; Bregaglio, Simone; Buis, Samuel; Confalonieri, Roberto; Fumoto, Tamon; hide

    2014-01-01

Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We evaluated 13 rice models against multi-year experimental yield data at four sites with diverse climatic conditions in Asia and examined whether different modeling approaches to major physiological processes contribute to the uncertainty in predicting field-measured yields and in the sensitivity to changes in temperature and CO2 concentration [CO2]. We also examined whether use of an ensemble of crop models can reduce these uncertainties. Individual models did not consistently reproduce both experimental and regional yields well, and uncertainty was larger at the warmest and coolest sites. The variation in yield projections was larger among crop models than the variation resulting from 16 global climate model-based scenarios. However, the mean of the predictions of all crop models reproduced the experimental data, with an uncertainty of less than 10 percent of measured yields. Using an ensemble of eight models calibrated only for phenology, or of five models calibrated in detail, resulted in an uncertainty equivalent to that of the measured yield in well-controlled agronomic field experiments. Sensitivity analysis indicates the necessity of improving the accuracy in predicting both biomass and harvest index in response to increasing [CO2] and temperature.

  18. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  19. Uncertainty Analysis of OC5-DeepCwind Floating Semisubmersible Offshore Wind Test Campaign: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N

This paper examines how to assess the uncertainty levels for test measurements of the Offshore Code Comparison, Continued, with Correlation (OC5)-DeepCwind floating offshore wind system, examined within the OC5 project. The goal of the OC5 project was to validate the accuracy of ultimate and fatigue load estimates from a numerical model of the floating semisubmersible using data measured during scaled tank testing of the system under wind and wave loading. The examination of uncertainty was done after the test, and it was found that the limited amount of data available did not allow for an acceptable uncertainty assessment. Therefore, this paper instead qualitatively examines the sources of uncertainty associated with this test to start a discussion of how to assess uncertainty for these types of experiments and to summarize what should be done during future testing to acquire the information needed for a proper uncertainty assessment. Foremost, future validation campaigns should initiate numerical modeling before testing to guide the test campaign, which should include a rigorous assessment of uncertainty, and perform validation during testing to ensure that the tests address all of the validation needs.

  20. And yet it moves! Involving transient flow conditions is the logical next step for WHPA analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

As the first line of defense among different safety measures, Wellhead Protection Areas (WHPAs) have been broadly used to protect drinking water wells against sources of pollution. In most cases, their implementation relies on simplifications, such as assuming homogeneous or zonated aquifer conditions or considering steady-state flow scenarios. Obviously, both assumptions inevitably invoke errors. However, while uncertainty due to aquifer heterogeneity has been extensively studied in the literature, the impact of transient flow conditions has received very little attention. For instance, WHPA maps in the offices of water supply companies are fixed maps derived from steady-state models, although the actual catchments are transient. To mitigate high computational costs, we approximate transiency by means of a dynamic superposition of steady-state flow solutions. Then, we analyze four transient drivers that often appear on the seasonal scale: (I) regional groundwater flow direction, (II) strength of the regional hydraulic gradient, (III) natural recharge to the groundwater, and (IV) pumping rate. The integration of transiency in WHPA analysis leads to time-frequency maps, which express for each location the temporal frequency of catchment membership. Furthermore, we account for the uncertainty due to incomplete knowledge of geological and transiency conditions through Monte Carlo simulations. The main contribution of this study is to show the need to enhance groundwater well protection by incorporating transient flow conditions in WHPA analysis. To support and complement this statement, we demonstrate that 1) each transient driver imprints an individual spatial pattern on the required WHPA, and we rank their influence through a global sensitivity analysis; 2) we compare the influence of transient conditions with that of geological uncertainty in terms of areal WHPA demand;
3) we show that considering geological uncertainty alone is insufficient in the presence of transient conditions; and 4) we propose a practical decision rule for selecting a proper reliability level of protection in the presence of both transiency and geological uncertainty.

  1. A bi-objective model for robust yard allocation scheduling for outbound containers

    NASA Astrophysics Data System (ADS)

    Liu, Changchun; Zhang, Canrong; Zheng, Li

    2017-01-01

This article examines the yard allocation problem for outbound containers, with consideration of uncertainty factors, mainly the arrival and operation times of calling vessels. Based on the time buffer inserting method, a bi-objective model is constructed to minimize the total operational cost and to maximize robustness against uncertainty. Due to the NP-hardness of the constructed model, a two-stage heuristic is developed to solve the problem. In the first stage, initial solutions are obtained by a greedy algorithm that looks n steps ahead, with the uncertainty factors set to their respective expected values; in the second stage, based on the solutions obtained in the first stage and with consideration of uncertainty factors, a neighbourhood search heuristic is employed to generate robust solutions that better withstand fluctuations in the uncertainty factors. Finally, extensive numerical experiments are conducted to test the performance of the proposed method.

  2. Climate Projections and Uncertainty Communication.

    PubMed

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  3. Ocean Predictability and Uncertainty Forecasts Using Local Ensemble Transfer Kalman Filter (LETKF)

    NASA Astrophysics Data System (ADS)

    Wei, M.; Hogan, P. J.; Rowley, C. D.; Smedstad, O. M.; Wallcraft, A. J.; Penny, S. G.

    2017-12-01

Ocean predictability and uncertainty are studied with an ensemble system that has been developed based on the US Navy's operational HYCOM using the Local Ensemble Transform Kalman Filter (LETKF) technique. One of the advantages of this method is that the best possible initial analysis states for the HYCOM forecasts are provided by the LETKF, which assimilates operational observations using an ensemble method. The background covariance during this assimilation process is supplied implicitly by the ensemble, avoiding the difficult task of developing tangent linear and adjoint models out of HYCOM, with its complicated hybrid isopycnal vertical coordinate, for 4D-Var. The flow-dependent background covariance from the ensemble will be an indispensable part of the next-generation hybrid 4D-Var/ensemble data assimilation system. The predictability and uncertainty of the ocean forecasts are studied initially for the Gulf of Mexico. The results are compared with another ensemble system using the Ensemble Transform (ET) method, which has been used in the Navy's operational center. The advantages and disadvantages are discussed.
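The flow-dependent background covariance that an ensemble supplies implicitly is simply the sample covariance of the forecast perturbations. A minimal sketch, with toy state size and member count rather than anything resembling HYCOM's dimensions:

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, n_members = 5, 20                     # toy dimensions, for illustration only
X = rng.standard_normal((n_state, n_members))  # forecast ensemble, one column per member

x_mean = X.mean(axis=1, keepdims=True)  # ensemble-mean state
Xp = X - x_mean                         # perturbation matrix X'
B = Xp @ Xp.T / (n_members - 1)         # background covariance B = X'X'^T / (m - 1)

# B is what a variational scheme would otherwise need tangent-linear/adjoint
# models to approximate; the ensemble provides it directly, and it changes
# with the flow because the forecast perturbations do.
print(B.shape)
```

In a real LETKF this covariance is never formed explicitly in state space; the analysis is computed in the low-dimensional ensemble subspace, locally around each grid point.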

  4. On the logistic equation subject to uncertainties in the environmental carrying capacity and initial population density

    NASA Astrophysics Data System (ADS)

    Dorini, F. A.; Cecconello, M. S.; Dorini, L. B.

    2016-04-01

It is recognized that handling uncertainty is essential to obtain more reliable results in modeling and computer simulation. This paper aims to discuss the logistic equation subject to uncertainties in two parameters: the environmental carrying capacity, K, and the initial population density, N0. We first provide the closed-form results for the first probability density function of the time-population density, N(t), and of its inflection point, t*. We then use the Maximum Entropy Principle to determine the density functions of both K and N0, treating these parameters as independent random variables and considering fluctuations of their values for a situation that commonly occurs in practice. Finally, closed-form results for the density functions and statistical moments of N(t), for a fixed t > 0, and of t* are provided for the uniform distribution case. We carried out numerical experiments to validate the theoretical results and compared them against those obtained using Monte Carlo simulation.
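A Monte Carlo cross-check of the uniform case can be sketched from the closed-form logistic solution and its inflection time. The growth rate r and the uniform ranges below are assumptions for illustration, not values from the paper:

```python
import math
import random

def logistic(t, r, K, N0):
    """Closed-form logistic solution N(t) = K*N0 / (N0 + (K - N0)*exp(-r*t))."""
    return K * N0 / (N0 + (K - N0) * math.exp(-r * t))

def inflection_time(r, K, N0):
    """Inflection point t*, where N reaches K/2 (requires N0 < K/2)."""
    return math.log((K - N0) / N0) / r

random.seed(1)
r = 0.5                                  # assumed growth rate
samples_N, samples_t = [], []
for _ in range(100_000):
    K = random.uniform(90.0, 110.0)      # uncertain carrying capacity (assumed range)
    N0 = random.uniform(5.0, 15.0)       # uncertain initial density (assumed range)
    samples_N.append(logistic(4.0, r, K, N0))  # N(t) sampled at fixed t = 4
    samples_t.append(inflection_time(r, K, N0))

mean_N = sum(samples_N) / len(samples_N)
mean_t = sum(samples_t) / len(samples_t)
print(f"E[N(4)] ~ {mean_N:.1f}, E[t*] ~ {mean_t:.2f}")
```

The sample means of N(t) and t* obtained this way are the quantities one would compare against the paper's closed-form moments for the uniform distribution case.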

  5. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Treesearch

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  6. Quantum gravity in the sky: interplay between fundamental theory and observations

    NASA Astrophysics Data System (ADS)

    Ashtekar, Abhay; Gupt, Brajesh

    2017-01-01

Observational missions have provided us with a reliable model of the evolution of the universe starting from the last scattering surface all the way to future infinity. Furthermore, given a specific model of inflation, using quantum field theory on curved space-times this history can be pushed back in time to the epoch when space-time curvature was some 10^62 times that at the horizon of a solar mass black hole! However, to extend the history further back to the Planck regime requires input from quantum gravity. An important aspect of this input is the choice of the background quantum geometry and of the Heisenberg state of cosmological perturbations thereon, motivated by Planck scale physics. This paper introduces first steps in that direction. Specifically, we propose two principles that link quantum geometry and Heisenberg uncertainties in the Planck epoch with late time physics and explore in detail the observational consequences of the initial conditions they select. We find that the predicted temperature-temperature (T-T) correlations for scalar modes are indistinguishable from standard inflation at small angular scales even though the initial conditions are now set in the deep Planck regime. However, there is a specific power suppression at large angular scales. As a result, the predicted spectrum provides a better fit to the PLANCK mission data than standard inflation, where the initial conditions are set in the general relativity regime. Thus, our proposal brings out a deep interplay between the ultraviolet and the infrared. Finally, the proposal also leads to specific predictions of power suppression at large angular scales for the (T-E and E-E) correlations involving electric polarization. The PLANCK team is expected to release these data in the coming year.

  7. Physical Limits on the Predictability of Erosion and Sediment Transport by Landslides and Debris Flows

    NASA Astrophysics Data System (ADS)

    Iverson, R. M.

    2015-12-01

    Episodic landslides and debris flows play a key role in sculpting many steep landscapes, and they also pose significant natural hazards. Field evidence, laboratory experiments, and theoretical analyses show that variations in the quantity, speed, and distance of sediment transport by landslides and debris flows can depend strongly on nuanced differences in initial conditions. Moreover, initial conditions themselves can be strongly dependent on the geological legacy of prior events. The scope of these dependencies is revealed by the results of landslide dynamics experiments [Iverson et al., Science, 2000], debris-flow erosion experiments [Iverson et al., Nature Geosci., 2011], and numerical simulations of the highly destructive 2014 Oso, Washington, landslide [Iverson et al., Earth Planet. Sci. Let., 2015]. In each of these cases, feedbacks between basal sediment deformation and pore-pressure generation cause the speed and distance of sediment transport to be very sensitive to subtle differences in the ambient sediment porosity and water content. On the other hand, the onset of most landslides and debris flows depends largely on pore-water pressure distributions and only indirectly on sediment porosity and water content. Thus, even if perfect predictions of the locations and timing of landslides and debris flows were available, the dynamics of the events - and their consequent hazards and sediment transport - would be difficult to predict. This difficulty is a manifestation of the nonlinear physics involved, rather than of poor understanding of those physics. Consequently, physically based models for assessing the hazards and sediment transport due to landslides and debris flows must take into account both evolving nonlinear dynamics and inherent uncertainties about initial conditions. By contrast, landscape evolution models that use prescribed algebraic formulas to represent sediment transport by landslides and debris flows lack a sound physical basis.

  8. Photochemical parameters of atmospheric source gases: accurate determination of OH reaction rate constants over atmospheric temperatures, UV and IR absorption spectra

    NASA Astrophysics Data System (ADS)

    Orkin, V. L.; Khamaganov, V. G.; Martynova, L. E.; Kurylo, M. J.

    2012-12-01

    The emissions of halogenated (Cl, Br containing) organics of both natural and anthropogenic origin contribute to the balance of and changes in the stratospheric ozone concentration. The associated chemical cycles are initiated by the photochemical decomposition of the portion of source gases that reaches the stratosphere. Reactions with hydroxyl radicals and photolysis are the main processes dictating the compound lifetime in the troposphere and release of active halogen in the stratosphere for a majority of halogen source gases. Therefore, the accuracy of photochemical data is of primary importance for the purpose of comprehensive atmospheric modeling and for simplified kinetic estimations of global impacts on the atmosphere, such as in ozone depletion (i.e., the Ozone Depletion Potential, ODP) and climate change (i.e., the Global Warming Potential, GWP). The sources of critically evaluated photochemical data for atmospheric modeling, NASA/JPL Publications and IUPAC Publications, recommend uncertainties within 10%-60% for the majority of OH reaction rate constants with only a few cases where uncertainties lie at the low end of this range. These uncertainties can be somewhat conservative because evaluations are based on the data from various laboratories obtained during the last few decades. Nevertheless, even the authors of the original experimental works rarely estimate the total combined uncertainties of the published OH reaction rate constants to be less than ca. 10%. Thus, uncertainties in the photochemical properties of potential and current atmospheric trace gases obtained under controlled laboratory conditions still may constitute a major source of uncertainty in estimating the compound's environmental impact. One of the purposes of the presentation is to illustrate the potential for obtaining accurate laboratory measurements of the OH reaction rate constant over the temperature range of atmospheric interest. 
A detailed inventory of the accountable sources of instrumental uncertainty in our FP-RF experiment indicates that the total uncertainty of the OH reaction rate constant can be as small as ca. 2-3%. The high precision of the kinetic measurements allows reliable determination of weak temperature dependences of the rate constants and clear resolution of the curvature of the Arrhenius plots for the OH reaction rate constants of various compounds. The results of OH reaction rate constant determinations between 220 K and 370 K will be presented. Similarly, the accuracy of UV and IR absorption measurements will be highlighted to provide an improved basis for atmospheric modeling.

  9. Electroencephalographic Evidence of Abnormal Anticipatory Uncertainty Processing in Gambling Disorder Patients.

    PubMed

    Megías, Alberto; Navas, Juan F; Perandrés-Gómez, Ana; Maldonado, Antonio; Catena, Andrés; Perales, José C

    2018-06-01

    Putting money at stake produces anticipatory uncertainty, a process that has been linked to key features of gambling. Here we examined how learning and individual differences modulate the stimulus preceding negativity (SPN, an electroencephalographic signature of perceived uncertainty of valued outcomes) in gambling disorder patients (GDPs) and healthy controls (HCs), during a non-gambling contingency learning task. Twenty-four GDPs and 26 HCs performed a causal learning task under conditions of high and medium uncertainty (HU, MU; null and positive cue-outcome contingency, respectively). Participants were asked to predict the outcome trial-by-trial, and to regularly judge the strength of the cue-outcome contingency. A pre-outcome SPN was extracted from simultaneous electroencephalographic recordings for each participant, uncertainty level, and task block. The two groups similarly learnt to predict the occurrence of the outcome in the presence/absence of the cue. In HCs, SPN amplitude decreased as the outcome became predictable in the MU condition, a decrement that was absent in the HU condition, where the outcome remained unpredictable during the task. Most importantly, GDPs' SPN remained high and insensitive to task type and block. In GDPs, the SPN amplitude was linked to gambling preferences. When both groups were considered together, SPN amplitude was also related to impulsivity. GDPs thus showed an abnormal electrophysiological response to outcome uncertainty, not attributable to faulty contingency learning. Differences with controls were larger in frequent players of passive games, and smaller in players of more active games. Potential psychological mechanisms underlying this set of effects are discussed.

  10. Ultimate open pit stochastic optimization

    NASA Astrophysics Data System (ADS)

    Marcotte, Denis; Caron, Josiane

    2013-02-01

Classical open pit optimization (the maximum closure problem) is made on block estimates, without directly considering the uncertainty of the block grades. We propose an alternative approach of stochastic optimization. The stochastic optimization is taken as the optimal pit computed on the block expected profits, rather than expected grades, computed from a series of conditional simulations. The stochastic optimization generates, by construction, larger ore and waste tonnages than the classical optimization. Contrary to the classical approach, the stochastic optimization is conditionally unbiased for the realized profit given the predicted profit. A series of simulated deposits with different variograms are used to compare the stochastic approach, the classical approach, and the simulated approach that maximizes expected profit among simulated designs. Profits obtained with the stochastic optimization are generally larger than those of the classical or simulated pit. The main factor controlling the relative gain of the stochastic optimization compared to the classical approach and the simulated pit is shown to be the information level as measured by the borehole spacing/range ratio. The relative gains of the stochastic approach over the classical approach increase with the treatment costs but decrease with mining costs. The relative gains of the stochastic approach over the simulated pit approach increase with both the treatment and mining costs. At early stages of an open pit project, when uncertainty is large, the stochastic optimization approach appears preferable to the classical approach or the simulated pit approach for fair comparison of the values of alternative projects and for the initial design and planning of the open pit.
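The core distinction, the profit of the expected grade versus the expected profit over conditional simulations, can be sketched for a single block. The profit function, prices, costs, cutoff, and the Gaussian stand-in for conditional simulations below are all illustrative assumptions, not the paper's setup:

```python
import random

def block_profit(grade, price=50.0, treat_cost=12.0, mine_cost=2.0, cutoff=0.3):
    """Profit of one mined block: treated as ore if grade >= cutoff, else waste."""
    if grade >= cutoff:
        return price * grade - treat_cost - mine_cost
    return -mine_cost  # mined but sent to waste

random.seed(0)
# Stand-in for conditional simulations of the block grade.
sims = [random.gauss(0.35, 0.15) for _ in range(10_000)]

classical = block_profit(sum(sims) / len(sims))              # profit of the expected grade
stochastic = sum(block_profit(g) for g in sims) / len(sims)  # expected profit over simulations

print(f"classical: {classical:.2f}  stochastic: {stochastic:.2f}")
```

Because the ore/waste decision makes the profit non-linear in grade, the two numbers differ; optimizing the pit on expected profits rather than expected grades is what gives the stochastic design its conditional unbiasedness.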

  11. The Efficacy of Blue-Green Infrastructure for Pluvial Flood Prevention under Conditions of Deep Uncertainty

    NASA Astrophysics Data System (ADS)

    Babovic, Filip; Mijic, Ana; Madani, Kaveh

    2017-04-01

    Urban areas around the world are growing in size and importance; however, cities experience elevated risks of pluvial flooding due to the prevalence of impermeable land surfaces within them. Urban planners and engineers encounter a great deal of uncertainty when planning adaptations to these flood risks, due to the interaction of multiple factors such as climate change and land use change. This leads to conditions of deep uncertainty. Blue-Green (BG) solutions utilise natural vegetation and processes to absorb and retain runoff while providing a host of other social, economic and environmental services. When utilised in conjunction with Decision Making under Deep Uncertainty (DMDU) methodologies, BG infrastructure provides a flexible and adaptable method of "no-regret" adaptation; resulting in a practical, economically efficient, and socially acceptable solution for flood risk mitigation. This work presents the methodology for analysing the impact of BG infrastructure in the context of the Adaptation Tipping Points approach to protect against pluvial flood risk in an iterative manner. An economic analysis of the adaptation pathways is also conducted in order to better inform decision-makers on the benefits and costs of the adaptation options presented. The methodology was applied to a case study in the Cranbrook Catchment in the North East of London. Our results show that BG infrastructure performs better under conditions of uncertainty than traditional grey infrastructure.

  12. Modular Approaches to Earth Science Scientific Computing: 3D Electromagnetic Induction Modeling as an Example

    NASA Astrophysics Data System (ADS)

    Tandon, K.; Egbert, G.; Siripunvaraporn, W.

    2003-12-01

    We are developing a modular system for three-dimensional inversion of electromagnetic (EM) induction data, using an object-oriented programming approach. This approach allows us to modify the individual components of the proposed inversion scheme, and also to reuse the components for a wide variety of problems in earth science computing. In particular, the modularity allows us to (a) change modeling codes independently of inversion algorithm details; (b) experiment with new inversion algorithms; and (c) modify the way prior information is imposed in the inversion to test competing hypotheses and techniques required to solve an earth science problem. Our initial code development is for the EM induction equations on a staggered grid, using iterative solution techniques in 3D. An example illustrated here is an experiment with the sensitivity of 3D magnetotelluric (MT) inversion to uncertainties in the boundary conditions required for regional induction problems. These boundary conditions should reflect the large-scale geoelectric structure of the study area, which is usually poorly constrained. In general, for inversion of MT data one fixes boundary conditions at the edge of the model domain and adjusts the earth's conductivity structure within the modeling domain. Allowing for errors in specification of the open boundary values is simple in principle, but no existing inversion codes that we are aware of have this feature. Adding such a feature is straightforward within the context of the modular approach. More generally, a modular approach provides an efficient methodology for setting up earth science computing problems to test various ideas. As a concrete illustration relevant to EM induction problems, we investigate the sensitivity of MT data near the San Andreas Fault at Parkfield (California) to uncertainties in the regional geoelectric structure.
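    The decoupling described, swapping a forward solver without touching the inversion driver, can be sketched with a minimal object-oriented interface; the class and method names are illustrative, not the project's actual code:

```python
from abc import ABC, abstractmethod

class ForwardModel(ABC):
    """Contract a forward solver must satisfy; the inversion driver depends
    only on this interface, never on a particular solver implementation."""
    @abstractmethod
    def predict(self, model_params):
        ...

class InversionDriver:
    """Evaluates a regularized data misfit through the ForwardModel interface,
    so modeling codes and regularization can be changed independently."""
    def __init__(self, forward: ForwardModel, regularizer):
        self.forward = forward
        self.regularizer = regularizer

    def misfit(self, model_params, data):
        pred = self.forward.predict(model_params)
        data_term = sum((p - d) ** 2 for p, d in zip(pred, data))
        return data_term + self.regularizer(model_params)

class ToySolver(ForwardModel):
    """Stand-in for a 3-D EM solver; here the 'physics' is a simple linear map."""
    def predict(self, model_params):
        return [2.0 * m for m in model_params]

driver = InversionDriver(ToySolver(), regularizer=lambda m: 0.1 * sum(x * x for x in m))
result = driver.misfit([1.0, 2.0], data=[2.1, 3.9])
print(result)
```

    Replacing `ToySolver` with a staggered-grid EM code, or replacing the regularizer to impose different prior information, requires no change to `InversionDriver`, which is the essence of the modular argument made above.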

  13. Quantifying uncertainty in discharge measurements: A new approach

    USGS Publications Warehouse

    Kiang, J.E.; Cohn, T.A.; Mason, R.R.

    2009-01-01

    The accuracy of discharge measurements using velocity meters and the velocity-area method is typically assessed based on empirical studies that may not correspond to conditions encountered in practice. In this paper, a statistical approach for assessing uncertainty based on interpolated variance estimation (IVE) is introduced. The IVE method quantifies all sources of random uncertainty in the measured data. This paper presents results employing data from sites where substantial over-sampling allowed for the comparison of IVE-estimated uncertainty and observed variability among repeated measurements. These results suggest that the IVE approach can provide approximate estimates of measurement uncertainty. The use of IVE to estimate the uncertainty of a discharge measurement would provide the hydrographer an immediate determination of uncertainty and help determine whether there is a need for additional sampling in problematic river cross sections. © 2009 ASCE.
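    The IVE method itself is specific to the paper, but the underlying velocity-area (midsection) computation, with a crude first-order uncertainty propagation standing in for IVE, can be sketched as follows; the cross-section data and the 5% per-panel error are hypothetical:

```python
import math

def panel_discharges(stations, depths, velocities):
    """Midsection method: each vertical's v*d applies over the width reaching
    halfway to its neighbours. Units: m, m, m/s -> panel discharges in m^3/s."""
    n = len(stations)
    panels = []
    for i in range(n):
        left = stations[max(i - 1, 0)]
        right = stations[min(i + 1, n - 1)]
        panels.append(velocities[i] * depths[i] * (right - left) / 2.0)
    return panels

stations = [0.0, 1.0, 2.0, 3.0, 4.0]    # distance across section (m)
depths = [0.0, 0.8, 1.2, 0.7, 0.0]      # water depth at each vertical (m)
velocities = [0.0, 0.4, 0.6, 0.3, 0.0]  # mean velocity at each vertical (m/s)

panels = panel_discharges(stations, depths, velocities)
Q = sum(panels)

# If each panel discharge carries an independent 5% random error, the total
# standard uncertainty adds in quadrature (a simplification, not IVE itself).
s = 0.05
sigma_q = s * math.sqrt(sum(p * p for p in panels))
print(Q, sigma_q)
```

    Because the panel errors add in quadrature, the relative uncertainty of the total is smaller than that of any single panel, which is one reason hydrographers add verticals in problematic cross sections.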

  14. Exploring entropic uncertainty relation in the Heisenberg XX model with inhomogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Huang, Ai-Jun; Wang, Dong; Wang, Jia-Ming; Shi, Jia-Dong; Sun, Wen-Yang; Ye, Liu

    2017-08-01

    In this work, we investigate the quantum-memory-assisted entropic uncertainty relation in a two-qubit Heisenberg XX model with an inhomogeneous magnetic field. We find that a larger coupling strength J between the two spin-chain qubits can effectively reduce the entropic uncertainty. Besides, we examine how the inhomogeneous field influences the uncertainty, and find that when the inhomogeneity parameter b < 1, the uncertainty decreases as b decreases; conversely, when b > 1, the uncertainty increases with decreasing b. Intriguingly, the entropic uncertainty can shrink to zero when the coupling coefficients are relatively large, while it only reduces to 1 as the homogeneous magnetic field increases. Additionally, we examine the purity of the state and Bell non-locality, and find that the entropic uncertainty is anticorrelated with both the purity and the Bell non-locality of the evolved state.
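    For reference, the quantum-memory-assisted entropic uncertainty relation underlying analyses of this kind (due to Berta et al.) can be stated as:

```latex
S(Q|B) + S(R|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{i,j} \bigl| \langle \psi_i | \phi_j \rangle \bigr|^{2},
```

    where S(X|B) is the conditional von Neumann entropy of the outcome of measuring observable X given the quantum memory B, and c measures the overlap of the eigenbases {|ψ_i⟩} of Q and {|φ_j⟩} of R. A negative S(A|B), i.e. entanglement between the measured qubit and the memory, lowers the bound below the memoryless value, which is consistent with the vanishing uncertainty reported here for large coupling strength.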

  15. Dealing with uncertainties in environmental burden of disease assessment

    PubMed Central

    2009-01-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making. PMID:19400963
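    The DALY aggregation described in the first sentence can be sketched as YLD + YLL; the sketch below omits discounting and age weighting, and all numbers are hypothetical:

```python
def dalys(cases, disability_weight, duration_years, deaths, life_years_lost_per_death):
    """DALY = YLD + YLL: years lived with disability plus years of life lost
    (no discounting or age weighting in this minimal version)."""
    yld = cases * disability_weight * duration_years
    yll = deaths * life_years_lost_per_death
    return yld + yll

# Hypothetical environmental exposure: 10,000 people with a mild one-year
# condition (disability weight 0.02) and 5 attributable premature deaths.
burden = dalys(cases=10_000, disability_weight=0.02, duration_years=1.0,
               deaths=5, life_years_lost_per_death=30.0)
print(burden)
```

    Each input above (case counts, disability weights, durations, attributable deaths) is itself uncertain, which is exactly the kind of parameter uncertainty, alongside the deeper model and framing uncertainties, that the review argues must be made explicit.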

  16. Litter decomposition patterns and dynamics across biomes: Initial results from the global TeaComposition initiative

    NASA Astrophysics Data System (ADS)

    Djukic, Ika; Kappel Schmidt, Inger; Steenberg Larsen, Klaus; Beier, Claus

    2017-04-01

    Litter decomposition represents one of the largest fluxes in the global terrestrial carbon cycle and a number of large-scale decomposition experiments have been conducted focusing on this fundamental soil process. However, previous studies were most often based on site-specific litters and methodologies. The contrasting litter and soil types used and the general lack of common protocols still pose a major challenge, adding major uncertainty to meta-analyses across different experiments and sites. In the TeaComposition initiative, we aim to investigate litter decomposition by using standardized substrates (tea) to compare temporal litter decomposition rates across different ecosystems worldwide. To this end, Lipton tea bags (Rooibos and Green Tea) have been buried in the H-A or Ah horizon and incubated over a period of 36 months at 400 sites covering diverse ecosystems in 9 zonobiomes. We measured initial litter chemistry and litter mass loss 3 months after the start of decomposition, and linked the decomposition rates to site and climatic conditions as well as to existing decomposition rates of the local litter. We will present and discuss the outcomes of this study. Acknowledgment: We are thankful to colleagues from more than 300 sites who participated in the implementation of this initiative and who are not yet mentioned individually as co-authors.
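    Mass-loss measurements like those described are commonly converted to a decomposition rate with a single-pool exponential model; a minimal sketch with hypothetical masses (note the actual Tea Bag Index protocol uses a two-pool formulation, so this is a simplification):

```python
import math

def decay_constant(initial_mass, remaining_mass, t_days):
    """Single-pool exponential decay W(t) = W0 * exp(-k t); solve for k (per day)."""
    return -math.log(remaining_mass / initial_mass) / t_days

# Hypothetical 90-day (3-month) retrievals for a fast- and a slow-decomposing tea:
k_green = decay_constant(2.0, 1.0, 90.0)     # green tea: 50% mass remaining
k_rooibos = decay_constant(2.0, 1.6, 90.0)   # rooibos: 80% mass remaining
print(k_green, k_rooibos)
```

    Comparing k across sites for the same standardized substrate removes litter quality as a confounder, leaving climate and soil conditions as the drivers of the remaining variation.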

  17. Policy for the Unpredictable (Uncertainty Research and Policy).

    ERIC Educational Resources Information Center

    Glass, Gene V.

    1979-01-01

    Most of the variance in educational effectiveness studies is inexplicable in terms of influences that can be measured and controlled. Nevertheless, it is still possible to design educational policy that will function well under conditions of uncertainty. (Author/RLV)

  18. Full-waveform and discrete-return lidar in salt marsh environments: An assessment of biophysical parameters, vertical uncertainty, and nonparametric DEM correction

    NASA Astrophysics Data System (ADS)

    Rogers, Jeffrey N.

    High-resolution and high-accuracy elevation data sets of coastal salt marsh environments are necessary to support restoration and other management initiatives, such as adaptation to sea level rise. Lidar (light detection and ranging) data may serve this need by enabling efficient acquisition of detailed elevation data from an airborne platform. However, previous research has revealed that lidar data tend to have lower vertical accuracy (i.e., greater uncertainty) in salt marshes than in other environments. The increase in vertical uncertainty in lidar data of salt marshes can be attributed primarily to low, dense-growing salt marsh vegetation. Unfortunately, this increased vertical uncertainty often renders lidar-derived digital elevation models (DEM) ineffective for analysis of topographic features controlling tidal inundation frequency and ecology. This study aims to address these challenges by providing a detailed assessment of the factors influencing lidar-derived elevation uncertainty in marshes. The information gained from this assessment is then used to: 1) test the ability to predict marsh vegetation biophysical parameters from lidar-derived metrics, and 2) develop a method for improving salt marsh DEM accuracy. Discrete-return and full-waveform lidar, along with RTK GNSS (Real-time Kinematic Global Navigation Satellite System) reference data, were acquired for four salt marsh systems characterized by four major taxa (Spartina alterniflora, Spartina patens, Distichlis spicata, and Salicornia spp.) on Cape Cod, Massachusetts. 
These data were used to: 1) develop an innovative combination of full-waveform lidar and field methods to assess the vertical distribution of aboveground biomass as well as its light blocking properties; 2) investigate lidar elevation bias and standard deviation using varying interpolation and filtering methods; 3) evaluate the effects of seasonality (temporal differences between peak growth and senescent conditions) using lidar data flown in summer and spring; 4) create new products, called Relative Uncertainty Surfaces (RUS), from lidar waveform-derived metrics and determine their utility; and 5) develop and test five nonparametric regression model algorithms (MARS -- Multivariate Adaptive Regression Splines, CART -- Classification and Regression Trees, TreeNet, Random Forests, and GPSM -- Generalized Path Seeker) with 13 predictor variables derived from both discrete and full waveform lidar sources in order to develop a method of improving lidar DEM quality. Results of this study indicate strong correlations for Spartina alterniflora (r > 0.9) between vertical biomass (VB), the distribution of vegetation biomass by height, and vertical obscuration (VO), the measure of the vertical distribution of the ratio of vegetation to airspace. It was determined that simple, feature-based lidar waveform metrics, such as waveform width, can provide new information to estimate salt marsh vegetation biophysical parameters such as vegetation height. The results also clearly illustrate the influence of seasonality, species, and lidar interpolation and filtering methods on elevation uncertainty in salt marshes. Relative uncertainty surfaces generated from lidar waveform features were determined useful in qualitative/visual assessment of lidar elevation uncertainty and correlate well with vegetation height and presence of Spartina alterniflora. 
Finally, DEMs generated using full-waveform predictor models produced corrections (compared to ground based RTK GNSS elevations) with R2 values of up to 0.98 and slopes within 4% of a perfect 1:1 correlation. The findings from this research have strong potential to advance tidal marsh mapping, research and management initiatives.
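    The bias-correction step can be illustrated with a minimal stand-in: predicting elevation bias (lidar minus RTK elevation) from lidar-derived metrics via k-nearest-neighbour regression, in place of the five nonparametric algorithms actually tested. All data below are synthetic and the predictor names are assumptions:

```python
import numpy as np

def knn_bias_correction(train_features, train_bias, query_features, k=5):
    """Predict elevation bias at query points as the mean bias of the k nearest
    training points in feature space -- a minimal nonparametric regressor."""
    preds = []
    for q in query_features:
        dist = np.linalg.norm(train_features - q, axis=1)
        nearest = np.argsort(dist)[:k]
        preds.append(train_bias[nearest].mean())
    return np.array(preds)

rng = np.random.default_rng(1)
# Synthetic predictors, e.g. waveform width and return intensity (normalized).
X = rng.uniform(0.0, 1.0, size=(200, 2))
# Synthetic bias: denser/taller vegetation (larger waveform width) -> larger bias.
bias = 0.15 * X[:, 0] + 0.02 * rng.normal(size=200)

X_query = rng.uniform(0.0, 1.0, size=(20, 2))
predicted_bias = knn_bias_correction(X, bias, X_query)
# A corrected DEM would subtract the predicted bias from each lidar elevation.
print(predicted_bias.round(3))
```

    The real study's full-waveform predictors carry vegetation-structure information that discrete returns lack, which is why waveform-based models achieved the R² values near 0.98 quoted above.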

  19. Treatment of uncertainty in artificial intelligence

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1988-01-01

    The present assessment of the development status of research efforts concerned with AI reasoning under conditions of uncertainty emphasizes the importance of appropriateness in the approach selected for both the epistemic and the computational levels. At the former level, attention is given to the form of uncertainty-representation and the fidelity of its reflection of actual problems' uncertainties; at the latter level, such issues as the availability of the requisite information and the complexity of the reasoning process must be considered. The tradeoff between these levels must always be the focus of AI system-developers' attention.

  20. Investigation and incorporation of water inflow uncertainties through stochastic modelling in a combined optimisation methodology for water allocation in Alfeios River (Greece)

    NASA Astrophysics Data System (ADS)

    Bekri, Eleni; Yannopoulos, Panayotis; Disse, Markus

    2014-05-01

    The Alfeios River plays a vital role for Western Peloponnisos in Greece from a natural, ecological, social and economic perspective. The main river and its six tributaries, forming the longest watercourse with the highest streamflow rate in the Peloponnisos, represent a significant source of water supply for the region, serving the demands of a variety of water users, including irrigation, drinking water supply, hydropower production and recreation. At the previous EGU General Assembly, a fuzzy-boundary-interval linear programming methodology, based on Li et al. (2010) and Bekri et al. (2012), was presented for optimal water allocation under uncertain and vague system conditions in the Alfeios River Basin. Uncertainties associated with the benefit and cost coefficients in the objective function of the main water uses (hydropower production and irrigation) were expressed as probability distributions and fuzzy boundary intervals derived from associated α-cut levels. The uncertainty of the monthly water inflows was not incorporated in that initial application, and the analysis of all other sources of uncertainty was applied to two extreme hydrologic years, represented by a selected wet and dry year. To manage and operate the river system, decision makers should be able to analyze and evaluate the impact of various hydrologic scenarios. In the present work, the critical uncertain parameter of water inflows is analyzed and its incorporation as an additional type of uncertainty in the suggested methodology is investigated, in order to enable the assessment of optimal water allocation for hydrologic and socio-economic scenarios based both on historical data and on projected climate change conditions. 
For this purpose, a stochastic simulation analysis for part of the Alfeios river system is undertaken, testing various stochastic models, from simple stationary ones (AR and ARMA) and Thomas-Fiering through ARIMA, to more sophisticated and complete schemes such as CASTALIA. A short description and comparison of their assumptions, the differences between them, and a presentation of the results are included. References: Li, Y.P., Huang, G.H. and Nie, S.L. (2010), Planning water resources management systems using a fuzzy boundary interval-stochastic programming method, Advances in Water Resources, 33: 1105-1117. doi:10.1016/j.advwatres.2010.06.015. Bekri, E.S., Disse, M. and Yannopoulos, P.C. (2012), Methodological framework for correction of quick river discharge measurements using quality characteristics, Session of Environmental Hydraulics - Hydrodynamics, 2nd Common Conference of Hellenic Hydrotechnical Association and Greek Committee for Water Resources Management, pp. 546-557 (in Greek).
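    Among the models listed, the Thomas-Fiering scheme is simple enough to sketch; a minimal monthly inflow generator with hypothetical seasonal statistics (not Alfeios data):

```python
import numpy as np

def thomas_fiering(monthly_mean, monthly_std, monthly_r, n_years, q0, seed=0):
    """Thomas-Fiering synthetic monthly streamflow:
    q[t+1] = mean[m+1] + b[m]*(q[t] - mean[m]) + e * std[m+1] * sqrt(1 - r[m]^2),
    with b[m] = r[m] * std[m+1] / std[m], where r[m] is the lag-1 correlation
    between month m and month m+1, and e is standard normal noise."""
    rng = np.random.default_rng(seed)
    q = [q0]
    for t in range(12 * n_years):
        m, m1 = t % 12, (t + 1) % 12
        b = monthly_r[m] * monthly_std[m1] / monthly_std[m]
        q_next = (monthly_mean[m1] + b * (q[-1] - monthly_mean[m])
                  + rng.normal() * monthly_std[m1] * np.sqrt(1.0 - monthly_r[m] ** 2))
        q.append(max(q_next, 0.0))  # truncate physically impossible negative flows
    return np.array(q[1:])

# Hypothetical monthly inflow statistics (m^3/s) with winter-wet seasonality.
means = np.array([60, 70, 80, 70, 50, 30, 15, 10, 12, 25, 40, 55], dtype=float)
stds = 0.3 * means
r = np.full(12, 0.6)
flows = thomas_fiering(means, stds, r, n_years=50, q0=60.0)
print(round(float(flows.mean()), 1))
```

    Each synthetic trace preserves the monthly means, variances, and month-to-month correlation of the fitted record, so an ensemble of such traces can drive the allocation model under many equally plausible hydrologic scenarios.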

  1. Identification of growth phases and influencing factors in cultivations with AGE1.HN cells using set-based methods.

    PubMed

    Borchers, Steffen; Freund, Susann; Rath, Alexander; Streif, Stefan; Reichl, Udo; Findeisen, Rolf

    2013-01-01

    Production of bio-pharmaceuticals in cell culture, such as mammalian cells, is challenging. Mathematical models can provide support to the analysis, optimization, and the operation of production processes. In particular, unstructured models are suited for these purposes, since they can be tailored to particular process conditions. To this end, growth phases and the most relevant factors influencing cell growth and product formation have to be identified. Due to noisy and erroneous experimental data, unknown kinetic parameters, and the large number of combinations of influencing factors, currently there are only limited structured approaches to tackle these issues. We outline a structured set-based approach to identify different growth phases and the factors influencing cell growth and metabolism. To this end, measurement uncertainties are taken explicitly into account to bound the time-dependent specific growth rate based on the observed increase of the cell concentration. Based on the bounds on the specific growth rate, we can identify qualitatively different growth phases and (in-)validate hypotheses on the factors influencing cell growth and metabolism. We apply the approach to a mammalian suspension cell line (AGE1.HN). We show that growth in batch culture can be divided into two main growth phases. The initial phase is characterized by exponential growth dynamics, which can be described consistently by a relatively simple unstructured and segregated model. The subsequent phase is characterized by a decrease in the specific growth rate, which, as shown, results from substrate limitation and the pH of the medium. An extended model is provided which describes the observed dynamics of cell growth and main metabolites, and the corresponding kinetic parameters as well as their confidence intervals are estimated. The study is complemented by an uncertainty and outlier analysis. 
Overall, we demonstrate utility of set-based methods for analyzing cell growth and metabolism under conditions of uncertainty.
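    The core bounding step, turning measurement uncertainty into guaranteed bounds on the specific growth rate, can be sketched with simple interval arithmetic; the concentrations and the 5% measurement error below are hypothetical, and this is only a one-interval illustration of the set-based idea, not the authors' full algorithm:

```python
import math

def growth_rate_bounds(x1, x2, dt, rel_err):
    """Bound the average specific growth rate mu over an interval of length dt,
    given that each cell-concentration measurement is only known to lie within
    +/- rel_err of its reading (set-based: no noise distribution is assumed).
    Since x(t2) = x(t1) * exp(mu * dt), the extreme consistent values of mu come
    from pairing the lowest/highest admissible concentrations."""
    mu_lo = math.log((x2 * (1 - rel_err)) / (x1 * (1 + rel_err))) / dt
    mu_hi = math.log((x2 * (1 + rel_err)) / (x1 * (1 - rel_err))) / dt
    return mu_lo, mu_hi

# Hypothetical samples: concentration doubles in 24 h, 5% measurement uncertainty.
mu_lo, mu_hi = growth_rate_bounds(1.0e6, 2.0e6, 24.0, 0.05)
print(round(mu_lo, 4), round(mu_hi, 4))
```

    A hypothesis about the growth phase (e.g. "growth is still exponential at rate mu_max") is invalidated whenever mu_max falls outside the guaranteed interval [mu_lo, mu_hi], which is how the set-based approach separates the two phases without assuming a noise distribution.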

  2. Identification of Growth Phases and Influencing Factors in Cultivations with AGE1.HN Cells Using Set-Based Methods

    PubMed Central

    Borchers, Steffen; Freund, Susann; Rath, Alexander; Streif, Stefan; Reichl, Udo; Findeisen, Rolf

    2013-01-01

    Production of bio-pharmaceuticals in cell culture, such as mammalian cells, is challenging. Mathematical models can provide support to the analysis, optimization, and the operation of production processes. In particular, unstructured models are suited for these purposes, since they can be tailored to particular process conditions. To this end, growth phases and the most relevant factors influencing cell growth and product formation have to be identified. Due to noisy and erroneous experimental data, unknown kinetic parameters, and the large number of combinations of influencing factors, currently there are only limited structured approaches to tackle these issues. We outline a structured set-based approach to identify different growth phases and the factors influencing cell growth and metabolism. To this end, measurement uncertainties are taken explicitly into account to bound the time-dependent specific growth rate based on the observed increase of the cell concentration. Based on the bounds on the specific growth rate, we can identify qualitatively different growth phases and (in-)validate hypotheses on the factors influencing cell growth and metabolism. We apply the approach to a mammalian suspension cell line (AGE1.HN). We show that growth in batch culture can be divided into two main growth phases. The initial phase is characterized by exponential growth dynamics, which can be described consistently by a relatively simple unstructured and segregated model. The subsequent phase is characterized by a decrease in the specific growth rate, which, as shown, results from substrate limitation and the pH of the medium. An extended model is provided which describes the observed dynamics of cell growth and main metabolites, and the corresponding kinetic parameters as well as their confidence intervals are estimated. The study is complemented by an uncertainty and outlier analysis. 
Overall, we demonstrate utility of set-based methods for analyzing cell growth and metabolism under conditions of uncertainty. PMID:23936299

  3. An Overview of the Tropospheric Aerosol Radiative Forcing Observational Experiment

    NASA Technical Reports Server (NTRS)

    Russell, P. B.; Chan, K. Roland (Technical Monitor)

    1997-01-01

    Aerosol effects on atmospheric radiation are a leading source of uncertainty in predicting future climate. As a result, the International Global Atmospheric Chemistry Program has established a Focus on Atmospheric Aerosols (IGAC/FAA) and endorsed a series of aerosol field campaigns. TARFOX, the second in the IGAC/FAA series, was designed to reduce this uncertainty by measuring aerosol properties and effects in the US eastern seaboard, where one of the world's major plumes of industrial haze moves from the continent over the Atlantic Ocean. TARFOX's objectives are to: 1. Make simultaneous measurements of: (a) aerosol effects on radiation fields, and (b) the chemical, physical, and optical properties of the aerosols causing those effects. 2. Perform a variety of closure studies by using overdetermined data sets to test the mutual consistency of measurements and calculations of a wide range of aerosol properties and effects. 3. Use the results of the closure studies to assess and reduce uncertainties in estimates of aerosol radiative forcing, as well as to guide future field programs. An important subset of the closure studies is tests and improvements of algorithms used to derive aerosol properties and radiative effects from satellite measurements. The TARFOX Intensive Field Period (IFP) was conducted July 10-31, 1996. It included coordinated measurements from four satellites (GOES-8, NOAA-14, ERS-2, LANDSAT), four aircraft (ER-2, C-130, C-131, and a modified Cessna), land sites, and ships. A variety of aerosol conditions was sampled, ranging from relatively clean behind frontal passages to moderately polluted with aerosol optical depths exceeding 0.5 at mid-visible wavelengths. The latter conditions included separate incidents of enhancements caused primarily by anthropogenic sources and another incident of enhancement apparently influenced by recent fog processing. 
Spatial gradients of aerosol optical thickness were sampled to aid in isolating aerosol effects from other radiative effects and to more tightly constrain closure tests, including those of satellite retrievals. This talk gives an overview of TARFOX goals, rationale, methods, and initial key findings.

  4. The resolution of ambiguity as the basis for life: A cellular bridge between Western reductionism and Eastern holism.

    PubMed

    Torday, John S; Miller, William B

    2017-12-01

    Boundary conditions enable cellular life through negentropy, chemiosmosis, and homeostasis as identifiable First Principles of Physiology. Self-referential awareness of status arises from this organized state to sustain homeostatic imperatives. Preferred homeostatic status is dependent upon the appraisal of information and its communication. However, among living entities, sources of information and their dissemination are always imprecise. Consequently, living systems exist within an innate state of ambiguity. It is presented that cellular life and evolutionary development are a self-organizing cellular response to uncertainty in iterative conformity with its basal initiating parameters. Viewing the life circumstance in this manner permits a reasoned unification between Western rational reductionism and Eastern holism. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Integral sliding mode-based attitude coordinated tracking for spacecraft formation with communication delays

    NASA Astrophysics Data System (ADS)

    Zhang, Jian; Hu, Qinglei; Xie, Wenbo

    2017-11-01

    This paper investigates the attitude coordinated tracking control for a group of rigid spacecraft under directed communication topology, in which inertia uncertainties, external disturbances, input saturation and constant time-delays between the formation members are handled. Initially, the nominal system with communication delays is studied. A delay-dependent controller is proposed by using Lyapunov-Krasovskii function and sufficient condition for system stability is derived. Then, an integral sliding manifold is designed and adaptive control approach is employed to deal with the total perturbation. Meanwhile, the boundary layer method is introduced to alleviate the unexpected chattering as system trajectories cross the switching surface. Finally, numerical simulation results are presented to validate the effectiveness and robustness of the proposed control strategy.
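    One common form of an integral sliding manifold for attitude tracking (the paper's exact design may differ) is:

```latex
s(t) = \omega_e(t) + \int_{0}^{t} \bigl( k_1\, q_{ev}(\tau) + k_2\, \omega_e(\tau) \bigr)\, \mathrm{d}\tau ,
```

    where q_ev and ω_e denote the vector part of the error quaternion and the angular-velocity tracking error, and k_1, k_2 > 0 are design gains. On the manifold s = 0 the tracking errors obey stable second-order dynamics, and choosing s(0) = 0 eliminates the reaching phase, so robustness to the matched perturbation holds from the initial time.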

  6. Initial Probabilistic Evaluation of Reactor Pressure Vessel Fracture with Grizzly and Raven

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spencer, Benjamin; Hoffman, William; Sen, Sonat

    2015-10-01

    The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled reactor pressure vessels (RPVs). Grizzly can be used to model the thermal/mechanical response of an RPV under transient conditions that would be observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. These capabilities have been demonstrated previously. A typical RPV is likely to contain a large population of pre-existing flaws introduced during the manufacturing process. This flaw population is characterized statistically through probability density functions of the flaw distributions. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation during a transient event. This report documents initial work to perform probabilistic analysis of RPV fracture during a PTS event using a combination of the RAVEN risk analysis code and Grizzly. This work is limited in scope, considering only a single flaw with deterministic geometry, but with uncertainty introduced in the parameters that influence fracture toughness. These results are benchmarked against equivalent models run in the FAVOR code. When fully developed, the RAVEN/Grizzly methodology for modeling probabilistic fracture in RPVs will provide a general capability that can be used to consider a wider variety of vessel and flaw conditions that are difficult to consider with current tools. 
In addition, this will provide access to advanced probabilistic techniques provided by RAVEN, including adaptive sampling and parallelism, which can dramatically decrease run times.
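    The probabilistic fracture assessment can be illustrated with a minimal Monte Carlo sketch: sample an uncertain fracture toughness and count how often the applied stress intensity factor exceeds it. The lognormal toughness parameters and applied load below are hypothetical, not values from the FAVOR/Grizzly benchmark:

```python
import math
import random

def crack_initiation_probability(n_samples, k_applied, seed=0):
    """Monte Carlo estimate of P(K_I > K_Ic): sample fracture toughness K_Ic
    (MPa*sqrt(m)) from a hypothetical lognormal distribution and compare it
    against a deterministic applied stress intensity factor K_I."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        k_ic = math.exp(rng.gauss(math.log(60.0), 0.25))  # median 60, log-std 0.25
        if k_applied > k_ic:
            failures += 1
    return failures / n_samples

p_fail = crack_initiation_probability(100_000, k_applied=45.0)
print(p_fail)
```

    In a full analysis the sampler would be driven by RAVEN over many uncertain parameters (and adaptively, to resolve small failure probabilities cheaply), with Grizzly supplying K_I from the coupled thermal/mechanical transient rather than a fixed number.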

  7. Early adolescent adversity inflates threat estimation in females and promotes alcohol use initiation in both sexes.

    PubMed

    Walker, Rachel A; Andreansky, Christopher; Ray, Madelyn H; McDannald, Michael A

    2018-06-01

    Childhood adversity is associated with exaggerated threat processing and earlier alcohol use initiation. Conclusive links remain elusive, as childhood adversity typically co-occurs with detrimental socioeconomic factors, and its impact is likely moderated by biological sex. To unravel the complex relationships among childhood adversity, sex, threat estimation, and alcohol use initiation, we exposed female and male Long-Evans rats to early adolescent adversity (EAA). In adulthood, >50 days following the last adverse experience, threat estimation was assessed using a novel fear discrimination procedure in which cues predict a unique probability of footshock: danger (p = 1.00), uncertainty (p = .25), and safety (p = .00). Alcohol use initiation was assessed using voluntary access to 20% ethanol, >90 days following the last adverse experience. During development, EAA slowed body weight gain in both females and males. In adulthood, EAA selectively inflated female threat estimation, exaggerating fear to uncertainty and safety, but promoted alcohol use initiation across sexes. Meaningful relationships between threat estimation and alcohol use initiation were not observed, underscoring the independent effects of EAA. Results isolate the contribution of EAA to adult threat estimation, alcohol use initiation, and reveal moderation by biological sex. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  8. Advances in Medical Analytics Solutions for Autonomous Medical Operations on Long-Duration Missions

    NASA Technical Reports Server (NTRS)

    Thompson, David E.; Lindsey, Antonia Edward

    2017-01-01

    A review will be presented of the progress made under STMD Game Changing Development Program funding toward the development of a Medical Decision Support System (MDSS) for augmenting crew capabilities during long-duration missions, such as Mars transit. To create an MDSS, initial work requires acquiring images and developing models that analyze and assess the features in medical biosensor images that support medical assessment of pathologies. For FY17, the project has focused on ultrasound images of cardiac pathologies: namely, evaluation and assessment of pericardial effusion identification and its discrimination from related pneumothorax and even bladder-induced infections that cause inflammation around the heart. This identification is substantially complicated by uncertainty in fluid behavior under microgravity. This talk will present and discuss the work to date in this project, identifying conditions under which various machine learning technologies, deep learning via convolutional neural nets, and statistical learning methods for feature identification and classification can be employed, with results conditioned into graphical format in preparation for attachment to an inference engine that ultimately creates decision support recommendations for a remote crew in a triage setting.

  9. Application of synthetic scenarios to address water resource concerns: A management-guided case study from the Upper Colorado River Basin

    USGS Publications Warehouse

    McAfee, Stephanie A.; Pederson, Gregory T.; Woodhouse, Connie A.; McCabe, Gregory

    2017-01-01

    Water managers are increasingly interested in better understanding and planning for projected resource impacts from climate change. In this management-guided study, we use a very large suite of synthetic climate scenarios in a statistical modeling framework to simultaneously evaluate how (1) average temperature and precipitation changes, (2) initial basin conditions, and (3) temporal characteristics of the input climate data influence water-year flow in the Upper Colorado River. The results here suggest that existing studies may underestimate the degree of uncertainty in future streamflow, particularly under moderate temperature and precipitation changes. However, we also find that the relative severity of future flow projections within a given climate scenario can be estimated with simple metrics that characterize the input climate data and basin conditions. These results suggest that simple testing, like the analyses presented in this paper, may be helpful in understanding differences between existing studies or in identifying specific conditions for physically based mechanistic modeling. Both options could reduce overall cost and improve the efficiency of conducting climate change impacts studies.

  10. CFD Simulations to Improve Ventilation in Low-Income Housing

    NASA Astrophysics Data System (ADS)

    Ho, Rosemond; Gorle, Catherine

    2017-11-01

Quality of housing plays an important role in public health. In Dhaka, Bangladesh, the leading causes of death include tuberculosis, lower respiratory infections, and chronic obstructive pulmonary disease, so improving home ventilation could potentially mitigate these negative health effects. The goal of this project is to use computational fluid dynamics (CFD) to predict the relative effectiveness of different ventilation strategies for Dhaka homes. A Reynolds-averaged Navier-Stokes CFD model of a standard Dhaka home with apertures of different sizes and locations was developed to predict air exchange rates. Our initial focus is on simulating ventilation driven by buoyancy alone, which is often considered the limiting case in natural ventilation design. We explore the relationship between ventilation rate and aperture area to determine the most promising configurations for optimal ventilation solutions. Future research will include the modeling of wind-driven conditions, and extensive uncertainty quantification studies to investigate the effect of variability in the layout of homes and neighborhoods, and in local wind and temperature conditions. The ultimate objective is to formulate robust design recommendations that can reduce risks of respiratory illness in low-income housing.
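Before full CFD, the buoyancy-alone limiting case can be bounded with the standard single-zone stack equation. A minimal sketch (generic textbook formula with illustrative opening areas and temperatures, not this project's model or data):

```python
import math

def stack_ventilation_rate(area_m2, height_m, t_in_k, t_out_k, cd=0.6):
    """Buoyancy-driven (stack) volumetric flow rate, in m3/s, through two
    equal openings separated vertically by height_m; single-zone formula
    with discharge coefficient cd."""
    g = 9.81
    dt = t_in_k - t_out_k
    return cd * area_m2 * math.sqrt(2.0 * g * height_m * dt / t_out_k)

# Hypothetical opening of 0.5 m2, 2 m between openings, 5 K indoor excess
q = stack_ventilation_rate(area_m2=0.5, height_m=2.0, t_in_k=303.0, t_out_k=298.0)
```

Larger apertures and larger indoor-outdoor temperature differences both increase the rate, which is the relationship the aperture-area study probes.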

  11. Upper Cervical Epidural Abscess in Clinical Practice: Diagnosis and Management

    PubMed Central

    Al-Hourani, Khalid; Al-Aref, Rami; Mesfin, Addisu

    2015-01-01

Study Design: Narrative review. Objective: Upper cervical epidural abscess (UCEA) is a rare surgical emergency. Despite increasing incidence, uncertainty remains as to how it should initially be managed. Risk factors for UCEA include immunocompromised hosts, diabetes mellitus, and intravenous drug use. Our objective is to provide a comprehensive overview of the literature including the history, clinical manifestations, diagnosis, and management of UCEA. Methods: Using PubMed, studies published prior to 2015 were analyzed. We used the keywords “Upper cervical epidural abscess,” “C1 osteomyelitis,” “C2 osteomyelitis,” “C1 epidural abscess,” “C2 epidural abscess.” We excluded cases with tuberculosis. Results: The review addresses epidemiology, etiology, imaging, microbiology, and diagnosis of this condition. We also address the nonoperative and operative management options and the relative indications for each as reviewed in the literature. Conclusion: A high index of suspicion is required to diagnose this rare condition, with magnetic resonance imaging being the imaging modality of choice. There has been a shift toward surgical management of this condition in recent times, with favorable outcomes. PMID:27190742

  12. Cumulative uncertainty in measured streamflow and water quality data for small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; Cooper, R.J.; Slade, R.M.; Haney, R.L.; Arnold, J.G.

    2006-01-01

The scientific community has not established an adequate understanding of the uncertainty inherent in measured water quality data, which is introduced by four procedural categories: streamflow measurement, sample collection, sample preservation/storage, and laboratory analysis. Although previous research has produced valuable information on relative differences in procedures within these categories, little information is available that compares the procedural categories or presents the cumulative uncertainty in resulting water quality data. As a result, quality control emphasis is often misdirected, and data uncertainty is typically either ignored or accounted for with an arbitrary margin of safety. Faced with the need for scientifically defensible estimates of data uncertainty to support water resource management, the objectives of this research were to: (1) compile selected published information on uncertainty related to measured streamflow and water quality data for small watersheds, (2) use a root mean square error propagation method to compare the uncertainty introduced by each procedural category, and (3) use the error propagation method to determine the cumulative probable uncertainty in measured streamflow, sediment, and nutrient data. Best case, typical, and worst case "data quality" scenarios were examined. Averaged across all constituents, the calculated cumulative probable uncertainty (±%) contributed under typical scenarios ranged from 6% to 19% for streamflow measurement, from 4% to 48% for sample collection, from 2% to 16% for sample preservation/storage, and from 5% to 21% for laboratory analysis. Under typical conditions, errors in storm loads ranged from 8% to 104% for dissolved nutrients, from 8% to 110% for total N and P, and from 7% to 53% for TSS. Results indicated that uncertainty can increase substantially under poor measurement conditions and limited quality control effort. This research provides introductory scientific estimates of uncertainty in measured water quality data. The results and procedures presented should also assist modelers in quantifying the "quality" of calibration and evaluation data sets, determining model accuracy goals, and evaluating model performance.
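The root mean square propagation used to combine the four procedural categories can be sketched directly; the percentages below are illustrative placeholders, not values from the study:

```python
import math

def cumulative_uncertainty(category_uncertainties):
    """Combine independent procedural uncertainties (each in +/- %)
    by root mean square error propagation."""
    return math.sqrt(sum(u ** 2 for u in category_uncertainties))

# Hypothetical "typical scenario" percentages for one constituent:
# streamflow measurement, sample collection, preservation/storage, lab analysis
typical = [10.0, 20.0, 5.0, 10.0]
total = cumulative_uncertainty(typical)
```

Because the terms add in quadrature, the largest single category dominates the cumulative value, which is why quality control effort is best aimed at the worst-controlled procedural step.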

  13. Anomaly transform methods based on total energy and ocean heat content norms for generating ocean dynamic disturbances for ensemble climate forecasts

    NASA Astrophysics Data System (ADS)

    Romanova, Vanya; Hense, Andreas

    2017-08-01

In our study we use the anomaly transform, a special case of the ensemble transform method, in which a selected set of initial oceanic anomalies in space, time and variables is defined and orthogonalized. The resulting orthogonal perturbation patterns are designed such that they pick up typical balanced anomaly structures in space and time and between variables. The metric used to set up the eigenproblem is taken either as the weighted total energy, with its zonal and meridional kinetic and available potential energy terms having equal contributions, or as the weighted ocean heat content, in which a disturbance is applied only to the initial temperature fields. The reference states for defining the initial anomalies are chosen such that perturbations on either seasonal or interannual timescales are constructed. These project a priori only onto the slow modes of the ocean physical processes, such that the disturbances grow mainly in the Western Boundary Currents, in the Antarctic Circumpolar Current and in the El Niño-Southern Oscillation regions. An additional set of initial conditions is designed to fit, in a least squares sense, data from a global ocean reanalysis. Applying the AT-produced sets of disturbances to observation-initialized oceanic initial conditions of the MPIOM-ESM coupled model at T63L47/GR15 resolution, four ensemble experiments and one hindcast experiment were performed. The weighted total energy norm is used to monitor the amplitudes and rates of the fastest growing error modes. The results showed minor dependence of the instabilities or error growth on the selected metric, but considerable change due to the magnitude of the scaling amplitudes of the perturbation patterns. In contrast to similar atmospheric applications, we find an energy conversion from kinetic to available potential energy, which suggests a different source of uncertainty generation in the ocean than in the atmosphere, mainly associated with changes in the density field.
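The orthogonalization at the heart of an anomaly/ensemble transform can be sketched as an eigenproblem in a chosen metric. A generic sketch with a random anomaly matrix and an assumed diagonal weight vector (illustrative, not the MPIOM-ESM configuration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_state, m = 200, 6
A = rng.standard_normal((n_state, m))   # columns = raw initial anomalies
w = rng.uniform(0.5, 2.0, n_state)      # diagonal metric, e.g. total-energy weights

# Solve the small m x m eigenproblem of the Gram matrix in the chosen norm
B = A.T @ (w[:, None] * A)
lam, V = np.linalg.eigh(B)

# Transformed patterns are orthonormal under the metric: P^T diag(w) P = I
P = A @ V / np.sqrt(lam)
```

Each column of P is a linear combination of the original anomalies, so the patterns inherit their balanced spatial structure while becoming mutually orthogonal in the selected (energy or heat-content) norm.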

  14. Uncertainty assessment of future land use in Brazil under increasing demand for bioenergy

    NASA Astrophysics Data System (ADS)

    van der Hilst, F.; Verstegen, J. A.; Karssenberg, D.; Faaij, A.

    2013-12-01

Environmental impacts of a future increase in demand for bioenergy depend on the magnitude, location and pattern of the direct and indirect land use change of energy cropland expansion. Here we aim at (1) projecting the spatio-temporal pattern of sugar cane expansion and the effect on other land uses in Brazil towards 2030, and (2) assessing the uncertainty therein. For the spatio-temporal projection, three model components are used: (1) an initial land use map that shows the initial amount and location of sugar cane and all other relevant land use classes in the system, (2) a model to project the quantity of change of all land uses, and (3) a spatially explicit land use model that determines the location of change of all land uses. All three model components are sources of uncertainty, which is quantified by defining error models for all components and their inputs and propagating these errors through the chain of components. No recent accurate land use map is available for Brazil, so municipal census data and the global land cover map GlobCover are combined to create the initial land use map. The census data are disaggregated stochastically using GlobCover as a probability surface, to obtain a stochastic land use raster map for 2006. Since bioenergy is a global market, the quantity of change in sugar cane in Brazil depends on dynamics both in Brazil itself and in other parts of the world. Therefore, a computable general equilibrium (CGE) model, MAGNET, is run to produce a time series of the relative change of all land uses given an increased future demand for bioenergy. A sensitivity analysis provides upper and lower bounds on this change, which define this component's error model. An initial selection of drivers of location for each land use class is extracted from the literature.
Using a Bayesian data assimilation technique and census data from 2007 to 2011 as observational data, the model is identified, meaning that the final selection and optimal relative importance of the drivers of location are determined. The data assimilation technique takes into account uncertainty in the observational data and yields a stochastic representation of the identified model. Using all stochastic inputs, this land use change model is run to find at which locations the future land use changes occur and to quantify the associated uncertainty. The results indicate that, in the initial land use map, the locations of pastures are especially uncertain. Since the dynamics in the livestock sector play a major role in the land use development of Brazil, the effect of this uncertainty on the model output is large. Results of the data assimilation indicate that the drivers of location of the land uses vary over time (variations up to 50% in the importance of the drivers), making it difficult to find a solid stationary system representation. Overall, we conclude that projection up to 2030 is only of use for quantifying impacts that act at a larger aggregation level, because at the local level the uncertainty is too large.

  15. Quantifying the effectiveness of conservation measures to control the spread of anthropogenic hybridization in stream salmonids: a climate adaptation case study

    USGS Publications Warehouse

    Al-Chokhachy, Robert K.; Muhlfeld, Clint C.; Boyer, Matthew; Jones, Leslie A.; Steed, Amber; Kershner, Jeffrey L.

    2014-01-01

    Quantifying the effectiveness of management actions to mitigate the effects of changing climatic conditions (i.e., climate adaptation) can be difficult, yet critical for conservation. We used population genetic data from 1984 to 2011 to assess the degree to which ambient climatic conditions and targeted suppression of sources of nonnative Rainbow Trout Oncorhynchus mykiss have influenced the spread of introgressive hybridization in native populations of Westslope Cutthroat Trout O. clarkii lewisi. We found rapid expansion in the spatial distribution and proportion of nonnative genetic admixture in hybridized populations from 1984 to 2004, but minimal change since 2004. The spread of hybridization was negatively correlated with the number of streamflow events in May that exceeded the 75th percentile of historic flows (r = −0.98) and positively correlated with August stream temperatures (r = 0.89). Concomitantly, suppression data showed a 60% decline in catch per unit effort for fish with a high proportion of Rainbow Trout admixture, rendering some uncertainty as to the relative strength of factors controlling the spread of hybridization. Our results illustrate the importance of initiating management actions to mitigate the potential effects of climate change, even where data describing the effectiveness of such actions are initially limited but the risks are severe.

  16. Stochastic Coastal/Regional Uncertainty Modelling: a Copernicus marine research project in the framework of Service Evolution

    NASA Astrophysics Data System (ADS)

    Vervatis, Vassilios; De Mey, Pierre; Ayoub, Nadia; Kailas, Marios; Sofianos, Sarantis

    2017-04-01

The project entitled Stochastic Coastal/Regional Uncertainty Modelling (SCRUM) aims at strengthening CMEMS in the areas of ocean uncertainty quantification, ensemble consistency verification and ensemble data assimilation. The project has been initiated by the University of Athens and LEGOS/CNRS research teams, in the framework of CMEMS Service Evolution. The work is based on stochastic modelling of ocean physics and biogeochemistry in the Bay of Biscay, on an identical sub-grid configuration of the IBI-MFC system in its latest CMEMS operational version V2. In a first step, we use a perturbed-tendencies scheme to generate ensembles describing uncertainties in the open ocean and on the shelf, focusing on upper ocean processes. In a second step, we introduce two methodologies (i.e. rank histograms and array modes) aimed at checking the consistency of the above ensembles with respect to TAC data and arrays. Preliminary results highlight that wind uncertainties dominate all other atmosphere-ocean sources of model error. The ensemble spread in medium-range ensembles is approximately 0.01 m for SSH and 0.15 °C for SST, though these values vary with season and across shelf regions. Ecosystem model uncertainties emerging from perturbations in the physics appear to be moderately larger than those from perturbing the concentrations of the biogeochemical compartments, resulting in a total chlorophyll spread of about 0.01 mg m-3. First consistency results show that the model ensemble and the pseudo-ensemble of OSTIA (L4) observed SSTs exhibit nonzero joint probabilities, since their error vicinities overlap. Rank histograms show that the model ensemble is initially under-dispersive, though results improve in the context of seasonal-range ensembles.
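The rank-histogram consistency check counts where each observation falls among its sorted ensemble members; a flat histogram indicates a consistent ensemble, while an under-dispersive one piles counts into the outermost bins. A minimal sketch on synthetic data (not the SCRUM ensembles):

```python
import numpy as np

rng = np.random.default_rng(1)
n_cases, n_members = 5000, 20
ensemble = rng.standard_normal((n_cases, n_members))  # synthetic forecasts
obs = rng.standard_normal(n_cases)                    # "truth" drawn consistently

# Rank of each observation within its ensemble: 0 .. n_members
ranks = (ensemble < obs[:, None]).sum(axis=1)
hist = np.bincount(ranks, minlength=n_members + 1)
```

Here truth and members share one distribution, so all 21 bins should be roughly equally populated; drawing `obs` with a larger spread than the ensemble would instead produce the U-shaped histogram typical of under-dispersion.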

  17. Default risk modeling beyond the first-passage approximation: Extended Black-Cox model

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Shokhirev, Nikolai V.

    2010-07-01

We develop a generalization of the Black-Cox structural model of default risk. The extended model captures uncertainty related to a firm's ability to avoid default even when the company's liabilities momentarily exceed its assets. Diffusion in a linear potential with a radiation boundary condition is used to mimic a company's default process. The exact solution of the corresponding Fokker-Planck equation allows for derivation of analytical expressions for the cumulative probability of default and the relevant hazard rate. The resulting closed-form expressions fit the historical data on global corporate defaults well and reproduce the split behavior of credit spreads for bonds of companies in different categories of speculative-grade ratings with varying time to maturity. Introduction of a finite rate of default at the boundary improves valuation of credit risk for short time horizons, which is the key advantage of the proposed model. We also consider the influence of uncertainty in the initial distance to the default barrier on the outcome of the model and demonstrate that this additional source of incomplete information may be responsible for nonzero credit spreads for bonds with very short time to maturity.

  18. Adsorption of n-butane on graphene/Ru(0001)—A molecular beam scattering study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sivapragasam, Nilushni; Nayakasinghe, Mindika T.; Burghaus, Uwe, E-mail: uwe.burghaus@ndsu.edu

    2016-07-15

Adsorption kinetics/dynamics of n-butane on graphene, physical vapor deposited on Ru(0001) (hereafter G/Ru), and bare Ru(0001) (hereafter Ru) are discussed. The chemical activity of the supported graphene as well as the support was probed by thermal desorption spectroscopy (adsorption kinetics). In addition, and to the best of our knowledge for the first time, molecular beam scattering data of larger molecules were collected for graphene (probing the adsorption dynamics). Furthermore, samples were inspected by x-ray photoelectron spectroscopy and Auger electron spectroscopy. At the measuring conditions used here, n-butane adsorption kinetics/dynamics are molecular and nonactivated. Binding energies of butane on Ru and G/Ru are indistinguishable within experimental uncertainty. Thus, G/Ru is “kinetically transparent.” Initial adsorption probabilities, S0, of n-butane decrease with increasing impact energy (0.76-1.72 eV) and are adsorption-temperature independent for both Ru and G/Ru, again consistent with molecular adsorption. Also, S0 values for Ru and G/Ru are indistinguishable within experimental uncertainty. Thus, G/Ru is “dynamically transparent.” Coverage-dependent adsorption probabilities indicate precursor effects for graphene/Ru.

  19. Development of a fuzzy logic expert system for pile selection. Master's thesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ulshafer, M.L.

    1989-01-01

This thesis documents the development of a prototype expert system for pile selection for use on microcomputers. It concerns the initial selection of a pile foundation, taking into account parameters such as soil condition, pile length, loading scenario, material availability, contractor experience, and noise or vibration constraints. The prototype expert system, called Pile Selection version 1 (PS1), was developed using the expert system shell FLOPS. FLOPS is a shell based on the AI language OPS5 with many unique features, all of which are utilized by PS1. Among the features used are approximate reasoning with fuzzy set theory, the blackboard architecture, and the emulated parallel processing of fuzzy production rules. A comprehensive review of the parameters used in selecting a pile was made, and the effects of the uncertainties associated with the vagueness of these parameters were examined in detail. Fuzzy set theory was utilized to deal with such uncertainties and provides the basis for a method of determining the best possible choice of piles for a given situation. Details of the development of PS1, including documenting and collating pile information for use in the expert knowledge data bases, are discussed.

  20. Bayesian analysis for erosion modelling of sediments in combined sewer systems.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

Previous research has confirmed that the sediments at the bed of combined sewer systems are the main source of particulate and organic pollution during rain events contributing to combined sewer overflows. However, existing urban stormwater models utilize sediment transport formulas originally developed for alluvial hydraulics, which are inappropriate in this setting. Recently, a model has been formulated and thoroughly assessed against laboratory experiments to simulate the erosion of sediments in sewer pipes, taking into account the increase in strength with depth in the weak layer of deposits. In order to objectively evaluate this model, this paper presents a Bayesian analysis of the model using field data collected in sewer pipes in Paris under known hydraulic conditions. The test was performed using an MCMC sampling method for calibration and uncertainty assessment. Results demonstrate the capacity of the model to reproduce erosion as a direct response to the increase in bed shear stress. This is due to the model's description of the erosional strength in the deposits and to the shape of the measured bed shear stress. However, large uncertainties in some of the model parameters suggest that the model could be over-parameterised and requires a large amount of informative data for its calibration.
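The MCMC calibration step can be sketched with a random-walk Metropolis sampler on a toy excess-shear-stress erosion law (one synthetic parameter, not the Paris sewer model):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy erosion law: rate = M * max(tau - tau_c, 0); calibrate M by MCMC.
tau = np.linspace(0.5, 3.0, 30)              # applied bed shear stress
M_true, tau_c, noise = 2.0, 1.0, 0.1
excess = np.clip(tau - tau_c, 0.0, None)
data = M_true * excess + noise * rng.standard_normal(tau.size)

def log_post(M):
    if M <= 0.0:
        return -np.inf                       # flat prior on M > 0
    resid = data - M * excess
    return -0.5 * np.sum(resid**2) / noise**2

samples, M = [], 1.0
lp = log_post(M)
for _ in range(5000):
    prop = M + 0.05 * rng.standard_normal()  # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
        M, lp = prop, lp_prop
    samples.append(M)
post = np.array(samples[1000:])              # discard burn-in
```

The retained samples approximate the posterior of M; a wide posterior (relative to the prior) for some parameters is exactly the over-parameterisation symptom the abstract describes.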

  1. A novel approach for incremental uncertainty rule generation from databases with missing values handling: application to dynamic medical databases.

    PubMed

    Konias, Sokratis; Chouvarda, Ioanna; Vlahavas, Ioannis; Maglaveras, Nicos

    2005-09-01

Current approaches for mining association rules usually assume that the mining is performed on a static database, where the problem of missing attribute values does not practically exist. However, these assumptions do not hold in some medical databases, such as a home care system. In this paper, a novel uncertainty rule algorithm is presented, namely URG-2 (Uncertainty Rule Generator), which addresses the problem of mining dynamic databases containing missing values. This algorithm requires only one pass over the initial dataset in order to generate the itemsets, while new metrics corresponding to the notions of Support and Confidence are used. URG-2 was evaluated on two medical databases, randomly introducing missing values into each record's attributes (rate: 5-20% in 5% increments) in the initial dataset. Compared with the classical approach (records with missing values are ignored), the proposed algorithm was more robust in mining rules from datasets containing missing values. In all cases, the difference in preserving the initial rules ranged between 30% and 60% in favour of URG-2. Moreover, due to its incremental nature, URG-2 saved over 90% of the time required for thorough re-mining. Thus, the proposed algorithm can offer a preferable solution for mining in dynamic relational databases.
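URG-2's metrics adapt the classical notions of Support and Confidence to records with missing values. A simplified sketch of those classical notions, here excluding records whose relevant attributes are missing (a baseline simplification, not the URG-2 algorithm itself; the attribute names are invented):

```python
# Support / confidence for a rule antecedent -> consequent over records
# that may contain missing values (None).
records = [
    {"fever": True,  "cough": True},
    {"fever": True,  "cough": None},   # missing value
    {"fever": True,  "cough": False},
    {"fever": False, "cough": True},
    {"fever": True,  "cough": True},
]

def rule_metrics(records, antecedent, consequent):
    usable = [r for r in records
              if r.get(antecedent) is not None and r.get(consequent) is not None]
    both = sum(1 for r in usable if r[antecedent] and r[consequent])
    ante = sum(1 for r in usable if r[antecedent])
    support = both / len(usable)
    confidence = both / ante if ante else 0.0
    return support, confidence

s, c = rule_metrics(records, "fever", "cough")
```

Dropping records with missing values, as above, shrinks the usable dataset and distorts the counts as the missing rate grows, which is the weakness URG-2's adapted metrics and single-pass incremental updates are designed to avoid.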

  2. An algorithm for U-Pb isotope dilution data reduction and uncertainty propagation

    NASA Astrophysics Data System (ADS)

    McLean, N. M.; Bowring, J. F.; Bowring, S. A.

    2011-06-01

High-precision U-Pb geochronology by isotope dilution-thermal ionization mass spectrometry is integral to a variety of Earth science disciplines, but its ultimate resolving power is quantified by the uncertainties of calculated U-Pb dates. As analytical techniques have advanced, formerly small sources of uncertainty are increasingly important, and thus previous simplifications for data reduction and uncertainty propagation are no longer valid. Although notable previous efforts have treated propagation of correlated uncertainties for the U-Pb system, the equations, uncertainties, and correlations have been limited in number and subject to simplification during propagation through intermediary calculations. We derive and present a transparent U-Pb data reduction algorithm that transforms raw isotopic data and measured or assumed laboratory parameters into the isotopic ratios and dates geochronologists interpret without making assumptions about the relative size of sample components. To propagate uncertainties and their correlations, we describe, in detail, a linear algebraic algorithm that incorporates all input uncertainties and correlations without limiting or simplifying covariance terms to propagate them through intermediate calculations. Finally, a weighted mean algorithm is presented that utilizes matrix elements from the uncertainty propagation algorithm to propagate random and systematic uncertainties for data comparison between other U-Pb labs and other geochronometers. The linear uncertainty propagation algorithms are verified with Monte Carlo simulations of several typical analyses. We propose that our algorithms be considered by the community for implementation to improve the collaborative science envisioned by the EARTHTIME initiative.
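The core of linear uncertainty propagation with correlations is the sandwich formula Sigma_y = J Sigma_x J^T, where J is the Jacobian of the reduction equations. A generic two-input/two-output sketch with illustrative values (not the actual U-Pb reduction equations):

```python
import numpy as np

def f(x):
    """Toy reduction: a ratio and a product of two measured quantities."""
    return np.array([x[0] / x[1], x[0] * x[1]])

x = np.array([10.0, 2.0])                    # measured inputs
y = f(x)                                     # reduced outputs

J = np.array([[1 / x[1], -x[0] / x[1]**2],   # d(x0/x1)/dx0, d(x0/x1)/dx1
              [x[1],      x[0]          ]])  # d(x0*x1)/dx0, d(x0*x1)/dx1

Sigma_x = np.array([[0.04, 0.01],            # input covariance: variances on
                    [0.01, 0.01]])           # the diagonal, correlation off it

Sigma_y = J @ Sigma_x @ J.T                  # full output covariance matrix
```

Because the off-diagonal terms of Sigma_x are carried through intact, the resulting Sigma_y also contains the covariance between the two outputs, which is what simplified error formulas discard.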

  3. Use of Linear Prediction Uncertainty Analysis to Guide Conditioning of Models Simulating Surface-Water/Groundwater Interactions

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; White, J.; Doherty, J.

    2011-12-01

    Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km2 area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm3/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. 
Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.
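In a linear-Bayesian setting, the reduction in predictive variance offered by conditioning data can be written as a Schur complement, which is the quantity such analyses rank to identify the data worth collecting. A small numerical sketch with illustrative covariances (not the Biscayne model):

```python
import numpy as np

# var(s | d) = C_ss - C_sd @ inv(C_dd) @ C_ds for a scalar prediction s
# and observation vector d with joint Gaussian prior.
C_ss = 4.0                                  # prior predictive variance
C_sd = np.array([1.2, 0.3, 0.9])            # prediction-observation covariances
C_dd = np.array([[1.0, 0.2, 0.1],
                 [0.2, 1.0, 0.0],
                 [0.1, 0.0, 1.0]]) + 0.1 * np.eye(3)  # includes obs noise

var_post = C_ss - C_sd @ np.linalg.solve(C_dd, C_sd)

# Data worth of each observation conditioned on alone:
worth = C_sd**2 / np.diag(C_dd)
best = int(np.argmax(worth))
```

Observations that covary strongly with the prediction (relative to their own noise) shrink the predictive variance most; ranking `worth` over candidate gages is the essence of using such analysis to prioritize monitoring data.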

  4. X3 expansion tube driver gas spectroscopy and temperature measurements

    NASA Astrophysics Data System (ADS)

    Parekh, V.; Gildfind, D.; Lewis, S.; James, C.

    2018-07-01

    The University of Queensland's X3 facility is a large, free-piston driven expansion tube used for super-orbital and high Mach number scramjet aerothermodynamic studies. During recent development of new scramjet test flow conditions, experimentally measured shock speeds were found to be significantly lower than that predicted by initial driver performance calculations. These calculations were based on ideal, isentropic compression of the driver gas and indicated that loss mechanisms, not accounted for in the preliminary analysis, were significant. The critical determinant of shock speed is peak driver gas sound speed, which for a given gas composition depends on the peak driver gas temperature. This temperature may be inaccurately estimated if an incorrect fill temperature is assumed, or if heat losses during driver gas compression are significant but not accounted for. For this study, the ideal predicted peak temperature was 3750 K, without accounting for losses. However, a much lower driver temperature of 2400 K is suggested based on measured experimental shock speeds. This study aimed to measure initial and peak driver gas temperatures for a representative X3 operating condition. Examination of the transient temperatures of the driver gas and compression tube steel wall during the initial fill process showed that once the filling process was complete, the steady-state driver gas temperature closely matched the tube wall temperature. Therefore, while assuming the gas is initially at the ambient laboratory temperature is not a significant source of error, it can be entirely mitigated by simply monitoring tube wall temperature. Optical emission spectroscopy was used to determine the driver gas spectra after diaphragm rupture; the driver gas emission spectrum exhibited a significant continuum radiation component, with prominent spectral lines attributed to contamination of the gas. 
A graybody approximation of the continuum suggested a peak driver gas temperature of 3200 K; uncertainty associated with the blackbody curve fit is ±100 K. However, work is required to quantify additional sources of uncertainty due to the graybody assumption and the presence of contaminant particles in the driver gas; these are potentially significant. The estimate of the driver gas temperature suggests that driver heat losses are not the dominant contributor to the lower-than-expected shock speeds for X3. Since both the driver temperature and pressure have been measured, investigation of total pressure losses during driver gas expansion across the diaphragm and driver-to-driven tube area change (currently not accounted for) is recommended for future studies as the likely mechanism for the observed performance gap.
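The graybody temperature estimate can be sketched as a one-parameter search: for each trial temperature the best gray scaling factor follows from linear least squares, leaving a one-dimensional minimization over T. A sketch on synthetic continuum data (not the X3 spectra; wavelength band and noise level are assumptions):

```python
import numpy as np

h, c, kB = 6.626e-34, 2.998e8, 1.381e-23

def planck(lam, T):
    """Planck spectral radiance B(lambda, T)."""
    return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))

# Synthetic graybody continuum: emissivity * Planck + 2% noise
lam = np.linspace(400e-9, 900e-9, 60)
rng = np.random.default_rng(4)
T_true, eps = 3200.0, 0.4
signal = eps * planck(lam, T_true) * (1 + 0.02 * rng.standard_normal(lam.size))

def misfit(T):
    model = planck(lam, T)
    scale = (model @ signal) / (model @ model)  # best gray factor for this T
    return np.sum((signal - scale * model)**2)

T_grid = np.arange(2000.0, 5001.0, 10.0)
T_est = T_grid[np.argmin([misfit(T) for T in T_grid])]
```

The spectral shape over a broad band constrains T even though the gray emissivity is free, which is why the fit can return a temperature without knowing the absolute calibration; systematic deviations from gray emissivity would bias it, as the abstract cautions.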

  5. X3 expansion tube driver gas spectroscopy and temperature measurements

    NASA Astrophysics Data System (ADS)

    Parekh, V.; Gildfind, D.; Lewis, S.; James, C.

    2017-11-01

    The University of Queensland's X3 facility is a large, free-piston driven expansion tube used for super-orbital and high Mach number scramjet aerothermodynamic studies. During recent development of new scramjet test flow conditions, experimentally measured shock speeds were found to be significantly lower than that predicted by initial driver performance calculations. These calculations were based on ideal, isentropic compression of the driver gas and indicated that loss mechanisms, not accounted for in the preliminary analysis, were significant. The critical determinant of shock speed is peak driver gas sound speed, which for a given gas composition depends on the peak driver gas temperature. This temperature may be inaccurately estimated if an incorrect fill temperature is assumed, or if heat losses during driver gas compression are significant but not accounted for. For this study, the ideal predicted peak temperature was 3750 K, without accounting for losses. However, a much lower driver temperature of 2400 K is suggested based on measured experimental shock speeds. This study aimed to measure initial and peak driver gas temperatures for a representative X3 operating condition. Examination of the transient temperatures of the driver gas and compression tube steel wall during the initial fill process showed that once the filling process was complete, the steady-state driver gas temperature closely matched the tube wall temperature. Therefore, while assuming the gas is initially at the ambient laboratory temperature is not a significant source of error, it can be entirely mitigated by simply monitoring tube wall temperature. Optical emission spectroscopy was used to determine the driver gas spectra after diaphragm rupture; the driver gas emission spectrum exhibited a significant continuum radiation component, with prominent spectral lines attributed to contamination of the gas. 
A graybody approximation of the continuum suggested a peak driver gas temperature of 3200 K; uncertainty associated with the blackbody curve fit is ±100 K. However, work is required to quantify additional sources of uncertainty due to the graybody assumption and the presence of contaminant particles in the driver gas; these are potentially significant. The estimate of the driver gas temperature suggests that driver heat losses are not the dominant contributor to the lower-than-expected shock speeds for X3. Since both the driver temperature and pressure have been measured, investigation of total pressure losses during driver gas expansion across the diaphragm and driver-to-driven tube area change (currently not accounted for) is recommended for future studies as the likely mechanism for the observed performance gap.
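    The graybody temperature estimate described above can be illustrated with a toy curve fit. The sketch below is not the authors' code: the constants, the emissivity of 0.8, and the noise-free synthetic 3200 K spectrum are all assumptions. It grid-searches for the Planck-curve temperature whose scaled shape best matches the continuum, solving the emissivity scale factor in closed form at each trial temperature.

```python
import math

# Physical constants (SI)
H = 6.626e-34   # Planck constant, J s
C = 2.998e8     # speed of light, m/s
KB = 1.381e-23  # Boltzmann constant, J/K

def planck(lam_m, temp_k):
    """Spectral radiance of a blackbody at wavelength lam_m (metres)."""
    return (2.0 * H * C**2 / lam_m**5) / (math.exp(H * C / (lam_m * KB * temp_k)) - 1.0)

# Synthetic "measured" continuum: a graybody (emissivity 0.8) at 3200 K,
# standing in for the X3 driver-gas spectrum (illustrative values only).
wavelengths = [(400 + 10 * i) * 1e-9 for i in range(41)]  # 400-800 nm
measured = [0.8 * planck(lam, 3200.0) for lam in wavelengths]

def fit_temperature(lams, intensities, t_grid):
    """Grid-search the temperature whose scaled Planck curve best fits the data.

    The graybody scale factor is solved analytically at each trial
    temperature by linear least squares.
    """
    best_t, best_err = None, float("inf")
    for t in t_grid:
        b = [planck(lam, t) for lam in lams]
        scale = sum(y * bi for y, bi in zip(intensities, b)) / sum(bi * bi for bi in b)
        err = sum((y - scale * bi) ** 2 for y, bi in zip(intensities, b))
        if err < best_err:
            best_t, best_err = t, err
    return best_t

t_est = fit_temperature(wavelengths, measured, range(2000, 4001, 50))
print(t_est)  # recovers 3200 K for this noise-free synthetic spectrum
```

    In practice the fit uncertainty (the paper's ±100 K, plus the graybody and contamination effects it flags) would be probed by repeating the fit over perturbed spectra rather than a single clean one.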

  6. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. 
Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
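    The GSA step can be illustrated with a minimal variance-based sketch. The two-input "suitability" response below is a hypothetical stand-in for the paper's HSI models (the model form, input names, and uniform input ranges are assumptions); it estimates the first-order sensitivity index S_i = Var(E[y|x_i]) / Var(y) by a brute-force double loop.

```python
import random

random.seed(42)

def hsi(depth, salinity):
    """Toy habitat-suitability response (illustrative, not the paper's model)."""
    return 3.0 * depth + 1.0 * salinity

def first_order_index(which, n_outer=500, n_inner=100):
    """Estimate S_i = Var(E[y | x_i]) / Var(y) with a double loop:
    fix x_i in the outer loop, average over the other input in the inner loop."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        fixed = random.random()
        inner = []
        for _ in range(n_inner):
            other = random.random()
            y = hsi(fixed, other) if which == 0 else hsi(other, fixed)
            inner.append(y)
        cond_means.append(sum(inner) / n_inner)
        all_y.extend(inner)
    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return var(cond_means) / var(all_y)

s_depth = first_order_index(0)     # analytic value for this toy model: 0.9
s_salinity = first_order_index(1)  # analytic value for this toy model: 0.1
print(round(s_depth, 2), round(s_salinity, 2))
```

    Production GSA uses more efficient estimators (e.g. Saltelli sampling), but the double loop makes the definition of a first-order index explicit.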

  7. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    NASA Astrophysics Data System (ADS)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and proceeded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. 
    Students experienced uncertainty from more numerous and more varied sources, and used a greater number and variety of uncertainty management strategies, in the less structured task setting than in the more structured task setting. Peer interaction was influential because students relied on supportive social responses to enact most of their uncertainty management strategies. When students could not garner supportive social responses from their peers, their options for managing uncertainty were greatly reduced.

  8. Entropy bound of local quantum field theory with generalized uncertainty principle

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Wan; Lee, Hyung Won; Myung, Yun Soo

    2009-03-01

    We study the entropy bound for local quantum field theory (LQFT) with the generalized uncertainty principle. The generalized uncertainty principle naturally provides a UV cutoff to the LQFT through gravity effects. Imposing the non-gravitational collapse condition as the UV-IR relation, we find that the maximal entropy of a bosonic field is limited by the entropy bound A^{3/4} rather than A, with A the boundary area.

  9. Optimal test selection for prediction uncertainty reduction

    DOE PAGES

    Mullins, Joshua; Mahadevan, Sankaran; Urbina, Angel

    2016-12-02

    Economic factors and experimental limitations often lead to sparse and/or imprecise data used for the calibration and validation of computational models. This paper addresses resource allocation for calibration and validation experiments, in order to maximize their effectiveness within given resource constraints. When observation data are used for model calibration, the quality of the inferred parameter descriptions is directly affected by the quality and quantity of the data. This paper characterizes parameter uncertainty within a probabilistic framework, which enables the uncertainty to be systematically reduced with additional data. The validation assessment is also uncertain in the presence of sparse and imprecise data; therefore, this paper proposes an approach for quantifying the resulting validation uncertainty. Since calibration and validation uncertainty affect the prediction of interest, the proposed framework explores the decision of cost versus importance of data in terms of the impact on the prediction uncertainty. Often, calibration and validation tests may be performed for different input scenarios, and this paper shows how the calibration and validation results from different conditions may be integrated into the prediction. Then, a constrained discrete optimization formulation that selects the number of tests of each type (calibration or validation at given input conditions) is proposed. Furthermore, the proposed test selection methodology is demonstrated on a microelectromechanical system (MEMS) example.
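    The constrained discrete optimization described above can be sketched with a toy model. The cost figures and the surrogate variance model below (each test type shrinking its uncertainty contribution as 1/(1+n)) are illustrative assumptions, not the paper's formulation; the point is only the exhaustive search over feasible test counts under a budget.

```python
from itertools import product

COST_CAL, COST_VAL, BUDGET = 3.0, 2.0, 24.0  # assumed per-test costs and budget
VAR_CAL, VAR_VAL = 4.0, 1.0                  # surrogate variance contributions (assumed)

def predicted_uncertainty(n_cal, n_val):
    """Surrogate: each test type shrinks its variance term as 1/(1+n)."""
    return VAR_CAL / (1 + n_cal) + VAR_VAL / (1 + n_val)

# Enumerate every feasible (calibration, validation) test count and keep
# the combination with the smallest predicted prediction uncertainty.
best = None
for n_cal, n_val in product(range(0, 9), range(0, 13)):
    if COST_CAL * n_cal + COST_VAL * n_val <= BUDGET:
        u = predicted_uncertainty(n_cal, n_val)
        if best is None or u < best[0]:
            best = (u, n_cal, n_val)

u_best, n_cal_best, n_val_best = best
print(n_cal_best, n_val_best, round(u_best, 3))
```

    For this toy cost/variance model the optimum spends most of the budget on the noisier calibration term; a real application would replace the surrogate with the calibrated model's actual prediction-uncertainty estimate.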

  10. A probabilistic fatigue analysis of multiple site damage

    NASA Technical Reports Server (NTRS)

    Rohrbaugh, S. M.; Ruff, D.; Hillberry, B. M.; Mccabe, G.; Grandt, A. F., Jr.

    1994-01-01

    The variability in initial crack size and fatigue crack growth is incorporated in a probabilistic model that is used to predict the fatigue lives for unstiffened aluminum alloy panels containing multiple site damage (MSD). The uncertainty of the damage in the MSD panel is represented by a distribution of fatigue crack lengths that are analytically derived from equivalent initial flaw sizes. The variability in fatigue crack growth rate is characterized by stochastic descriptions of crack growth parameters for a modified Paris crack growth law. A Monte-Carlo simulation explicitly describes the MSD panel by randomly selecting values from the stochastic variables and then grows the MSD cracks with a deterministic fatigue model until the panel fails. Different simulations investigate the influences of the fatigue variability on the distributions of remaining fatigue lives. Six cases that consider fixed and variable conditions of initial crack size and fatigue crack growth rate are examined. The crack size distribution exhibited a dominant effect on the remaining fatigue life distribution, and the variable crack growth rate exhibited a lesser effect on the distribution. In addition, the probabilistic model predicted that only a small percentage of the life remains after a lead crack develops in the MSD panel.
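    The probabilistic treatment of initial crack size and growth rate can be sketched for a single crack. The block below is a minimal stand-in, not the paper's model: the Paris-law constants, stress range, and lognormal distributions are assumed illustrative values, and the multi-crack MSD interaction is omitted so the life integral has a closed form.

```python
import math
import random

random.seed(7)

# Paris-law fatigue-life Monte Carlo for a single crack
# (illustrative constants in MPa*sqrt(m) units; not the paper's values).
M = 3.0          # Paris exponent (held fixed here)
DSIGMA = 100.0   # stress range, MPa
A_CRIT = 0.025   # critical crack length, m

def cycles_to_failure(a0, c):
    """Closed-form integral of da/dN = c * (DSIGMA*sqrt(pi*a))^M for M = 3."""
    return 2.0 * (a0 ** -0.5 - A_CRIT ** -0.5) / (c * DSIGMA ** M * math.pi ** 1.5)

lives = []
for _ in range(5000):
    a0 = random.lognormvariate(math.log(1e-3), 0.3)   # initial crack size, m
    c = random.lognormvariate(math.log(1e-11), 0.2)   # Paris coefficient
    lives.append(cycles_to_failure(a0, c))

lives.sort()
p10, p50, p90 = (lives[int(q * len(lives))] for q in (0.10, 0.50, 0.90))
print(f"life percentiles (cycles): p10={p10:.3g} p50={p50:.3g} p90={p90:.3g}")
```

    Holding either `a0` or `c` fixed and re-running isolates each variable's contribution to the life distribution, mirroring the paper's six fixed/variable cases.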

  11. Investigating Initial Disclosures and Reactions to Unexpected, Positive HPV Diagnosis.

    PubMed

    Smith, Rachel A; Hernandez, Rachael; Catona, Danielle

    2014-07-01

    Initial disclosures of health conditions are critical communication moments. Existing research focuses on disclosers; integrating confidants into studies of initial disclosures is needed. Guided by the disclosure decision-making model (DD-MM; Greene, 2009), this study examined what diagnosed persons and confidants may say when faced with unexpected test results and unexpected disclosures, respectively. Participants (N = 151) recorded an audio-visual message for another person, after imagining that they or the other person had just received unexpected, positive HPV test results. The qualitative analysis revealed four themes: (1) impression management and social distance, (2) invisible symptoms and advice regarding future disclosures, (3) expressing and acknowledging emotional reactions, and (4) misunderstandings and lacking knowledge about HPV. These findings suggested that DD-MM may be a relevant framework for understanding not only when disclosers share, but what disclosers and confidants say in early conversations about new diagnoses. While disclosers' and confidants' messages showed marked similarities, important differences appeared. For example, confidants focused on assuaging disclosers' fear about the consequences, whereas disclosers expressed distress related to their uncertainty about the prognosis of an HPV infection and how to prepare for next steps. The discussion highlighted implications for the DD-MM, HPV disclosures, and future interventions.

  12. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.
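    The resampling idea can be sketched in a few lines. The block below is a hypothetical illustration, not the authors' method: the miss distance, per-axis sigmas, hard-body radius, and the lognormal covariance scale-factor model are all assumptions. An inner Monte Carlo estimates a 2-D Pc; an outer loop resamples the covariance scale to turn the point estimate into a distribution of Pc values.

```python
import random

random.seed(1)

HBR = 20.0          # combined hard-body radius, m (assumed)
MU = (50.0, 0.0)    # nominal miss distance in the encounter plane, m
SIG = (30.0, 20.0)  # nominal 1-sigma position uncertainty per axis, m

def pc_estimate(scale, n=20000):
    """Monte Carlo 2-D Pc: fraction of sampled miss vectors inside the HBR."""
    hits = 0
    for _ in range(n):
        x = random.gauss(MU[0], scale * SIG[0])
        y = random.gauss(MU[1], scale * SIG[1])
        if x * x + y * y < HBR * HBR:
            hits += 1
    return hits / n

# Resample a covariance scale factor to turn the point-estimate Pc into a
# distribution of Pc values (the lognormal scale-factor model is an assumption).
pcs = sorted(pc_estimate(random.lognormvariate(0.0, 0.3)) for _ in range(50))
print(f"Pc spread: min={pcs[0]:.4f} median={pcs[25]:.4f} max={pcs[-1]:.4f}")
```

    The spread of `pcs`, rather than any single value, is what would feed a confidence statement about the conjunction.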

  13. Uncertainties in real-world decisions on medical technologies.

    PubMed

    Lu, C Y

    2014-08-01

    Patients, clinicians, payers and policy makers face substantial uncertainties in their respective healthcare decisions as they attempt to achieve maximum value, or the greatest level of benefit possible at a given cost. Uncertainties largely come from incomplete information at the time that decisions must be made. This is true in all areas of medicine because evidence from clinical trials is often incongruent with real-world patient care. This article highlights key uncertainties around the (comparative) benefits and harms of medical technologies. Initiatives and strategies such as comparative effectiveness research and coverage with evidence development may help to generate reliable and relevant evidence for decisions on coverage and treatment. These efforts could result in better decisions that improve patient outcomes and better use of scarce medical resources. © 2014 John Wiley & Sons Ltd.

  14. Reduced uncertainty of regional scale CLM predictions of net carbon fluxes and leaf area indices with estimated plant-specific parameters

    NASA Astrophysics Data System (ADS)

    Post, Hanna; Hendricks Franssen, Harrie-Jan; Han, Xujun; Baatz, Roland; Montzka, Carsten; Schmidt, Marius; Vereecken, Harry

    2016-04-01

    Reliable estimates of carbon fluxes and states at regional scales are required to reduce uncertainties in regional carbon balance estimates and to support decision making in environmental politics. In this work the Community Land Model version 4.5 (CLM4.5-BGC) was applied at a high spatial resolution (1 km2) for the Rur catchment in western Germany. In order to improve the model-data consistency of net ecosystem exchange (NEE) and leaf area index (LAI) for this study area, five plant functional type (PFT)-specific CLM4.5-BGC parameters were estimated with time series of half-hourly NEE data for one year in 2011/2012, using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, a Markov Chain Monte Carlo (MCMC) approach. The parameters were estimated separately for four different plant functional types (needleleaf evergreen temperate tree, broadleaf deciduous temperate tree, C3-grass and C3-crop) at four different sites. The four sites are located inside or close to the Rur catchment. We evaluated modeled NEE for one year in 2012/2013 with NEE measured at seven eddy covariance sites in the catchment, including the four parameter estimation sites. Modeled LAI was evaluated by means of LAI derived from remotely sensed RapidEye images acquired on about 18 days in 2011/2012. Performance indices were based on a comparison between measurements and (i) a reference run with CLM default parameters, and (ii) a 60 instance CLM ensemble with parameters sampled from the DREAM posterior probability density functions (pdfs). The difference between the observed and simulated NEE sums was reduced by 23% when estimated parameters were used as input instead of default parameters. The mean absolute difference between modeled and measured LAI was reduced by 59% on average. 
Simulated LAI was not only improved in terms of the absolute value but in some cases also in terms of the timing (beginning of vegetation onset), which was directly related to a substantial improvement of the NEE estimates in spring. In order to obtain a more comprehensive estimate of the model uncertainty, a second CLM ensemble was set up, where initial conditions and atmospheric forcings were perturbed in addition to the parameter estimates. This resulted in very high standard deviations (STD) of the modeled annual NEE sums for C3-grass and C3-crop PFTs, ranging between 24.1 and 225.9 gC m-2 y-1, compared to STD = 0.1 - 3.4 gC m-2 y-1 (effect of parameter uncertainty only, without additional perturbation of initial states and atmospheric forcings). The higher spread of modeled NEE for the C3-crop and C3-grass indicated that the model uncertainty was notably higher for those PFTs compared to the forest-PFTs. Our findings highlight the potential of parameter and uncertainty estimation to support the understanding and further development of land surface models such as CLM.
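    The Bayesian parameter-estimation step can be illustrated with a drastically reduced stand-in. DREAM runs multiple interacting chains with adaptive proposals; the sketch below substitutes a plain random-walk Metropolis sampler on a toy one-parameter flux model with synthetic data, so every name and constant here is an assumption for illustration only.

```python
import math
import random

random.seed(3)

# Toy "NEE-like" data: y = THETA_TRUE * x + Gaussian noise.
THETA_TRUE, NOISE_SD = 2.0, 0.5
xs = [0.1 * i for i in range(1, 51)]
ys = [THETA_TRUE * x + random.gauss(0.0, NOISE_SD) for x in xs]

def log_likelihood(theta):
    """Gaussian iid errors with known NOISE_SD; flat prior on theta."""
    return -0.5 * sum(((y - theta * x) / NOISE_SD) ** 2 for x, y in zip(xs, ys))

# Random-walk Metropolis (a simple stand-in for DREAM, which evolves
# multiple interacting chains with adaptive proposal scales).
theta, ll = 0.0, log_likelihood(0.0)
samples = []
for step in range(5000):
    prop = theta + random.gauss(0.0, 0.1)
    ll_prop = log_likelihood(prop)
    if math.log(random.random()) < ll_prop - ll:
        theta, ll = prop, ll_prop
    if step >= 1000:  # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)
print(f"posterior mean: {post_mean:.2f} (true value {THETA_TRUE})")
```

    A parameter ensemble like the paper's 60-member CLM ensemble corresponds to drawing members from `samples`, the posterior, rather than running the model at a single best-fit value.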

  15. Geographical scenario uncertainty in generic fate and exposure factors of toxic pollutants for life-cycle impact assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huijbregts, Mark A.J.; Lundi, Sven; McKone, Thomas E.

    In environmental life-cycle assessments (LCA), fate and exposure factors account for the general fate and exposure properties of chemicals under generic environmental conditions by means of 'evaluative' multi-media fate and exposure box models. To assess the effect of using different generic environmental conditions, fate and exposure factors of chemicals emitted under typical conditions of (1) Western Europe, (2) Australia and (3) the United States of America were compared with the multi-media fate and exposure box model USES-LCA. Comparing the results of the three evaluative environments, it was found that the uncertainty in fate and exposure factors for ecosystems and humans due to the choice of an evaluative environment, as represented by the ratio of the 97.5th and 50th percentiles, is between a factor of 2 and 10. Particularly, fate and exposure factors of emissions causing effects in fresh water ecosystems and effects on human health have relatively high uncertainty. This uncertainty is mainly caused by the continental difference in the average soil erosion rate, the dimensions of the fresh water and agricultural soil compartment, and the fraction of drinking water coming from ground water.
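    The percentile-ratio uncertainty measure used above is easy to compute from Monte Carlo output. The sketch below uses a lognormal sample as a stand-in for fate-and-exposure factors evaluated under randomly chosen evaluative environments; the distribution and its spread are assumptions, chosen only so the resulting factor falls in the paper's 2-10 range.

```python
import math
import random

random.seed(11)

# Stand-in for fate-and-exposure factors computed under randomly chosen
# evaluative environments (the lognormal spread is an assumption).
factors = sorted(random.lognormvariate(math.log(1.0), 0.8) for _ in range(10000))

p50 = factors[int(0.50 * len(factors))]
p975 = factors[int(0.975 * len(factors))]
k = p975 / p50
print(f"uncertainty factor k = P97.5/P50 = {k:.2f}")
```

    For a lognormal with log-space sigma 0.8 the true ratio is exp(1.96 * 0.8), about 4.8, i.e. squarely inside the reported factor-of-2-to-10 band.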

  16. Uncertainty Quantification of Medium-Term Heat Storage From Short-Term Geophysical Experiments Using Bayesian Evidential Learning

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef

    2018-04-01

    In theory, aquifer thermal energy storage (ATES) systems can recover in winter the heat stored in the aquifer during summer to increase the energy efficiency of the system. In practice, the energy efficiency is often lower than expected from simulations due to spatial heterogeneity of hydraulic properties or non-favorable hydrogeological conditions. A proper design of ATES systems should therefore consider the uncertainty of the prediction related to those parameters. We use a novel framework called Bayesian Evidential Learning (BEL) to estimate the heat storage capacity of an alluvial aquifer using a heat tracing experiment. BEL is based on two main stages: pre- and post-field data acquisition. Before data acquisition, Monte Carlo simulations and global sensitivity analysis are used to assess the information content of the data to reduce the uncertainty of the prediction. After data acquisition, prior falsification and machine learning based on the same Monte Carlo simulations are used to directly assess uncertainty on key prediction variables from observations. The result is a full quantification of the posterior distribution of the prediction conditioned to observed data, without any explicit full model inversion. We demonstrate the methodology in field conditions and validate the framework using independent measurements.
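    The core BEL idea, conditioning a prediction directly on data via prior Monte Carlo runs instead of inverting the model, can be sketched with a toy problem. Everything below is an assumption for illustration: the forward models are one-line stand-ins, and a rejection (ABC-style) step substitutes for BEL's learned data-to-prediction regression.

```python
import random

random.seed(5)

def simulate(theta):
    """Toy forward models: short-term tracer datum d and long-term storage h."""
    d = theta + random.gauss(0.0, 0.2)    # observable (heat-tracing test)
    h = 2.0 * theta + 1.0                 # prediction (storage capacity)
    return d, h

# Stage 1: Monte Carlo over the prior (before field data acquisition).
prior = [random.uniform(0.0, 2.0) for _ in range(20000)]
pairs = [simulate(t) for t in prior]

# Stage 2: condition the prediction directly on the observed datum.
# A rejection step stands in for BEL's machine-learned regression;
# no full model inversion is performed.
D_OBS = 1.2
posterior_h = [h for d, h in pairs if abs(d - D_OBS) < 0.05]

def spread(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

prior_h = [h for _, h in pairs]
print(f"prediction spread: prior={spread(prior_h):.2f} posterior={spread(posterior_h):.2f}")
```

    The narrowing from prior to posterior spread is the quantity of interest: it measures how much the short-term experiment reduces uncertainty on the medium-term storage prediction.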

  17. Our Changing Planet: The U.S. Climate Change Science Program for Fiscal Year 2006

    DTIC Science & Technology

    2005-11-01

    any remaining uncertainties for the Amazon region of South America.These results are expected to greatly reduce errors and uncertainties concerning...changing the concentration of atmospheric CO2 are fossil -fuel burning, deforestation, land-use change, and cement production.These processes have...the initial phases of work on the remaining products. Specific plans for enhanced decision-support resources include: – Developing decision-support

  18. Uncertainty-Sensitive Heterogeneous Information Fusion: Assessing Threat with Soft, Uncertain, and Conflicting Evidence

    DTIC Science & Technology

    2016-01-01

    planning exercises and wargaming. Initial Experimentation Late in the research , the prototype platform and the various fusion methods came together. This...Chapter Four points to prior research 2 Uncertainty-Sensitive Heterogeneous Information Fusion in mind multimethod fusing of complex information...our research is assessing the threat of terrorism posed by individuals or groups under scrutiny. Broadly, the ultimate objec- tives, which go well

  19. Assessing predictability of a hydrological stochastic-dynamical system

    NASA Astrophysics Data System (ADS)

    Gelfan, Alexander

    2014-05-01

    The water cycle includes processes with different memory timescales, which creates potential for predictability of hydrological systems based on separating their long- and short-memory components and conditioning long-term predictions on the slower-evolving components (similar to approaches in climate prediction). In the face of the Panta Rhei IAHS Decade questions, it is important to find a conceptual approach to classify hydrological system components with respect to their predictability, define predictable/unpredictable patterns, extend lead time, and improve the reliability of hydrological predictions based on the predictable patterns. Representation of hydrological systems as dynamical systems subjected to the effect of noise (stochastic-dynamical systems) provides a possible tool for such conceptualization. A method has been proposed for assessing the predictability of a hydrological system caused by its sensitivity to both initial and boundary conditions. The predictability is defined through a procedure of convergence of a pre-assigned probabilistic measure (e.g. variance) of the system state to a stable value. The time interval of the convergence, that is, the time interval during which the system loses memory of its initial state, defines the limit of the system's predictability. The proposed method was applied to assess the predictability of soil moisture dynamics at the Nizhnedevitskaya experimental station (51.516N; 38.383E) located in the agricultural zone of central European Russia. A stochastic-dynamical model combining a deterministic one-dimensional model of the hydrothermal regime of soil with a stochastic model of meteorological inputs was developed. The deterministic model describes processes of coupled heat and moisture transfer through unfrozen/frozen soil and accounts for the influence of phase changes on water flow. 
The stochastic model produces time series of daily meteorological variables (precipitation, air temperature and humidity), whose statistical properties are similar to those of the corresponding series of the actual data measured at the station. Beginning from the initial conditions and being forced by Monte-Carlo generated synthetic meteorological series, the model simulated diverging trajectories of soil moisture characteristics (water content of soil column, moisture of different soil layers, etc.). Limit of predictability of the specific characteristic was determined through time of stabilization of variance of the characteristic between the trajectories, as they move away from the initial state. Numerical experiments were carried out with the stochastic-dynamical model to analyze sensitivity of the soil moisture predictability assessments to uncertainty in the initial conditions, to determine effects of the soil hydraulic properties and processes of soil freezing on the predictability. It was found, particularly, that soil water content predictability is sensitive to errors in the initial conditions and strongly depends on the hydraulic properties of soil under both unfrozen and frozen conditions. Even if the initial conditions are "well-established", the assessed predictability of water content of unfrozen soil does not exceed 30-40 days, while for frozen conditions it may be as long as 3-4 months. The latter creates opportunity for utilizing the autumn water content of soil as the predictor for spring snowmelt runoff in the region under consideration.
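    The variance-stabilization definition of a predictability limit can be reproduced with a one-line surrogate for the soil-moisture dynamics. The AR(1) model, its memory parameter, and the 95% saturation threshold below are all assumptions for illustration; the logic, an ensemble launched from identical initial conditions under different synthetic forcing, diverging until its variance saturates, mirrors the method described above.

```python
import random

random.seed(9)

# AR(1) stand-in for soil-moisture anomaly dynamics: memory parameter A,
# stochastic meteorological forcing of standard deviation SIGMA (assumed).
A, SIGMA, N_MEMBERS, N_DAYS = 0.95, 1.0, 500, 200

# All members start from identical initial conditions; only the synthetic
# forcing differs, so ensemble variance grows until memory of the
# initial state is lost.
states = [0.0] * N_MEMBERS
variance = []
for _ in range(N_DAYS):
    states = [A * s + random.gauss(0.0, SIGMA) for s in states]
    m = sum(states) / N_MEMBERS
    variance.append(sum((s - m) ** 2 for s in states) / N_MEMBERS)

saturation = sum(variance[-50:]) / 50          # empirical asymptotic variance
t_pred = next(t for t, v in enumerate(variance, start=1) if v >= 0.95 * saturation)
print(f"predictability limit: about {t_pred} days")
```

    For an AR(1) process the theoretical limit is t with A^(2t) = 0.05, roughly 29 days for A = 0.95, which is of the same order as the 30-40 day unfrozen-soil estimate reported above.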

  20. Probabilistic evaluation of uncertainties and risks in aerospace components

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Shiao, M. C.; Nagpal, V. K.; Chamis, C. C.

    1992-01-01

    This paper summarizes a methodology developed at NASA Lewis Research Center which computationally simulates the structural, material, and load uncertainties associated with Space Shuttle Main Engine (SSME) components. The methodology was applied to evaluate the scatter in static, buckling, dynamic, fatigue, and damage behavior of the SSME turbo pump blade. Also calculated are the probability densities of typical critical blade responses, such as effective stress, natural frequency, damage initiation, most probable damage path, etc. Risk assessments were performed for different failure modes, and the effect of material degradation on the fatigue and damage behaviors of a blade were calculated using a multi-factor interaction equation. Failure probabilities for different fatigue cycles were computed and the uncertainties associated with damage initiation and damage propagation due to different load cycles were quantified. Evaluations on the effects of mistuned blades on a rotor were made; uncertainties in the excitation frequency were found to significantly amplify the blade responses of a mistuned rotor. The effects of the number of blades on a rotor were studied. The autocorrelation function of displacements and the probability density function of the first passage time for deterministic and random barriers for structures subjected to random processes also were computed. A brief discussion was included on the future direction of probabilistic structural analysis.
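    The first-passage-time computation mentioned above can be illustrated with the simplest possible random process. The sketch below is a hypothetical stand-in, not the paper's structural model: a Gaussian random walk plays the role of the random response, and a fixed level plays the deterministic barrier.

```python
import random

random.seed(13)

BARRIER = 3.0   # deterministic barrier level (assumed)
SIGMA = 1.0     # per-step standard deviation of the random load process

def first_passage(max_steps=10000):
    """Steps until a Gaussian random walk first reaches the barrier."""
    x = 0.0
    for step in range(1, max_steps + 1):
        x += random.gauss(0.0, SIGMA)
        if x >= BARRIER:
            return step
    return max_steps  # censored at the simulation horizon

times = sorted(first_passage() for _ in range(2000))
median = times[len(times) // 2]
print(f"median first-passage time: {median} steps")
```

    The sorted `times` list is an empirical first-passage-time distribution; a random barrier would simply redraw `BARRIER` per run from its own distribution.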

  1. Tectonic predictions with mantle convection models

    NASA Astrophysics Data System (ADS)

    Coltice, Nicolas; Shephard, Grace E.

    2018-04-01

    Over the past 15 yr, numerical models of convection in Earth's mantle have made a leap forward: they can now produce self-consistent plate-like behaviour at the surface together with deep mantle circulation. These digital tools provide a new window into the intimate connections between plate tectonics and mantle dynamics, and can therefore be used for tectonic predictions, in principle. This contribution explores this assumption. First, initial conditions at 30, 20, 10 and 0 Ma are generated by driving a convective flow with imposed plate velocities at the surface. We then compute instantaneous mantle flows in response to the guessed temperature fields without imposing any boundary conditions. Plate boundaries self-consistently emerge at correct locations with respect to reconstructions, except for small plates close to subduction zones. As already observed for other types of instantaneous flow calculations, the structure of the top boundary layer and upper-mantle slab is the dominant character that leads to accurate predictions of surface velocities. Perturbations of the rheological parameters have little impact on the resulting surface velocities. We then compute fully dynamic model evolution from 30 and 10 to 0 Ma, without imposing plate boundaries or plate velocities. Contrary to instantaneous calculations, errors in kinematic predictions are substantial, although the plate layout and kinematics in several areas remain consistent with the expectations for the Earth. For these calculations, varying the rheological parameters makes a difference for plate boundary evolution. Also, identified errors in initial conditions contribute to first-order kinematic errors. This experiment shows that the tectonic predictions of dynamic models over 10 My are highly sensitive to uncertainties of rheological parameters and initial temperature field in comparison to instantaneous flow calculations. 
Indeed, the initial conditions and the rheological parameters can be good enough for an accurate prediction of instantaneous flow, but not for a prediction after 10 My of evolution. Therefore, inverse methods (sequential or data assimilation methods) using short-term fully dynamic evolution that predict surface kinematics are promising tools for a better understanding of the state of the Earth's mantle.

  2. Effects of Uncertainty on ERPs to Emotional Pictures Depend on Emotional Valence

    PubMed Central

    Lin, Huiyan; Jin, Hua; Liang, Jiafeng; Yin, Ruru; Liu, Ting; Wang, Yiwen

    2015-01-01

    Uncertainty about the emotional content of an upcoming event has been found to modulate neural activity to the event before its occurrence. However, it is still under debate whether uncertainty effects occur after the occurrence of the event. To address this issue, participants were asked to view emotional pictures presented shortly after a cue, which either indicated the emotion of the upcoming picture or not. Both certain and uncertain cues were represented by neutral symbols. The anticipatory phase (i.e., inter-trial interval, ITI) between the cue and the picture was short to enhance the effects of uncertainty. In addition, we used positive and negative pictures that differed only in valence but not in arousal to investigate whether the uncertainty effect was dependent on emotional valence. Electroencephalography (EEG) was recorded during the presentation of the pictures. Event-related potential (ERP) results showed that negative pictures evoked smaller P2 and late LPP but larger N2 in the uncertain as compared to the certain condition, whereas no uncertainty effect was found for the early LPP. For positive pictures, the early LPP was larger in the uncertain as compared to the certain condition; however, there were no uncertainty effects in the other ERP components (P2, N2, and late LPP). The findings suggest that uncertainty modulates neural activity to emotional pictures and this modulation is altered by the valence of the pictures, indicating that individuals alter the allocation of attentional resources toward uncertain emotional pictures depending on the valence of the pictures. PMID:26733916

  3. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Emery, Keith

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.

  4. Geological maps and models: are we certain how uncertain they are?

    NASA Astrophysics Data System (ADS)

    Mathers, Steve; Waters, Colin; McEvoy, Fiona

    2014-05-01

    Geological maps and latterly 3D models provide the spatial framework for geology at diverse scales or resolutions. As demands continue to rise for sustainable use of the subsurface, use of these maps and models is informing decisions on management of natural resources, hazards and environmental change. Inaccuracies and uncertainties in geological maps and models can impact substantially on the perception, assessment and management of opportunities and the associated risks. Lithostratigraphical classification schemes predominate, and are used in most geological mapping and modelling. The definition of unit boundaries, as 2D lines or 3D surfaces, is the prime objective. The intervening area or volume is rarely described other than by its bulk attributes, those relating to the whole unit. Where sufficient data exist on the spatial and/or statistical distribution of properties it can be gridded or voxelated with integrity. Here we only discuss the uncertainty involved in defining the boundary conditions. The primary uncertainty of any geological map or model is the accuracy of the geological boundaries, i.e. tops, bases, limits, fault intersections etc. Traditionally these have been depicted on BGS maps using three line styles that reflect the uncertainty of the boundary, e.g. observed, inferred, conjectural. Most geological maps tend to neglect the subsurface expression (subcrops etc). Models could also be built with subsurface geological boundaries (as digital node strings) tagged with levels of uncertainty; initial experience suggests three levels may again be practicable. Once tagged, these values could be used to autogenerate uncertainty plots. Whilst maps are predominantly explicit and based upon evidence and the conceptual understanding of the geologist, models of this type are less common and tend to be restricted to certain software methodologies. 
Many modelling packages are implicit, being driven by simple statistical interpolation or complex algorithms that build surfaces in ways that are invisible to, and so not controlled by, the working geologist. Such models have the advantage of being replicable within a software package and so can discount some interpretational differences between modellers. They can, however, create geologically implausible results unless good geological rules and control are established prior to model calculation. Comparisons of results from varied software packages yield surprisingly diverse results. This is a significant and often overlooked source of uncertainty in models. Expert elicitation is commonly employed to establish values used in statistical treatments of model uncertainty. However, this introduces another possible source of uncertainty created by the different judgements of the modellers. The pragmatic solution appears to be using panels of experienced geologists to elicit the values. Treatments of uncertainty in maps and models yield relative rather than absolute values even though many of these are expressed numerically. This makes it extremely difficult to devise standard methodologies to determine uncertainty or propose fixed numerical scales for expressing the results. Furthermore, these may give a misleading impression of greater certainty than actually exists. This contribution outlines general perceptions with regard to uncertainty in our maps and models and presents results from recent BGS studies.
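The boundary-tagging idea described above can be sketched in code. The three confidence levels mirror the traditional map line styles; the node coordinates and the per-segment aggregation rule are purely illustrative assumptions, not a BGS data model.

```python
from dataclasses import dataclass
from enum import IntEnum

class BoundaryConfidence(IntEnum):
    """Three practicable levels, mirroring traditional map line styles."""
    OBSERVED = 1
    INFERRED = 2
    CONJECTURAL = 3

@dataclass
class BoundaryNode:
    """One node of a digital node string for a subsurface boundary."""
    x: float
    y: float
    z: float
    confidence: BoundaryConfidence

# A hypothetical node string for one geological boundary (units arbitrary)
node_string = [
    BoundaryNode(0.0, 0.0, -12.5, BoundaryConfidence.OBSERVED),
    BoundaryNode(50.0, 0.0, -14.0, BoundaryConfidence.INFERRED),
    BoundaryNode(100.0, 0.0, -16.0, BoundaryConfidence.CONJECTURAL),
]

# Autogenerate a per-segment uncertainty value by taking the less certain
# (numerically higher) class of each segment's two end nodes
segment_uncertainty = [
    max(a.confidence, b.confidence)
    for a, b in zip(node_string, node_string[1:])
]
print([int(u) for u in segment_uncertainty])  # → [2, 3]
```

Once every node string carries such tags, an uncertainty plot reduces to colouring each segment by its class.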

  5. Removal of Asperger's syndrome from the DSM V: community response to uncertainty.

    PubMed

    Parsloe, Sarah M; Babrow, Austin S

    2016-01-01

    The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.

  6. The role of future scenarios to understand deep uncertainty

    EPA Science Inventory

    The environment and its interaction with human systems(economic, social and political) is complex and dynamic. Key drivers may disrupt system dynamics in unforeseen ways, making it difficult to predict future conditions precisely. This kind of deep uncertainty presents a challe...

  7. Re-evaluation of the sorption behaviour of Bromide and Sulfamethazine under field conditions using leaching data and modelling methods

    NASA Astrophysics Data System (ADS)

    Gassmann, Matthias; Olsson, Oliver; Höper, Heinrich; Hamscher, Gerd; Kümmerer, Klaus

    2016-04-01

    The simulation of reactive transport in the aquatic environment is hampered by the ambiguity of environmental fate process conceptualizations for a specific substance in the literature. Concepts are usually identified by experimental studies and inverse modelling under controlled lab conditions in order to reduce environmental uncertainties such as uncertain boundary conditions and input data. However, since environmental conditions affect substance behaviour, a re-evaluation might be necessary under environmental conditions, which might, in turn, be affected by uncertainties. Using a combination of experimental data and simulations of the leaching behaviour of the veterinary antibiotic Sulfamethazine (SMZ; synonym: sulfadimidine) and the hydrological tracer Bromide (Br) in a field lysimeter, we re-evaluated the sorption concepts of both substances under uncertain field conditions. Sampling data from a field lysimeter experiment, in which both substances were applied twice a year with manure and sampled at the bottom of two lysimeters over three subsequent years, were used for model set-up and evaluation. The total amounts of leached SMZ and Br were 22 μg and 129 mg, respectively. A reactive transport model was parameterized to the conditions of the two lysimeters filled with monoliths (depth 2 m, area 1 m²) of a sandy soil showing a low pH value under which Bromide is sorptive. We used different sorption concepts such as constant and organic-carbon dependent sorption coefficients and instantaneous and kinetic sorption equilibrium. Combining the sorption concepts resulted in four scenarios per substance with different equations for sorption equilibrium and sorption kinetics. The GLUE (Generalized Likelihood Uncertainty Estimation) method was applied to each scenario using parameter ranges found in experimental and modelling studies. 
The parameter spaces for each scenario were sampled using a Latin Hypercube method which was refined around local model efficiency maxima. Results of the cumulative SMZ leaching simulations suggest a best conceptualization combination of instantaneous sorption to organic carbon, which is consistent with the literature. The best Nash-Sutcliffe efficiency (Neff) was 0.96 and the 5th and 95th percentiles of the uncertainty estimation were 18 and 27 μg. In contrast, both scenarios of kinetic Br sorption had similar results (Neff = 0.99, uncertainty bounds 110-176 mg and 112-176 mg) but were clearly better than instantaneous sorption scenarios. Therefore, only the concept of sorption kinetics could be identified for Br modelling, whereas both tested sorption equilibrium coefficient concepts performed equally well. The reasons for this specific case of equifinality may be uncertainties of model input data under field conditions or an insensitivity of the sorption equilibrium method due to relatively low adsorption of Br. Our results show that it may be possible to identify, or at least falsify, specific sorption concepts under uncertain field conditions using a long-term leaching experiment and modelling methods. Cases of environmental fate concept equifinality raise the possibility of future model structure uncertainty analysis using an ensemble of models with different environmental fate concepts.
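The GLUE workflow described in this record can be sketched as follows. The toy leaching model, parameter range, behavioural threshold, and the use of Nash-Sutcliffe efficiency as the likelihood measure are illustrative assumptions standing in for the study's reactive transport model.

```python
import numpy as np

rng = np.random.default_rng(42)

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def toy_leaching_model(kd, times):
    """Hypothetical stand-in for the reactive transport model: cumulative
    leaching delayed by a sorption coefficient kd."""
    return 100.0 * (1.0 - np.exp(-times / (1.0 + 10.0 * kd)))

times = np.linspace(0.0, 36.0, 37)              # months
obs = toy_leaching_model(0.25, times)           # synthetic "observations"
obs += rng.normal(0.0, 2.0, size=obs.shape)

# Latin Hypercube sample of the sorption coefficient over an assumed range
n = 500
strata = (np.arange(n) + rng.random(n)) / n     # one sample per stratum
kd_samples = 0.01 + strata * (1.0 - 0.01)
rng.shuffle(kd_samples)

# GLUE: keep behavioural parameter sets, weight them by their likelihood
effs = np.array([nash_sutcliffe(obs, toy_leaching_model(kd, times))
                 for kd in kd_samples])
behavioural = effs > 0.5                        # assumed behavioural threshold
weights = effs[behavioural] / effs[behavioural].sum()

# Likelihood-weighted 5th/95th percentile bounds on final cumulative leaching
finals = np.array([toy_leaching_model(kd, times)[-1]
                   for kd in kd_samples[behavioural]])
order = np.argsort(finals)
cdf = np.cumsum(weights[order])
lower = finals[order][np.searchsorted(cdf, 0.05)]
upper = finals[order][np.searchsorted(cdf, 0.95)]
print(f"5th/95th uncertainty bounds: {lower:.1f} .. {upper:.1f}")
```

The refinement around local efficiency maxima mentioned in the abstract would amount to re-sampling more densely where `effs` peaks.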

  8. A Novel Uncertainty Framework for Improving Discharge Data Quality Using Hydraulic Modelling.

    NASA Astrophysics Data System (ADS)

    Mansanarez, V.; Westerberg, I.; Lyon, S. W.; Lam, N.

    2017-12-01

    Flood risk assessments rely on accurate discharge data records. Establishing a reliable stage-discharge (SD) rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived based on as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. We introduce an uncertainty framework using hydraulic modelling for developing SD rating curves and estimating their uncertainties. The proposed framework incorporates information from both the hydraulic configuration (bed slope, roughness, vegetation) and the information available in the stage-discharge observation data (gaugings). This method provides a direct estimation of the hydraulic configuration (slope, bed roughness and vegetation roughness). Discharge time series are estimated propagating stage records through posterior rating curve results. We applied this novel method to two Swedish hydrometric stations, accounting for uncertainties in the gaugings for the hydraulic model. Results from these applications were compared to discharge measurements and official discharge estimations. Sensitivity analysis was performed. We focused analyses on high-flow uncertainty and the factors that could reduce this uncertainty. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken.
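A minimal sketch of how gauging uncertainty can be propagated into a stage-discharge rating curve. The power-law curve form, cease-to-flow stage, gauging values, and error magnitude are assumptions for illustration; the paper's framework uses a full hydraulic model rather than this empirical fit.

```python
import numpy as np

rng = np.random.default_rng(0)

# A few stage-discharge gaugings (stage in m, discharge in m^3/s), illustrative
stage = np.array([0.8, 1.2, 1.8, 2.5, 3.1])
discharge = np.array([4.1, 11.8, 33.0, 74.5, 126.0])
rel_err = 0.06                     # assumed 6 % relative gauging uncertainty
h0 = 0.5                           # assumed cease-to-flow stage

def fit_power_law(h, q):
    """Fit Q = a * (h - h0)^b by least squares in log space."""
    b, log_a = np.polyfit(np.log(h - h0), np.log(q), 1)
    return np.exp(log_a), b

# Monte Carlo: perturb the gaugings within their uncertainty and refit
n = 2000
h_new = 2.0                        # stage at which to predict discharge
preds = np.empty(n)
for i in range(n):
    q_pert = discharge * (1.0 + rng.normal(0.0, rel_err, discharge.size))
    a, b = fit_power_law(stage, q_pert)
    preds[i] = a * (h_new - h0) ** b

q5, q50, q95 = np.percentile(preds, [5, 50, 95])
print(f"Q({h_new} m) ~ {q50:.1f} m^3/s  [{q5:.1f}, {q95:.1f}]")
```

The spread of `preds` widens sharply when `h_new` lies above the gauged range, which is the high-flow extrapolation problem the abstract targets.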

  9. Under the influence: Effects of adolescent ethanol exposure and anxiety on motivation for uncertain gambling-like cues in male and female rats.

    PubMed

    Hellberg, Samantha N; Levit, Jeremy D; Robinson, Mike J F

    2018-01-30

    Gambling disorder (GD) frequently co-occurs with alcohol use and anxiety disorders, suggesting possible shared mechanisms. Recent research suggests reward uncertainty may powerfully enhance attraction towards reward cues. Here, we examined the effects of adolescent ethanol exposure, anxiety, and reward uncertainty on cue-triggered motivation. Male and female adolescent rats were given free access to ethanol or control jello for 20 days. Following withdrawal, rats underwent autoshaping on a certain (100%-1) or uncertain (50%-1-2-3) reward contingency, followed by single-session conditioned reinforcement and progressive ratio tasks, and 7 days of omission training, during which lever pressing resulted in omission of reward. Finally, anxiety levels were quantified on the elevated plus maze. Here, we found that uncertainty narrowed cue attraction by significantly increasing the ratio of sign-tracking to goal-tracking, particularly amongst control jello and high anxiety animals, but not in animals exposed to ethanol during adolescence. In addition, attentional bias towards the lever cue was more persistent under uncertain conditions following omission training. We also found that females consumed more ethanol, and that uncertainty mitigated the anxiolytic effects of ethanol exposure observed in high ethanol intake animals under certainty conditions. Our results further support that reward uncertainty biases attraction towards reward cues, suggesting also that heightened anxiety may enhance vulnerability to the effects of reward uncertainty. Chronic, elevated alcohol consumption may contribute to heightened anxiety levels, while high anxiety may promote the over-attribution of incentive value to reward cues, highlighting possible mechanisms that may drive concurrent anxiety, heavy drinking, and problematic gambling. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Individual Uncertainty and the Uncertainty of Science: The Impact of Perceived Conflict and General Self-Efficacy on the Perception of Tentativeness and Credibility of Scientific Information.

    PubMed

    Flemming, Danny; Feinkohl, Insa; Cress, Ulrike; Kimmerle, Joachim

    2015-01-01

    We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople's understanding of the uncertainty of scientific information, with focus on the detection of tentativeness and perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview about the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text that dealt with a study about DBS as experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, implicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to be contradictory to large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. 
We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that scientific writers should consider how to present scientific results when compiling pertinent texts.

  11. Individual Uncertainty and the Uncertainty of Science: The Impact of Perceived Conflict and General Self-Efficacy on the Perception of Tentativeness and Credibility of Scientific Information

    PubMed Central

    Flemming, Danny; Feinkohl, Insa; Cress, Ulrike; Kimmerle, Joachim

    2015-01-01

    We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople’s understanding of the uncertainty of scientific information, with focus on the detection of tentativeness and perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview about the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text that dealt with a study about DBS as experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, implicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to be contradictory to large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. 
We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that scientific writers should consider how to present scientific results when compiling pertinent texts. PMID:26648902

  12. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set-up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. 
This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them for future streamflow projections and segregate the contribution of various sources to the uncertainty.
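The ANOVA-based segregation of uncertainty sources described above can be illustrated with a toy two-way decomposition; the GCM-by-scenario projection values below are invented for illustration, not results from the Upper Ganga Basin study.

```python
import numpy as np

# Hypothetical streamflow projections (% change), indexed by GCM x scenario.
# Rows: 3 GCMs, columns: 2 emission scenarios -- illustrative numbers only.
y = np.array([[-12.0,  -8.0],
              [ -4.0,  -1.0],
              [ -9.0, -14.0]])

grand = y.mean()
gcm_effect = y.mean(axis=1) - grand          # GCM main effects
scen_effect = y.mean(axis=0) - grand         # scenario main effects
interaction = y - grand - gcm_effect[:, None] - scen_effect[None, :]

# Sums of squares for a two-way ANOVA without replication; they add up
# exactly to the total sum of squares
ss_total = ((y - grand) ** 2).sum()
ss_gcm = y.shape[1] * (gcm_effect ** 2).sum()
ss_scen = y.shape[0] * (scen_effect ** 2).sum()
ss_inter = (interaction ** 2).sum()

for name, ss in [("GCM", ss_gcm), ("scenario", ss_scen),
                 ("interaction", ss_inter)]:
    print(f"{name:12s} {100.0 * ss / ss_total:5.1f} % of variance")
```

The study's fuller design adds land use scenarios, the stationarity assumption, and internal variability as extra factors, but the partitioning principle is the same.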

  13. Uncertainty analysis of the simulations of effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota

    USGS Publications Warehouse

    Wesolowski, Edwin A.

    1996-01-01

    Two separate studies to simulate the effects of discharging treated wastewater to the Red River of the North at Fargo, North Dakota, and Moorhead, Minnesota, have been completed. In the first study, the Red River at Fargo Water-Quality Model was calibrated and verified for ice-free conditions. In the second study, the Red River at Fargo Ice-Cover Water-Quality Model was verified for ice-cover conditions. To better understand and apply the Red River at Fargo Water-Quality Model and the Red River at Fargo Ice-Cover Water-Quality Model, the uncertainty associated with simulated constituent concentrations and property values was analyzed and quantified using the Enhanced Stream Water Quality Model-Uncertainty Analysis. The Monte Carlo simulation and first-order error analysis methods were used to analyze the uncertainty in simulated values for six constituents and properties at sites 5, 10, and 14 (upstream to downstream order). The constituents and properties analyzed for uncertainty are specific conductance, total organic nitrogen (reported as nitrogen), total ammonia (reported as nitrogen), total nitrite plus nitrate (reported as nitrogen), 5-day carbonaceous biochemical oxygen demand for ice-cover conditions and ultimate carbonaceous biochemical oxygen demand for ice-free conditions, and dissolved oxygen. Results are given in detail for both the ice-cover and ice-free conditions for specific conductance, total ammonia, and dissolved oxygen. The sensitivity and uncertainty of the simulated constituent concentrations and property values to input variables differ substantially between ice-cover and ice-free conditions. During ice-cover conditions, simulated specific-conductance values are most sensitive to the headwater-source specific-conductance values upstream of site 10 and the point-source specific-conductance values downstream of site 10. These headwater-source and point-source specific-conductance values also are the key sources of uncertainty. 
Simulated total ammonia concentrations are most sensitive to the point-source total ammonia concentrations at all three sites. Other input variables that contribute substantially to the variability of simulated total ammonia concentrations are the headwater-source total ammonia and the instream reaction coefficient for biological decay of total ammonia to total nitrite. Simulated dissolved-oxygen concentrations at all three sites are most sensitive to headwater-source dissolved-oxygen concentration. This input variable is the key source of variability for simulated dissolved-oxygen concentrations at sites 5 and 10. Headwater-source and point-source dissolved-oxygen concentrations are the key sources of variability for simulated dissolved-oxygen concentrations at site 14. During ice-free conditions, simulated specific-conductance values at all three sites are most sensitive to the headwater-source specific-conductance values. Headwater-source specific-conductance values also are the key source of uncertainty. The input variables to which total ammonia and dissolved oxygen are most sensitive vary from site to site and may or may not correspond to the input variables that contribute the most to the variability. The input variables that contribute the most to the variability of simulated total ammonia concentrations are point-source total ammonia, instream reaction coefficient for biological decay of total ammonia to total nitrite, and Manning's roughness coefficient. The input variables that contribute the most to the variability of simulated dissolved-oxygen concentrations are reaeration rate, sediment oxygen demand rate, and headwater-source algae as chlorophyll a.
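The first-order error analysis used in this record can be sketched with a hypothetical stand-in for the dissolved-oxygen model; the input means, standard deviations, and the model itself are invented for illustration, and a Monte Carlo run checks the first-order variance estimate.

```python
import numpy as np

rng = np.random.default_rng(1)

def do_model(do_headwater, do_point, k_decay):
    """Hypothetical stand-in for the stream DO model: a weighted mix of
    headwater and point-source DO, reduced by in-stream decay."""
    return (0.7 * do_headwater + 0.3 * do_point) * np.exp(-k_decay)

means = np.array([8.0, 6.0, 0.2])      # assumed input means
sds = np.array([0.5, 0.8, 0.05])       # assumed input standard deviations

# First-order error analysis: each input's variance contribution is
# (partial derivative * input standard deviation)^2
eps = 1e-6
base = do_model(*means)
contrib = np.empty(3)
for i in range(3):
    bumped = means.copy()
    bumped[i] += eps
    deriv = (do_model(*bumped) - base) / eps
    contrib[i] = (deriv * sds[i]) ** 2
var_foea = contrib.sum()

# Monte Carlo estimate of the same output variance, for comparison
samples = rng.normal(means, sds, size=(50_000, 3))
var_mc = do_model(samples[:, 0], samples[:, 1], samples[:, 2]).var()

print("fractional contributions:", np.round(contrib / var_foea, 3))
```

Ranking the entries of `contrib` is exactly the kind of "key source of variability" statement the abstract reports per site and season.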

  14. Possible future changes in extreme events over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, Erwan; Sokolov, Andrei; Scott, Jeffery

    2013-04-01

    In this study, we investigate possible future climate change over Northern Eurasia and its impact on extreme events. Northern Eurasia is a major player in the global carbon budget because of boreal forests and peatlands. Circumpolar boreal forests alone contain more than five times the amount of carbon of temperate forests and almost double the amount of carbon of the world's tropical forests. Furthermore, severe permafrost degradation associated with climate change could result in peatlands releasing large amounts of carbon dioxide and methane. Meanwhile, changes in the frequency and magnitude of extreme events, such as extreme precipitation, heat waves or frost days are likely to have substantial impacts on Northern Eurasia ecosystems. For this reason, it is very important to quantify the possible climate change over Northern Eurasia under different emissions scenarios, while accounting for the uncertainty in the climate response and changes in extreme events. For several decades, the Massachusetts Institute of Technology (MIT) Joint Program on the Science and Policy of Global Change has been investigating uncertainty in climate change using the MIT Integrated Global System Model (IGSM) framework, an integrated assessment model that couples an earth system model of intermediate complexity (with a 2D zonal-mean atmosphere) to a human activity model. In this study, regional change is investigated using the MIT IGSM-CAM framework that links the IGSM to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). New modules were developed and implemented in CAM to allow climate parameters to be changed to match those of the IGSM. The simulations presented in this paper were carried out for two emission scenarios, a "business as usual" scenario and a 660 ppm of CO2-equivalent stabilization, which are similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios. 
Values of climate sensitivity and net aerosol forcing used in the simulations within the IGSM-CAM framework provide a good approximation for the median, and the lower and upper bound of 90% probability distribution of 21st century climate change. Five-member ensembles were carried out for each choice of parameters using different initial conditions. With these simulations, we investigate the role of emissions scenarios (climate policies), the global climate response (climate sensitivity) and natural variability (initial conditions) on the uncertainty in future climate changes over Northern Eurasia. Particular emphasis is placed on future changes in extreme events, including frost days, extreme summer temperature and extreme summer and winter precipitation.

  15. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    NASA Astrophysics Data System (ADS)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
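Gaussian Mixture Regression, the GMR half of the HMM-GMR approach, derives p(y | x) from a joint mixture over predictand and covariates. The sketch below shows that conditioning step for a hand-set two-component mixture; the parameters are illustrative and not learned via the Baum-Welch algorithm as in the paper.

```python
import numpy as np

# Hypothetical two-component Gaussian mixture over (antecedent flow x, flow y)
weights = np.array([0.6, 0.4])
means = np.array([[10.0, 12.0],    # "dry" regime
                  [40.0, 45.0]])   # "wet" regime
covs = np.array([[[4.0, 3.0], [3.0, 5.0]],
                 [[25.0, 20.0], [20.0, 30.0]]])

def gmr_predict(x):
    """Gaussian Mixture Regression: mean and variance of p(y | x)."""
    sxx, sxy, syy = covs[:, 0, 0], covs[:, 0, 1], covs[:, 1, 1]
    # Per-component conditional Gaussian of y given x
    cond_mean = means[:, 1] + sxy / sxx * (x - means[:, 0])
    cond_var = syy - sxy ** 2 / sxx
    # Responsibilities proportional to weight * marginal density of x
    # (shared constant factors cancel after normalization)
    resp = weights * np.exp(-0.5 * (x - means[:, 0]) ** 2 / sxx) / np.sqrt(sxx)
    resp /= resp.sum()
    mean = resp @ cond_mean
    # Law of total variance across mixture components
    var = resp @ cond_var + resp @ (cond_mean - mean) ** 2
    return mean, var

m, v = gmr_predict(12.0)
print(f"p(y | x = 12) ~ N({m:.2f}, {v:.2f})")
```

In the full HMM-GMR scheme the mixture weights would additionally be modulated by the hidden state probabilities, letting different climate regimes supply different predictive distributions.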

  16. Multi-sensor system for in situ shape monitoring and damage identification of high-speed composite rotors

    NASA Astrophysics Data System (ADS)

    Philipp, K.; Filippatos, A.; Kuschmierz, R.; Langkamp, A.; Gude, M.; Fischer, A.; Czarske, J.

    2016-08-01

    Glass fibre-reinforced polymer (GFRP) composites offer a higher stiffness-to-weight ratio than conventional rotor materials used in turbomachinery. However, the material behaviour of GFRP high-speed rotors is difficult to predict due to the complexity of the composite material and the dynamic loading conditions. Consequently, dynamic expansion measurements of GFRP rotors are required in situ and with micron precision. However, the whirling motion amplitude is about two orders of magnitude higher than the desired precision. To overcome this problem, a multi-sensor system capable of separating rotor expansion and whirling motion is proposed. High measurement rates well above the rotational frequency and micron uncertainty are achieved at whirling amplitudes up to 120 μm and surface velocities up to 300 m/s. The dynamic elliptical expansion of a GFRP rotor is investigated in a rotor loading test rig under vacuum conditions. In situ measurements identified not only the introduced damage but also damage initiation and propagation.

  17. Estimation of the probability of success in petroleum exploration

    USGS Publications Warehouse

    Davis, J.C.

    1977-01-01

    A probabilistic model for oil exploration can be developed by assessing the conditional relationship between perceived geologic variables and the subsequent discovery of petroleum. Such a model includes two probabilistic components, the first reflecting the association between a geologic condition (structural closure, for example) and the occurrence of oil, and the second reflecting the uncertainty associated with the estimation of geologic variables in areas of limited control. Estimates of the conditional relationship between geologic variables and subsequent production can be found by analyzing the exploration history of a "training area" judged to be geologically similar to the exploration area. The geologic variables are assessed over the training area using an historical subset of the available data, whose density corresponds to the present control density in the exploration area. The success or failure of wells drilled in the training area subsequent to the time corresponding to the historical subset provides empirical estimates of the probability of success conditional upon geology. Uncertainty in perception of geological conditions may be estimated from the distribution of errors made in geologic assessment using the historical subset of control wells. These errors may be expressed as a linear function of distance from available control. Alternatively, the uncertainty may be found by calculating the semivariogram of the geologic variables used in the analysis: the two procedures will yield approximately equivalent results. The empirical probability functions may then be transferred to the exploration area and used to estimate the likelihood of success of specific exploration plays. These estimates will reflect both the conditional relationship between the geological variables used to guide exploration and the uncertainty resulting from lack of control. The technique is illustrated with case histories from the mid-Continent area of the U.S.A. © 1977 Plenum Publishing Corp.
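The semivariogram mentioned above as an alternative way to quantify perception uncertainty can be sketched with the classical estimator; the well locations and the "geologic variable" below are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic geologic variable at control wells: a smooth spatial trend plus
# noise, standing in for e.g. structural closure mapped from sparse control
x = rng.uniform(0.0, 100.0, 200)                  # well locations (km)
z = np.sin(x / 15.0) + rng.normal(0.0, 0.1, x.size)

def semivariogram(x, z, lags):
    """Classical estimator: gamma(h) = mean of 0.5*(z_i - z_j)^2 over
    point pairs whose separation falls in each lag bin."""
    dx = np.abs(x[:, None] - x[None, :])
    dz2 = 0.5 * (z[:, None] - z[None, :]) ** 2
    gamma = []
    for lo, hi in zip(lags[:-1], lags[1:]):
        mask = (dx > lo) & (dx <= hi)
        gamma.append(dz2[mask].mean())
    return np.array(gamma)

lags = np.linspace(0.0, 30.0, 7)                  # lag bin edges (km)
gamma = semivariogram(x, z, lags)
print(np.round(gamma, 3))
```

The rising gamma values with lag distance express exactly the article's point: uncertainty in a perceived geologic variable grows with distance from control wells.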

  18. Topology optimization under stochastic stiffness

    NASA Astrophysics Data System (ADS)

    Asadpoure, Alireza

    Topology optimization is a systematic computational tool for optimizing the layout of materials within a domain for engineering design problems. It allows variation of structural boundaries and connectivities. This freedom in the design space often enables discovery of new, high performance designs. However, solutions obtained by performing the optimization in a deterministic setting may be impractical or suboptimal when considering real-world engineering conditions with inherent variabilities including (for example) variabilities in fabrication processes and operating conditions. The aim of this work is to provide a computational methodology for topology optimization in the presence of uncertainties associated with structural stiffness, such as uncertain material properties and/or structural geometry. Existing methods for topology optimization under deterministic conditions are first reviewed. Modifications are then proposed to improve the numerical performance of the so-called Heaviside Projection Method (HPM) in continuum domains. Next, two approaches, perturbation and Polynomial Chaos Expansion (PCE), are proposed to account for uncertainties in the optimization procedure. These approaches are intrusive, allowing tight and efficient coupling of the uncertainty quantification with the optimization sensitivity analysis. The work herein develops a robust topology optimization framework aimed at reducing the sensitivity of optimized solutions to uncertainties. The perturbation-based approach combines deterministic topology optimization with a perturbation method for the quantification of uncertainties. The use of perturbation transforms the problem of topology optimization under uncertainty to an augmented deterministic topology optimization problem. The PCE approach combines the spectral stochastic approach for the representation and propagation of uncertainties with an existing deterministic topology optimization technique. 
The resulting compact representations for the response quantities allow for efficient and accurate calculation of sensitivities of response statistics with respect to the design variables. The proposed methods are shown to be successful at generating robust optimal topologies. Examples from topology optimization in continuum and discrete domains (truss structures) under uncertainty are presented. It is also shown that proposed methods lead to significant computational savings when compared to Monte Carlo-based optimization which involve multiple formations and inversions of the global stiffness matrix and that results obtained from the proposed method are in excellent agreement with those obtained from a Monte Carlo-based optimization algorithm.
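The perturbation-based propagation of stiffness uncertainty can be illustrated on a trivially small system; the two-spring model and its statistics below are assumptions for illustration, not the dissertation's finite-element topology-optimization formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two springs in series under a unit load: u = F * (1/k1 + 1/k2).
# First-order perturbation of the mean and variance of the tip displacement
# with respect to the uncertain stiffnesses.
F = 1.0
k_mean = np.array([100.0, 150.0])   # assumed mean stiffnesses
k_sd = np.array([10.0, 12.0])       # assumed stiffness standard deviations

def tip_displacement(k):
    return F * (1.0 / k[..., 0] + 1.0 / k[..., 1])

# Analytic first-order sensitivities: d u / d k_i = -F / k_i^2
sens = -F / k_mean ** 2
u_mean_pert = tip_displacement(k_mean)
u_var_pert = np.sum((sens * k_sd) ** 2)

# Monte Carlo reference, the approach the perturbation method avoids
k_samples = rng.normal(k_mean, k_sd, size=(100_000, 2))
u_mc = tip_displacement(k_samples)
print(u_mean_pert, u_var_pert, u_mc.var())
```

In the robust-optimization setting, `u_var_pert` (or a weighted mean-plus-variance objective) would enter the sensitivity analysis directly, which is why the intrusive coupling avoids the many stiffness-matrix factorizations a Monte Carlo loop requires.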

  19. Water, heat, and vapor flow in a deep vadose zone under arid and hyper-arid conditions: a numerical study.

    NASA Astrophysics Data System (ADS)

    Madi, Raneem; de Rooij, Gerrit H.

    2017-04-01

    Groundwater recharge in arid regions is notoriously difficult to quantify. One reason is data scarcity: reliable weather records (rainfall, potential evapotranspiration rate, temperature) are typically lacking, the soil properties over the entire extent of the often very deep vadose zone are usually unknown, and the effect of sparse vegetation, wadis, (biological) soil crusts, and hard pans on infiltration and evaporation is difficult to quantify. Another reason is the difficulty of modeling the intricately coupled relevant processes over extended periods of time: coupled flow of liquid water, water vapor, and heat in a very deep soil, in view of the considerable uncertainty at the soil surface indicated above, and over large spatial extents. In view of this myriad of problems, we limited ourselves to the simulation of one-dimensional coupled flow of water, heat, and vapor in an unvegetated deep vadose zone. The conventional parameterizations of the soil hydraulic properties perform poorly under very dry conditions. We therefore selected an alternative that was developed specifically for dry circumstances and modified another to eliminate the physically implausible residual water content that rendered it of limited use for desert environments. The issue of data scarcity was resolved by using numerically generated rainfall records combined with a simple model for annual and daily temperature fluctuations. The soil was uniform, and the water table was fixed at a depth of 100 m, which provided the lower boundary condition. The geothermal gradient determined the temperature at the groundwater level. We generated two scenarios with 120 years of weather in an arid and a hyper-arid climate. The initial condition was established by first starting from a somewhat arbitrary unit-gradient initial condition corresponding to a small fraction of the annual average rainfall and letting the model run through the 120-year atmospheric forcing.
The resulting profile of matric potential and temperature served as the initial condition for a 240-year warm-up period during which the weather record was repeated, followed by the 120-year cycle we used for analysis. We will present the initial results of our analysis:
- the dynamics (or lack thereof) of groundwater recharge, and the role of wet years (or clusters of years) and of droughts in the amount of recharge;
- the speed with which the atmospheric input signal travels downward, and the damping of the signal on its way down;
- the role of vapor flow under geothermal conditions.
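The "simple model for annual and daily temperature fluctuations" mentioned above can be pictured as two superposed sinusoids driving the surface boundary condition. The sketch below is only an illustration of that idea; the authors' actual generator and parameter values are not given in the abstract, so the mean temperature and amplitudes here are invented placeholders for a hyper-arid site.

```python
import numpy as np

# Hypothetical parameters (assumptions, not from the paper):
T_MEAN = 22.0   # degC, annual mean air temperature
A_YEAR = 10.0   # degC, amplitude of the annual cycle
A_DAY = 8.0     # degC, amplitude of the diurnal cycle

def air_temperature(t_days):
    """Surface air temperature as superposed annual and daily sinusoids."""
    annual = A_YEAR * np.sin(2.0 * np.pi * t_days / 365.25)
    daily = A_DAY * np.sin(2.0 * np.pi * t_days)
    return T_MEAN + annual + daily

# Hourly forcing over the 120-year simulation window.
t = np.arange(0.0, 120 * 365.25, 1.0 / 24.0)
T = air_temperature(t)
```

A record like `T`, paired with a stochastic rainfall generator, would supply the upper boundary forcing for the coupled water-heat-vapor model over the full 120-year cycle.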

  20. Isokinetic TWC Evaporator Probe: Calculations and Systemic Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Davison, Craig R.; Strapp, J. Walter; Lilie, Lyle; Ratvasky, Thomas P.; Dumont, Christopher

    2016-01-01

    A new Isokinetic Total Water Content Evaporator (IKP2) was downsized from a prototype instrument, specifically to make airborne measurements of hydrometeor total water content (TWC) in deep tropical convective clouds to assess the new ice-crystal Appendix D icing envelope. The probe underwent numerous laboratory and wind-tunnel investigations to ensure reliable operation under the difficult high-altitude/speed/TWC conditions under which other TWC instruments have been known to either fail or have unknown performance characteristics; those results are presented in a companion paper. This paper presents the equations used to determine the TWC of the sampled atmosphere from the values measured by the IKP2 or from necessary ancillary data from other instruments. The uncertainty in the final TWC is determined by propagating the uncertainty in the measured values through the calculations to the final result. Two techniques were used and the results compared: the first is a typical analytical method of propagating uncertainty, and the second performs a Monte Carlo simulation. The results are very similar, with differences that are insignificant for practical purposes. The uncertainty is between 2 and 3 percent at most practical operating conditions. The capture efficiency of the IKP2 was also examined, based on a computational fluid dynamics simulation of the original IKP scaled down to the IKP2. Particles above 24 microns were found to have a capture efficiency greater than 99 percent at all operating conditions.
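The comparison of analytical and Monte Carlo uncertainty propagation described above can be sketched on a stand-in formula. The IKP2's actual TWC equations are in the paper and are not reproduced here; the example below uses a generic quotient-of-measurements expression with invented nominal values and roughly 1 percent input uncertainties, and shows that first-order analytical propagation and Monte Carlo sampling agree closely in that regime, as the abstract reports for the real instrument.

```python
import numpy as np

# Illustrative stand-in for a TWC-style calculation (not the IKP2 equations):
# a result formed as f(x, y, z) = x * y / z from three measured quantities.
x0, y0, z0 = 5.0, 120.0, 0.8        # nominal measured values (arbitrary units)
sx, sy, sz = 0.05, 1.2, 0.008       # 1-sigma uncertainties, about 1% each

def f(x, y, z):
    return x * y / z

# Analytical (first-order Taylor) propagation: for products and quotients,
# relative variances add in quadrature.
rel = np.sqrt((sx / x0)**2 + (sy / y0)**2 + (sz / z0)**2)
u_analytic = rel * f(x0, y0, z0)

# Monte Carlo propagation: sample the inputs, evaluate, take the sample std.
rng = np.random.default_rng(1)
n = 500_000
samples = f(rng.normal(x0, sx, n),
            rng.normal(y0, sy, n),
            rng.normal(z0, sz, n))
u_mc = samples.std()
```

For small relative uncertainties the nonlinearity of the quotient barely matters, so the two estimates coincide to well within the Monte Carlo sampling noise, mirroring the paper's finding that the differences between the two techniques are practically insignificant.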
