Science.gov

Sample records for flooring sensitivity uncertainty

  1. Dark matter astrophysical uncertainties and the neutrino floor

    NASA Astrophysics Data System (ADS)

    O'Hare, Ciaran A. J.

    2016-09-01

The search for weakly interacting massive particles (WIMPs) by direct detection faces an encroaching background due to coherent neutrino-nucleus scattering. For a given WIMP mass, the cross section at which neutrinos constitute a dominant background depends on the uncertainty on the flux of each neutrino source, principally from the Sun, supernovae, or atmospheric cosmic ray collisions. However, there are also considerable uncertainties in the astrophysical ingredients of the predicted WIMP signal. Uncertainties in the velocity of the Sun with respect to the Milky Way dark matter halo, the local density of WIMPs, and the shape of the local WIMP speed distribution all affect the expected event rate in direct detection experiments and hence change the region of the WIMP parameter space for which neutrinos are a significant background. In this work we extend the neutrino floor calculation to account for the uncertainty in the astrophysics dependence of the WIMP signal. We show the effect of uncertainties on projected discovery limits with an emphasis on low WIMP masses (less than 10 GeV), where solar neutrino backgrounds are most important. We find that accounting for astrophysical uncertainties not only changes the shape of the neutrino floor as a function of WIMP mass but also causes it to appear at cross sections up to an order of magnitude larger, extremely close to existing experimental limits, indicating that neutrino backgrounds will become an issue sooner than previously thought. We also explore how neutrinos hinder the estimation of WIMP parameters and how astrophysical uncertainties impact the discrimination of WIMPs and neutrinos with the use of their respective time dependencies.

  2. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  3. Dark matter vs. neutrinos: the effect of astrophysical uncertainties and timing information on the neutrino floor

    SciTech Connect

    Davis, Jonathan H.

    2015-03-09

Future multi-tonne Direct Detection experiments will be sensitive to solar neutrino induced nuclear recoils, which form an irreducible background to light Dark Matter searches. Indeed, for masses around 6 GeV, the spectra of neutrinos and Dark Matter are so similar that experiments are said to run into a neutrino floor, for which sensitivity increases only marginally with exposure beyond a certain cross section. In this work we show that this floor can be overcome using the different annual modulations expected from solar neutrinos and Dark Matter. Specifically, for cross sections below the neutrino floor, the DM signal is observable through a phase shift and a smaller amplitude of the time-dependent event rate. This allows the exclusion power to be improved by up to an order of magnitude for large exposures. In addition, we demonstrate that, using only spectral information, the neutrino floor exists over a wider mass range than has been previously shown, since the large uncertainties in the Dark Matter velocity distribution make the signal spectrum harder to distinguish from the neutrino background. However, for most velocity distributions it can still be surpassed using timing information, so the neutrino floor is not an absolute limit on the sensitivity of Direct Detection experiments.
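The timing discrimination described above can be illustrated with a toy calculation. The solar-neutrino rate peaks near perihelion (around January 4) and a standard-halo DM recoil rate peaks around June 2; the means, amplitudes, and phase days below are illustrative round numbers, not fitted values from the paper:

```python
import math

# Toy annual-modulation comparison (illustrative parameters only).
def rate(t_days, mean, amplitude, peak_day):
    phase = 2 * math.pi * (t_days - peak_day) / 365.25
    return mean * (1 + amplitude * math.cos(phase))

def combined(t_days):
    neutrinos = rate(t_days, mean=10.0, amplitude=0.033, peak_day=4)    # perihelion
    dark_matter = rate(t_days, mean=2.0, amplitude=0.05, peak_day=152)  # ~Jun 2
    return neutrinos + dark_matter

# The summed signal peaks between the two component phases and with a
# smaller relative amplitude than either component alone -- the phase
# shift and amplitude change the abstract refers to.
peak = max(range(366), key=combined)
print(f"combined rate peaks on day {peak} of the year")
```

Adding even a subdominant DM component drags the peak of the total rate away from the pure-neutrino phase, which is what makes the time dependence a usable discriminant below the floor.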

  5. Sensitivity and Uncertainty Analysis Shell

    1999-04-20

SUNS (Sensitivity and Uncertainty Analysis Shell) is a 32-bit application that runs under Windows 95/98 and Windows NT. It is designed to aid in statistical analyses for a broad range of applications. The class of problems for which SUNS is suitable is generally defined by two requirements: 1. A computer code is developed or acquired that models some process for which input is uncertain, and the user is interested in statistical analysis of the output of that code. 2. The statistical analysis of interest can be accomplished using Monte Carlo analysis. The implementation then requires that the user identify which inputs to the process model are to be manipulated for statistical analysis. With this information, the changes required to loosely couple SUNS with the process model can be completed. SUNS is then used to generate the required statistical sample, and the user-supplied process model analyses the sample. The SUNS post-processor displays statistical results from any existing file that contains sampled input and output values.
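The loose-coupling workflow SUNS implements (generate a statistical sample, run the user-supplied process model on it, post-process the output) can be sketched as follows; the process model and its input distributions are hypothetical stand-ins, not anything shipped with SUNS:

```python
import math
import random
import statistics

# Hypothetical stand-in for the user-supplied process model:
# first-order decay of an initial concentration over 10 time units.
def process_model(decay_rate, c0):
    return c0 * math.exp(-decay_rate * 10.0)

rng = random.Random(42)

# 1. Generate the statistical sample for the uncertain inputs.
samples = [(rng.uniform(0.05, 0.15), rng.gauss(100.0, 5.0))
           for _ in range(1000)]

# 2. The loosely coupled process model analyses each sampled input.
outputs = [process_model(decay_rate, c0) for decay_rate, c0 in samples]

# 3. Post-process: summary statistics of the sampled output.
print(f"mean output: {statistics.mean(outputs):.2f}")
print(f"stdev      : {statistics.stdev(outputs):.2f}")
```

The key design point is the loose coupling: the sampler never needs to know the model's internals, only which inputs to perturb and where the outputs land.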

  6. Extended Forward Sensitivity Analysis for Uncertainty Quantification

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau

    2013-01-01

This paper presents the extended forward sensitivity analysis as a method to aid uncertainty quantification. By including the time step, and potentially the spatial step, as special sensitivity parameters, the forward sensitivity method is extended into a method for quantifying numerical errors. Note that by integrating local truncation errors over the whole system through the forward sensitivity analysis process, the generated time step and spatial step sensitivity information reflects global numerical errors. The discretization errors can then be systematically compared against uncertainties due to other physical parameters. This extension makes the forward sensitivity method a much more powerful tool for uncertainty quantification. By knowing the sensitivity of the time and space steps relative to the other physical parameters of interest, the simulation can be run at optimized time and space steps without affecting the confidence of the physical parameter sensitivity results. The time and space step forward sensitivity analysis method can also replace the traditional time step and grid convergence study at much lower computational cost. Two well-defined benchmark problems with manufactured solutions are used to demonstrate the method.
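A minimal sketch of forward sensitivity analysis on a problem with a manufactured (analytic) solution, in the spirit of the benchmarks described above; the ODE, step size, and parameter values are illustrative, not the paper's actual test cases:

```python
import math

# Forward sensitivity for the test problem dy/dt = -p*y:
# the sensitivity s = dy/dp obeys ds/dt = -p*s - y and is integrated
# alongside the state (explicit Euler here, for brevity).
def forward_sensitivity(p, y0, t_end, dt):
    n_steps = round(t_end / dt)
    y, s = y0, 0.0
    for _ in range(n_steps):
        y, s = y + dt * (-p * y), s + dt * (-p * s - y)
    return y, s

p, y0, t_end = 0.5, 1.0, 2.0
y, s = forward_sensitivity(p, y0, t_end, dt=1e-4)

# Compare against the manufactured solution:
# y = y0*exp(-p*t),  dy/dp = -t*y0*exp(-p*t)
print(f"y = {y:.6f} (exact {math.exp(-p * t_end):.6f})")
print(f"s = {s:.6f} (exact {-t_end * math.exp(-p * t_end):.6f})")
```

Treating dt itself as an extra sensitivity parameter, as the paper proposes, follows the same pattern: one more auxiliary equation integrated alongside the state.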

  7. LCA data quality: sensitivity and uncertainty analysis.

    PubMed

    Guo, M; Murphy, R J

    2012-10-01

Life cycle assessment (LCA) data quality issues were investigated using case studies on products from starch-polyvinyl alcohol based biopolymers and petrochemical alternatives. The time horizon chosen for the characterization models was shown to be an important sensitive parameter for the environmental profiles of all the polymers. In the global warming potential and toxicity potential categories, the comparison between biopolymers and petrochemical counterparts altered as the time horizon extended from 20 years to infinite time. These case studies demonstrated that the use of a single time horizon provides only one perspective on the LCA outcomes, which could introduce an inadvertent bias, especially in toxicity impact categories; dynamic LCA characterization models with varying time horizons are therefore recommended as a measure of robustness for LCAs, especially comparative assessments. This study also presents an approach to integrating statistical methods into LCA models for analyzing uncertainty in industrial and computer-simulated datasets. We calibrated probabilities for the LCA outcomes for biopolymer products arising from uncertainty in the inventory and from data variation characteristics; this enabled assigning confidence to the LCIA outcomes in specific impact categories for the biopolymer vs. petrochemical polymer comparisons undertaken. The uncertainty analysis, combined with the sensitivity analysis carried out in this study, has led to a transparent increase in confidence in the LCA findings. We conclude that LCAs lacking explicit interpretation of the degree of uncertainty and sensitivity are of limited value as robust evidence for decision making or comparative assertions. PMID:22854094

  8. Uncertainty Quantification of Equilibrium Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Lucas, D. D.; Brandon, S. T.; Covey, C. C.; Domyancic, D. M.; Johannesson, G.; Klein, R.; Tannahill, J.; Zhang, Y.

    2011-12-01

    Significant uncertainties exist in the temperature response of the climate system to changes in the levels of atmospheric carbon dioxide. We report progress to quantify the uncertainties of equilibrium climate sensitivity using perturbed parameter ensembles of the Community Earth System Model (CESM). Through a strategic initiative at the Lawrence Livermore National Laboratory, we have been developing uncertainty quantification (UQ) methods and incorporating them into a software framework called the UQ Pipeline. We have applied this framework to generate a large number of ensemble simulations using Latin Hypercube and other schemes to sample up to three dozen uncertain parameters in the atmospheric (CAM) and sea ice (CICE) model components of CESM. The parameters sampled are related to many highly uncertain processes, including deep and shallow convection, boundary layer turbulence, cloud optical and microphysical properties, and sea ice albedo. An extensive ensemble database comprised of more than 46,000 simulated climate-model-years of recent climate conditions has been assembled. This database is being used to train surrogate models of CESM responses and to perform statistical calibrations of the CAM and CICE models given observational data constraints. The calibrated models serve as a basis for propagating uncertainties forward through climate change simulations using a slab ocean model configuration of CESM. This procedure is being used to quantify the probability density function of equilibrium climate sensitivity accounting for uncertainties in climate model processes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013. (LLNL-ABS-491765)
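A minimal Latin Hypercube sampler in the spirit of the ensemble generation described above; the two parameter ranges are invented for illustration (the UQ Pipeline samples up to three dozen CAM/CICE parameters with far more elaborate machinery):

```python
import random

def latin_hypercube(n_samples, bounds, rng):
    """Minimal LHS: one uniform draw inside each of n_samples equal-width
    strata per parameter, then shuffle each column to decouple dimensions."""
    columns = []
    for lo, hi in bounds:
        column = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(column)
        columns.append([lo + u * (hi - lo) for u in column])
    return list(zip(*columns))  # rows are parameter vectors for one run each

rng = random.Random(0)
# Hypothetical parameter ranges, e.g. a convective timescale (s) and an albedo.
bounds = [(1800.0, 28800.0), (0.4, 0.9)]
ensemble = latin_hypercube(8, bounds, rng)
for member in ensemble:
    print(member)
```

Unlike plain random sampling, every stratum of every parameter is hit exactly once, which is what lets small ensembles cover a high-dimensional parameter space evenly.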

  9. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    SciTech Connect

    Hansen, Clifford W.; Martin, Curtis E.

    2015-08-01

We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
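The residual-sampling propagation scheme can be sketched as follows; the two-step model chain, its coefficients, and the residual values are drastically simplified stand-ins for the study's six models and its measured validation data:

```python
import random
import statistics

# Hypothetical empirical residuals for two model steps (in the study these
# come from validation against measured data; here they are invented).
poa_residuals = [-12.0, -5.0, 0.0, 4.0, 9.0]   # W/m^2, POA irradiance model
temp_residuals = [-1.5, -0.5, 0.0, 0.8, 1.4]   # deg C, cell temperature model

def pv_chain(ghi, t_air, rng):
    """Toy two-step stand-in for the paper's six-model chain."""
    poa = 0.9 * ghi + rng.choice(poa_residuals)               # + sampled residual
    t_cell = t_air + 0.03 * poa + rng.choice(temp_residuals)  # + sampled residual
    # simple DC power model with a linear temperature derate
    return 0.2 * poa * (1.0 - 0.004 * (t_cell - 25.0))

rng = random.Random(1)
outputs = [pv_chain(ghi=800.0, t_air=20.0, rng=rng) for _ in range(5000)]
print(f"mean power: {statistics.mean(outputs):.1f} W "
      f"(spread {statistics.stdev(outputs):.1f} W)")
```

Each pass through the chain draws one residual per model, so the spread of `outputs` is an empirical distribution of system output that reflects all model errors jointly.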

  10. Uncertainty and Sensitivity in Surface Dynamics Modeling

    NASA Astrophysics Data System (ADS)

    Kettner, Albert J.; Syvitski, James P. M.

    2016-05-01

This special issue on 'Uncertainty and Sensitivity in Surface Dynamics Modeling' stems from papers submitted after the 2014 annual meeting of the Community Surface Dynamics Modeling System, or CSDMS. CSDMS facilitates a diverse community of experts (now in 68 countries) that collectively investigates the Earth's surface, the dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere, by promoting, developing, supporting and disseminating integrated open source software modules. By organizing more than 1500 researchers, CSDMS has the privilege of identifying community strengths and weaknesses in the practice of software development. We recognize, for example, that progress has been slow on identifying and quantifying uncertainty and sensitivity in numerical modeling of the Earth's surface dynamics. This special issue is meant to raise awareness of these important subjects and highlight state-of-the-art progress.

  11. Flood damage analysis: uncertainties for first floor elevation yielded from LiDAR data

    NASA Astrophysics Data System (ADS)

    Bodoque, Jose Maria; Aroca-Jimenez, Estefania; Guardiola-Albert, Carolina; Eguibar, Miguel Angel

    2016-04-01

High-resolution ground-based light detection and ranging (LiDAR) datasets provide the spatial density and vertical precision needed to obtain highly accurate Digital Elevation Models (DEMs). As a result, the reliability of flood damage analysis has improved significantly, as the accuracy of hydrodynamic models has increased. An important error reduction also takes place in estimating first floor elevation, a critical parameter for determining structural and content damage in buildings. However, just like any discrete measurement technique, LiDAR data contain object space ambiguities, especially in urban areas where the presence of buildings and the floodplain creates a highly complex landscape; these ambiguities are largely corrected by using ancillary information based on breaklines. Here, we provide an uncertainty assessment based on: a) improving the DEMs used in flood damage analysis by adding breaklines as ancillary information; b) geostatistical estimation of errors in the DEMs; c) implementing a 2D hydrodynamic model for the 500-year flood return period; and d) determining first floor elevation uncertainty. The main conclusion of this study is the need to process raw LiDAR data in order to generate efficient, high-quality DEMs that minimize the uncertainty in determining first floor elevation, thereby increasing the reliability of flood damage assessment.

  12. Temperature targets revisited under climate sensitivity uncertainty

    NASA Astrophysics Data System (ADS)

    Neubersch, Delf; Roth, Robert; Held, Hermann

    2015-04-01

While the 2° target has become an official goal of the COP (Conference of the Parties) process, recent work has shown that it requires re-interpretation if climate sensitivity uncertainty is considered in combination with anticipated future learning (Schmidt et al., 2011). A strict probabilistic limit, as suggested by the Copenhagen diagnosis, may lead to conceptual flaws in view of future learning, such as a negative expected value of information or even ill-posed policy recommendations. Instead, Schmidt et al. suggest trading off the probabilistic transgression of a temperature target against mitigation-induced welfare losses, a procedure they call cost-risk analysis (CRA). Here we spell out CRA for the integrated assessment model MIND and derive necessary conditions for the exact nature of that trade-off. With CRA at hand, the expected value of climate information for a given temperature target can, for the first time, be meaningfully assessed. Focusing on a linear risk function, the most conservative of all possible risk functions, we find that 2° target-induced mitigation costs could be reduced by up to one third if the climate response to carbon dioxide emissions were known with certainty, amounting to hundreds of billions of Euros per year (Neubersch et al., 2014). Further benefits of CRA over strictly formulated temperature targets are discussed. References: D. Neubersch, H. Held, A. Otto, Operationalizing climate targets under learning: An application of cost-risk analysis, Climatic Change, 126 (3), 305-318, DOI 10.1007/s10584-014-1223-z (2014). M. G. W. Schmidt, A. Lorenz, H. Held, E. Kriegler, Climate Targets under Uncertainty: Challenges and Remedies, Climatic Change Letters, 104 (3-4), 783-791, DOI 10.1007/s10584-010-9985-4 (2011).
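The expected value of climate information that CRA makes assessable can be illustrated with a toy two-state decision problem (expected value of perfect information); every probability and cost below is invented for illustration:

```python
# Toy EVPI: pay a mitigation cost now, incur damages that depend on an
# uncertain (high/low) climate sensitivity. All numbers are illustrative.
p_high = 0.3  # probability of high climate sensitivity
actions = {
    "strong mitigation": {"cost": 3.0, "damage": {"high": 1.0, "low": 0.5}},
    "weak mitigation":   {"cost": 1.0, "damage": {"high": 6.0, "low": 0.5}},
}

def expected_loss(name):
    a = actions[name]
    return a["cost"] + p_high * a["damage"]["high"] + (1 - p_high) * a["damage"]["low"]

# Decide now, before learning the true sensitivity:
loss_without_learning = min(expected_loss(a) for a in actions)

# Learn the true state first, then pick the best action in each state:
loss_if_high = min(a["cost"] + a["damage"]["high"] for a in actions.values())
loss_if_low = min(a["cost"] + a["damage"]["low"] for a in actions.values())
loss_with_learning = p_high * loss_if_high + (1 - p_high) * loss_if_low

evpi = loss_without_learning - loss_with_learning
print(f"EVPI = {evpi:.2f}")
```

A strict probabilistic limit can make this quantity negative (learning only reveals infeasibility); the trade-off formulation of CRA is what keeps the value of information well defined.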

  13. Sensitivity and Uncertainty Analysis of the keff for VHTR fuel

    NASA Astrophysics Data System (ADS)

    Han, Tae Young; Lee, Hyun Chul; Noh, Jae Man

    2014-06-01

For the uncertainty and sensitivity analysis of PMR200, designed as a VHTR at KAERI, MUSAD was implemented based on the deterministic method in connection with the DeCART/CAPP code system. The sensitivity of the multiplication factor was derived using classical perturbation theory, and the sensitivity coefficients for the individual cross sections were obtained by the adjoint method within the framework of the transport equation. The uncertainty of the multiplication factor was then calculated from the product of the covariance matrix and the sensitivity. To verify the implemented code, uncertainty analyses of the GODIVA benchmark and a PMR200 pin cell problem were carried out and the results were compared with the reference codes, TSUNAMI and McCARD. The results are in good agreement, except for the uncertainty due to the scattering cross section, which was calculated using a different scattering moment.
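The final step above, combining sensitivity coefficients with a covariance matrix to get the response uncertainty, is the familiar "sandwich rule"; the coefficients and covariances below are illustrative numbers, not actual PMR200 or GODIVA data:

```python
# "Sandwich rule": response variance = S^T C S, with S the relative
# sensitivity coefficients of k_eff to each cross section and C their
# relative covariance matrix. Numbers are illustrative only.
def response_uncertainty(S, C):
    n = len(S)
    variance = sum(S[i] * C[i][j] * S[j] for i in range(n) for j in range(n))
    return variance ** 0.5

S = [0.45, -0.12, 0.30]        # relative sensitivities, (dk/k)/(dσ/σ)
C = [[4.0e-4, 1.0e-5, 0.0],    # relative covariance of the nuclear data
     [1.0e-5, 9.0e-4, 0.0],
     [0.0,    0.0,    2.5e-4]]

print(f"relative k_eff uncertainty: {response_uncertainty(S, C):.5f}")
```

The off-diagonal covariance terms are what distinguish this from simply adding the individual uncertainty contributions in quadrature.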

  14. Double barreled Uncertainties: Behavioral Assumptions and Geophysical Sensitivity

    NASA Astrophysics Data System (ADS)

    Schneider, S. H.

    2008-12-01

How many people will there be in the world, what standards of living will they demand, and what technologies will they use to achieve those standards? These questions define the first area of uncertainty: the behavioral assumptions behind emissions scenarios. It will be argued that overshoot scenarios of greenhouse gas concentrations are the most probable behavioral assumption. Estimating climate sensitivity uncertainties from the available literature then constitutes the second area of uncertainty, a co-factor with emissions uncertainties in determining eventual climate changes and impacts.

  15. Sensitivity analysis for handling uncertainty in an economic evaluation.

    PubMed

    Limwattananon, Supon

    2014-05-01

    To meet updated international standards, this paper revises the previous Thai guidelines for conducting sensitivity analyses as part of the decision analysis model for health technology assessment. It recommends both deterministic and probabilistic sensitivity analyses to handle uncertainty of the model parameters, which are best represented graphically. Two new methodological issues are introduced-a threshold analysis of medicines' unit prices for fulfilling the National Lists of Essential Medicines' requirements and the expected value of information for delaying decision-making in contexts where there are high levels of uncertainty. Further research is recommended where parameter uncertainty is significant and where the cost of conducting the research is not prohibitive. PMID:24964700

  16. SCALE-6 Sensitivity/Uncertainty Methods and Covariance Data

    SciTech Connect

    Williams, Mark L; Rearden, Bradley T

    2008-01-01

Computational methods and data used for sensitivity and uncertainty analysis within the SCALE nuclear analysis code system are presented. The methodology used to calculate sensitivity coefficients and similarity coefficients and to perform nuclear data adjustment is discussed. A description is provided of the SCALE-6 covariance library based on ENDF/B-VII and other nuclear data evaluations, supplemented by 'low-fidelity' approximate covariances. SCALE (Standardized Computer Analyses for Licensing Evaluation) is a modular code system developed by Oak Ridge National Laboratory (ORNL) to perform calculations for criticality safety, reactor physics, and radiation shielding applications. SCALE calculations typically use sequences that execute a predefined series of executable modules to compute particle fluxes and responses like the critical multiplication factor. SCALE also includes modules for sensitivity and uncertainty (S/U) analysis of calculated responses. The S/U codes in SCALE are collectively referred to as TSUNAMI (Tools for Sensitivity and UNcertainty Analysis Methodology Implementation). SCALE-6, scheduled for release in 2008, contains significant new capabilities, including important enhancements in S/U methods and data. The main functions of TSUNAMI are to (a) compute nuclear data sensitivity coefficients and response uncertainties, (b) establish similarity between benchmark experiments and design applications, and (c) reduce uncertainty in calculated responses by consolidating integral benchmark experiments. TSUNAMI includes easy-to-use graphical user interfaces for defining problem input and viewing three-dimensional (3D) geometries, as well as an integrated plotting package.

  17. VSHOT measurement uncertainty and sensitivity study

    SciTech Connect

    Jones, S.A.; Gruetzner, J.K.; Houser, R.M.; Edgar, R.M.; Wendelin, T.J.

    1997-08-01

The Video Scanning Hartmann Optical Tester (VSHOT) is a slope-measuring tool for large, imprecise reflectors. It is a laser ray trace device developed to measure the optical quality of point-focus solar concentrating mirrors. A unique tool was needed because of the diverse geometry and very large size of solar concentrators, plus their large optical errors. To study the accuracy of the VSHOT, as well as its sensitivity to changes in test setup variables, a series of experiments was performed with a very precise, astronomical-grade mirror. The slope errors of the reference mirror were much smaller than the resolution of the VSHOT, so that any measured slope errors were caused by the instrument itself rather than the mirror. The VSHOT exceeded its accuracy goals, achieving about ±0.5% (68% confidence) error in the determination of focal length and ±0.1 mrad (68% confidence) error in the determination of RMS slope error. Displacement of the test mirror from the optical axis was the largest source of measured error.

  18. Peer review of HEDR uncertainty and sensitivity analyses plan

    SciTech Connect

    Hoffman, F.O.

    1993-06-01

This report consists of detailed documentation of the writings and deliberations of the peer review panel that met on May 24-25, 1993, in Richland, Washington, to evaluate the draft report "Uncertainty/Sensitivity Analysis Plan" (PNWD-2124 HEDR). The fact that uncertainties are being considered in temporally and spatially varying parameters, through the use of alternative time histories and spatial patterns, deserves special commendation. It is important to identify early those model components and parameters that will have the most influence on the magnitude and uncertainty of the dose estimates. These are the items that should be investigated most intensively prior to committing to a final set of results.

  19. Sensitivity of wildlife habitat models to uncertainties in GIS data

    NASA Technical Reports Server (NTRS)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. The uncertainties and methods described in the paper have general relevance for many GIS applications.

  20. Simulated Climate Sensitivity Uncertainty: Control Climate Bias vs. Perturbed Physics

    NASA Astrophysics Data System (ADS)

    Dommenget, D.

    2013-12-01

In this talk I address the relationship between climate model biases in the control climate and the simulated climate sensitivity. On the basis of a globally resolved energy balance (GREB) model, a number of perturbed-physics ensembles are discussed. It is illustrated that the uncertainties in the simulated climate sensitivity can be conceptually split into two parts: a direct effect of the perturbed physics on the climate sensitivity, independent of the control mean climate, and an indirect effect of the perturbed physics through changing the control mean climate, which in turn changes the climate sensitivity, since the climate sensitivity itself depends on the control climate. It is shown that the two effects oppose each other. Biases in the control climate are negatively correlated with the climate sensitivity (colder climates have larger sensitivities), while perturbed physics are on average positively correlated with the climate sensitivity (perturbed parameters that lead to warmer control climates lead to larger climate sensitivities). In the GREB model, the biases in the control climate are the more important effect for the regional climate sensitivity uncertainties, but for the global mean climate sensitivity the biases in the control climate and the perturbed physics are equally important.

  1. Sensitivity and uncertainty analysis for Abreu & Johnson numerical vapor intrusion model.

    PubMed

    Ma, Jie; Yan, Guangxu; Li, Haiyan; Guo, Shaohui

    2016-03-01

This study conducted one-at-a-time (OAT) sensitivity and uncertainty analysis of a numerical vapor intrusion model for nine input parameters, including soil porosity, soil moisture, soil air permeability, aerobic biodegradation rate, building depressurization, crack width, floor thickness, building volume, and indoor air exchange rate. Simulations were performed for three soil types (clay, silt, and sand), two source depths (3 and 8 m), and two source concentrations (1 and 400 g/m³). Model sensitivity and uncertainty for shallow, high-concentration vapor sources (3 m and 400 g/m³) are much smaller than for deep, low-concentration sources (8 m and 1 g/m³). For high-concentration sources, soil air permeability, indoor air exchange rate, and building depressurization (for highly permeable soil such as sand) are the key contributors to model output uncertainty. For low-concentration sources, soil porosity, soil moisture, aerobic biodegradation rate, and soil gas permeability are the key contributors. Another important finding is that the impact of aerobic biodegradation on the vapor intrusion potential of petroleum hydrocarbons is negligible when the vapor source concentration is high, because insufficient oxygen supply limits aerobic biodegradation activity. PMID:26619051
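The OAT procedure itself is straightforward to sketch: vary each input over its range while holding the others at baseline, and record the swing in the output. The toy attenuation model and the parameter ranges below are hypothetical stand-ins, not the Abreu & Johnson model:

```python
# One-at-a-time (OAT) sensitivity screening with an invented model.
def attenuation(p):
    """Toy indoor-air attenuation factor for a vapor intrusion pathway."""
    return 1e-4 * p["soil_permeability"] * (1.0 - p["soil_moisture"]) / p["air_exchange"]

baseline = {"air_exchange": 0.5, "soil_permeability": 1.0, "soil_moisture": 0.2}
ranges = {
    "air_exchange":      (0.25, 1.0),   # 1/h
    "soil_permeability": (0.1, 10.0),   # relative units
    "soil_moisture":     (0.05, 0.35),  # volume fraction
}

swings = {}
for name, (lo, hi) in ranges.items():
    # Perturb one parameter at a time; all others stay at baseline.
    outputs = [attenuation({**baseline, name: value}) for value in (lo, hi)]
    swings[name] = max(outputs) - min(outputs)

# Rank inputs by their one-at-a-time output swing.
for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name:18s} swing = {swing:.2e}")
```

OAT is cheap (two runs per parameter here) but, unlike variance-based methods, it cannot detect interactions between inputs, which is why the study's conclusions are conditioned on source depth and concentration.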

  3. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    SciTech Connect

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly into a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.

  4. Uncertainty and Sensitivity Analyses of Duct Propagation Models

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2008-01-01

    This paper presents results of uncertainty and sensitivity analyses conducted to assess the relative merits of three duct propagation codes. Results from this study are intended to support identification of a "working envelope" within which to use the various approaches underlying these propagation codes. This investigation considers a segmented liner configuration that models the NASA Langley Grazing Incidence Tube, for which a large set of measured data was available. For the uncertainty analysis, the selected input parameters (source sound pressure level, average Mach number, liner impedance, exit impedance, static pressure, and static temperature) are randomly varied over a range of values. Uncertainty limits (95% confidence levels) are computed for the predicted values from each code and compared with the corresponding 95% confidence intervals in the measured data. Generally, the mean values of the predicted attenuation track the mean values of the measured attenuation quite well, and the predicted confidence intervals tend to be larger in the presence of mean flow. A two-level, six-factor sensitivity study is also conducted in which the six inputs are varied one at a time to assess their effect on the predicted attenuation. As expected, the results demonstrate the liner resistance and reactance to be the most important input parameters. They also indicate that the exit impedance is a significant contributor to uncertainty in the predicted attenuation.

  5. Sensitivity and uncertainty analysis applied to the JHR reactivity prediction

    SciTech Connect

    Leray, O.; Vaglio-Gaudard, C.; Hudelot, J. P.; Santamarina, A.; Noguere, G.; Di-Salvo, J.

    2012-07-01

    The ongoing AMMON program in the EOLE reactor at CEA Cadarache (France) provides experimental results to qualify the HORUS-3D/N neutronics calculation scheme used for the design and safety studies of the new material testing reactor, the Jules Horowitz Reactor (JHR). This paper presents the determination of technological and nuclear data uncertainties on the core reactivity and the propagation of the latter from the AMMON experiment to the JHR. The technological uncertainty propagation was performed with a direct perturbation methodology using the 3D French stochastic code TRIPOLI4 and a statistical methodology using the 2D French deterministic code APOLLO2-MOC, which leads to a value of 289 pcm (1{sigma}). The nuclear data uncertainty propagation relies on a sensitivity study on the main isotopes and the use of a retroactive marginalization method applied to the JEFF 3.1.1 {sup 27}Al evaluation in order to obtain a realistic multi-group covariance matrix associated with the considered evaluation. This nuclear data uncertainty propagation leads to a K{sub eff} uncertainty of 624 pcm for the JHR core and 684 pcm for the AMMON reference configuration core. Finally, transposition and reduction of the prior uncertainty were made using the Representativity method, which demonstrates the similarity of the AMMON experiment with JHR (the representativity factor is 0.95). The final impact of JEFF 3.1.1 nuclear data on the Beginning of Life (BOL) JHR reactivity calculated by HORUS-3D/N V4.0 is a bias of +216 pcm with an associated posterior uncertainty of 304 pcm (1{sigma}). (authors)

  6. Sensitivity and uncertainty analysis of a polyurethane foam decomposition model

    SciTech Connect

    HOBBS,MICHAEL L.; ROBINSON,DAVID G.

    2000-03-14

    Sensitivity/uncertainty analyses are not commonly performed on complex, finite-element engineering models because the analyses are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, an analytical sensitivity/uncertainty analysis is used to determine the standard deviation and the primary factors affecting the burn velocity of polyurethane foam exposed to firelike radiative boundary conditions. The complex, finite element model has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state burn velocity calculated as the derivative of the burn front location versus time. The standard deviation of the burn velocity was determined by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation is essentially determined from a second derivative that is extremely sensitive to numerical noise. To minimize the numerical noise, 50-micron elements and approximately 1-msec time steps were required to obtain stable uncertainty results. The primary effect variable was shown to be the emissivity of the foam.
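
    The derivative-based uncertainty estimate described above amounts to first-order propagation, sigma_y^2 = sum_i (dy/dx_i)^2 sigma_i^2, with the derivatives taken numerically. The toy response function and numbers below are placeholders, though the dominant parameter is chosen to echo the paper's finding that foam emissivity drives the burn-velocity uncertainty.

```python
import math

# First-order (derivative-based) uncertainty propagation sketch.
# Model, nominal values, and standard deviations are hypothetical.
def burn_velocity(params):
    # Toy response: velocity grows with emissivity and incident flux,
    # falls with foam density.
    return 0.1 * params["emissivity"] * params["flux"] / params["density"]

nominal = {"emissivity": 0.9, "flux": 25.0, "density": 350.0}
stdevs  = {"emissivity": 0.05, "flux": 1.0, "density": 10.0}

def propagate(model, nominal, stdevs, rel_step=1e-4):
    """sigma_y^2 = sum_i (dy/dx_i)^2 * sigma_i^2, using central
    differences for the derivatives."""
    var = 0.0
    contributions = {}
    for name, x in nominal.items():
        h = x * rel_step
        hi = dict(nominal, **{name: x + h})
        lo = dict(nominal, **{name: x - h})
        dydx = (model(hi) - model(lo)) / (2 * h)
        contributions[name] = (dydx * stdevs[name]) ** 2
        var += contributions[name]
    return math.sqrt(var), contributions

sigma, contrib = propagate(burn_velocity, nominal, stdevs)
```

    The `contributions` dictionary ranks the input parameters by their share of the output variance, which is how a "primary effect variable" such as emissivity is identified.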

  7. Sensitivity of quantitative photoacoustic tomography inversion schemes to experimental uncertainty

    NASA Astrophysics Data System (ADS)

    Fonseca, Martina; Saratoon, Teedah; Zeqiri, Bajram; Beard, Paul; Cox, Ben

    2016-03-01

    The ability to accurately quantify chromophore concentration from photoacoustic images would have a major impact on pre-clinical and clinical imaging. Recent years have seen significant advances in the theoretical understanding of quantitative photoacoustic imaging and in the development of model-based inversion strategies that overcome issues such as non-uniqueness and non-linearity. Nevertheless, their full in vivo implementation has not successfully been achieved, partially because experimental uncertainties complicate the transition. In this study, a sensitivity analysis is performed to assess the impact on accuracy of uncertainty in critical experimental parameters such as scattering, beam diameter, beam position and calibration factor. This study was performed using two virtual phantoms, at one illumination and four optical wavelengths. The model-based inversion was applied in three variants: one inverting only for chromophores, and two others additionally inverting for either a scaling factor or the scatterer concentration. The performance of these model-based inversions is also compared to linear unmixing strategies, with and without fluence correction. The results show that experimental uncertainties in a priori fixed parameters, especially the calibration factor and scatterer concentration, significantly affect the accuracy of model-based inversions, and therefore measures to ameliorate this uncertainty should be considered. Including a scaling parameter in the inversion appears to improve quantification estimates. Furthermore, even with realistic levels of experimental uncertainty in model-based input parameters, model-based inversions outperform linear unmixing approaches. If parameter uncertainty is large and has a significant impact on accuracy, the parameter can be included as an unknown in model-based schemes.

  8. SENSIT: a cross-section and design sensitivity and uncertainty analysis code [In FORTRAN for CDC-7600, IBM 360]

    SciTech Connect

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.

  9. Employing Sensitivity Derivatives for Robust Optimization under Uncertainty in CFD

    NASA Technical Reports Server (NTRS)

    Newman, Perry A.; Putko, Michele M.; Taylor, Arthur C., III

    2004-01-01

    A robust optimization is demonstrated on a two-dimensional inviscid airfoil problem in subsonic flow. Given uncertainties in statistically independent, random, normally distributed flow parameters (input variables), an approximate first-order statistical moment method is employed to represent the Computational Fluid Dynamics (CFD) code outputs as expected values with variances. These output quantities are used to form the objective function and constraints. The constraints are cast in probabilistic terms; that is, the probability that a constraint is satisfied is greater than or equal to some desired target probability. Gradient-based robust optimization of this stochastic problem is accomplished through use of both first and second-order sensitivity derivatives. For each robust optimization, the effect of increasing both input standard deviations and target probability of constraint satisfaction are demonstrated. This method provides a means for incorporating uncertainty when considering small deviations from input mean values.
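
    Probabilistic constraints of the kind described above are commonly recast in deterministic form: requiring P(g(x) <= 0) >= p becomes mean_g + k*sigma_g <= 0 with k = Phi^-1(p), under a first-order normal assumption on g. The constraint function and numbers below are hypothetical illustrations, not the airfoil problem.

```python
from statistics import NormalDist

# Deterministic surrogate for a probabilistic constraint, as used in
# first-order robust optimization.
p_target = 0.95
k = NormalDist().inv_cdf(p_target)  # ~1.645 for 95% target probability

def constraint_mean_sigma(x):
    # Hypothetical constraint g(x) with a first-order mean and standard
    # deviation induced by input uncertainty.
    mean_g = x - 2.0
    sigma_g = 0.3
    return mean_g, sigma_g

def robust_feasible(x):
    """Accept x only if g(x) <= 0 holds with probability >= p_target."""
    mean_g, sigma_g = constraint_mean_sigma(x)
    return mean_g + k * sigma_g <= 0.0
```

    Raising either the input standard deviation (through `sigma_g`) or `p_target` shrinks the feasible region, which is the trade-off the robust optimization explores.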

  10. Sensitivity and Uncertainty Study for Thermal Molten Salt Reactors

    NASA Astrophysics Data System (ADS)

    Bidaud, Adrien; Ivanova, Tatiana; Mastrangelo, Victor; Kodeli, Ivo

    2006-04-01

    The Thermal Molten Salt Reactor (TMSR) using the thorium cycle can achieve the GEN IV objectives of economy, safety, non-proliferation and durability. Its low production of higher actinides and its breeding capability - even with a thermal spectrum - are very valuable characteristics for an innovative reactor. Furthermore, the thorium cycle is more flexible than the uranium cycle since only a small fissile inventory (<2 tons per GWe) is required to start one reactor. The potential of these reactors is currently being extensively studied at the CNRS and EdF /1,2/. A simplified chemical reprocessing is envisaged compared to that used for the former Molten Salt Breeder Reactor (MSBR). The MSBR concept was developed at Oak Ridge National Laboratory (ORNL) in the 1970s based on the Molten Salt Reactor Experiment (MSRE). The main goal of our current studies is a reactor concept that enables breeding and improved safety, with chemical reprocessing needs reduced and simplified as much as reasonably possible. The neutronic properties of the new TMSR concept are presented in this paper. As the temperature coefficient is close to zero, we will see that the moderation ratio cannot be chosen to simultaneously achieve a high breeding ratio, long graphite lifetime and low uranium inventory. It is clear that any safety margin taken due to uncertainty in the nuclear data will significantly reduce the capability of this concept; a sensitivity analysis is therefore vital to propose measurements that would reduce the presently high uncertainties in the design parameters of this reactor. Two methodologies, one based on OECD/NEA deterministic codes and one on the IPPE (Obninsk) stochastic code, are compared for keff sensitivity analysis. The uncertainty analysis of keff using covariance matrices available in evaluated files has been performed. Furthermore, a comparison of temperature coefficient sensitivity profiles is presented for the most important reactions.

  11. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  12. Climate sensitivity uncertainty: when is good news bad?

    PubMed

    Freeman, Mark C; Wagner, Gernot; Zeckhauser, Richard J

    2015-11-28

    Climate change is real and dangerous. Exactly how bad it will get, however, is uncertain. Uncertainty is particularly relevant for estimates of one of the key parameters: equilibrium climate sensitivity--how eventual temperatures will react as atmospheric carbon dioxide concentrations double. Despite significant advances in climate science and increased confidence in the accuracy of the range itself, the 'likely' range has been 1.5-4.5°C for over three decades. In 2007, the Intergovernmental Panel on Climate Change (IPCC) narrowed it to 2-4.5°C, only to reverse its decision in 2013, reinstating the prior range. In addition, the 2013 IPCC report removed prior mention of 3°C as the 'best estimate'. We interpret the implications of the 2013 IPCC decision to lower the bottom of the range and excise a best estimate. Intuitively, it might seem that a lower bottom would be good news. Here we ask: when might apparently good news about climate sensitivity in fact be bad news in the sense that it lowers societal well-being? The lowered bottom value also implies higher uncertainty about the temperature increase, definitely bad news. Under reasonable assumptions, both the lowering of the lower bound and the removal of the 'best estimate' may well be bad news. PMID:26460117

  14. Sensitivity to Uncertainty in Asteroid Impact Risk Assessment

    NASA Astrophysics Data System (ADS)

    Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.

    2015-12-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.

  15. Uncertainty estimates in broadband seismometer sensitivities using microseisms

    USGS Publications Warehouse

    Ringler, Adam T.; Storm, Tyler L.; Gee, Lind S.; Hutt, Charles R.; Wilson, David C.

    2015-01-01

    The midband sensitivity of a seismic instrument is one of the fundamental parameters used in published station metadata. Any errors in this value can compromise amplitude estimates in otherwise high-quality data. To estimate an upper bound in the uncertainty of the midband sensitivity for modern broadband instruments, we compare daily microseism (4- to 8-s period) amplitude ratios between the vertical components of colocated broadband sensors across the IRIS/USGS (network code IU) seismic network. We find that the mean of the 145,972 daily ratios used between 2002 and 2013 is 0.9895 with a standard deviation of 0.0231. This suggests that the ratio between instruments shows a small bias and considerable scatter. We also find that these ratios follow a standard normal distribution (R² = 0.95442), which suggests that the midband sensitivity of an instrument has an error of no greater than ±6% with a 99% confidence interval. This gives an upper bound on the precision to which we know the sensitivity of a fielded instrument.
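
    The ratio-based bound can be reproduced with elementary statistics. The ratios below are synthetic draws from a normal distribution with the paper's reported mean and standard deviation, so the roughly ±6% figure re-emerges under the normality assumption; the real analysis of course used the measured IU-network ratios.

```python
import math
import random

# Bounding midband-sensitivity error from colocated-sensor amplitude
# ratios (synthetic data with the reported mean and std deviation).
random.seed(1)
ratios = [random.gauss(0.9895, 0.0231) for _ in range(5000)]

n = len(ratios)
mean = sum(ratios) / n
std = math.sqrt(sum((r - mean) ** 2 for r in ratios) / (n - 1))

# Two-sided 99% bound for a single instrument's ratio, assuming normality.
z99 = 2.576
bound = z99 * std  # ~0.06, i.e. about +/-6%
```
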

  16. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of a model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  17. Uncertainty in the analysis of the overall equipment effectiveness on the shop floor

    NASA Astrophysics Data System (ADS)

    Rößler, M. P.; Abele, E.

    2013-06-01

    This article presents an approach that supports transparency regarding the effectiveness of manufacturing equipment by combining fuzzy set theory with the method of overall equipment effectiveness analysis. One of the key principles of lean production, and a fundamental task in production optimization projects, is the prior analysis of the current state of a production system using key performance indicators in order to derive possible future states. The current state of the art in overall equipment effectiveness analysis is usually performed by cumulating different machine states by means of decentralized data collection, without consideration of uncertainty. In manual data collection or semi-automated plant data collection systems, the quality of the derived data often diverges and leads optimization teams to distorted conclusions about the real optimization potential of manufacturing equipment. The method discussed in this paper helps practitioners obtain more reliable results in the analysis phase, and thus better results from optimization projects. The results obtained are discussed in the context of a case study.

  18. Sensitivity and Uncertainty in Detonation Shock Dynamics Parameterization

    NASA Astrophysics Data System (ADS)

    Chiquete, Carlos; Short, Mark; Jackson, Scott

    2013-06-01

    Detonation shock dynamics (DSD) is the timing component of an advanced programmed burn model of detonation propagation in high explosives (HE). In DSD theory, the detonation-driving zone is replaced with a propagating surface in which the surface normal velocity is a function of the local surface curvature, the so-called Dn - κ relation for the HE. This relation is calibrated by assuming a functional form relating Dn and κ, and then fitting the function parameters via minimization of a weighted error function of residuals based on shock-shape curves and a diameter effect curve. In general, for a given HE, the greater the available shock-shape data at different rate-stick radii, the less the uncertainty in the DSD fit. For a wide range of HEs, however, no shock shape data is available, and DSD calibrations must be based on diameter effect data alone. With this limited data, potentially large variations in the DSD parameters can occur that fit the diameter effect curve to within a given residual error. We explore uncertainty issues in DSD parameterization when limited calibration data is available and the implications of the resulting sensitivities in timing, highlighting differences between ideal, insensitive and non-ideal HEs such as Cyclotol, IMX-104 and ANFO.

  19. Fission spectrum covariance matrix and sensitivity coefficients for response parameter uncertainty estimation.

    SciTech Connect

    Yang, W. S.; Aliberti, G.; McKnight, R. D.; Kodeli, I.; Nuclear Engineering Division; IAEA Rep at OECD /NEA Data Bank

    2008-12-01

    This paper discusses the consistent usage of fission spectrum covariance matrices and sensitivity coefficients for response parameter uncertainty estimation. The effects of covariance matrix normalization on response parameter uncertainties are described from a mathematical point of view, along with their inter-relation with the constrained sensitivity coefficients. The numerical precision for practical renormalization of covariance matrices and the impact of the constrained sensitivity coefficients are also discussed by estimating the multiplication factor uncertainties due to fission spectrum uncertainties for a sodium-cooled fast burner core concept.

  20. Sensitivity of collective action to uncertainty about climate tipping points

    NASA Astrophysics Data System (ADS)

    Barrett, Scott; Dannenberg, Astrid

    2014-01-01

    Despite more than two decades of diplomatic effort, concentrations of greenhouse gases continue to trend upwards, creating the risk that we may someday cross a threshold for `dangerous' climate change. Although climate thresholds are very uncertain, new research is trying to devise `early warning signals' of an approaching tipping point. This research offers a tantalizing promise: whereas collective action fails when threshold uncertainty is large, reductions in this uncertainty may bring about the behavioural change needed to avert a climate `catastrophe'. Here we present the results of an experiment, rooted in a game-theoretic model, showing that behaviour differs markedly on either side of a dividing line for threshold uncertainty. On one side of the dividing line, where threshold uncertainty is relatively large, free riding proves irresistible and trust elusive, making it virtually inevitable that the tipping point will be crossed. On the other side, where threshold uncertainty is small, the incentive to coordinate is strong and trust more robust, often leading the players to avoid crossing the tipping point. Our results show that uncertainty must be reduced to this `good' side of the dividing line to stimulate the behavioural shift needed to avoid `dangerous' climate change.

  1. Predicting Residential Exposure to Phthalate Plasticizer Emitted from Vinyl Flooring: Sensitivity, Uncertainty, and Implications for Biomonitoring

    EPA Science Inventory

    Given the ubiquitous nature of phthalates in the environment and the potential for adverse human health impacts, there is a need to understand the potential human exposure. A three-compartment model is developed to estimate the emission rate of di-2-ethylhexyl phthalate (DEHP) f...

  2. Sensitivity and Uncertainty Analysis to Burnup Estimates on ADS using the ACAB Code

    NASA Astrophysics Data System (ADS)

    Cabellos, O.; Sanz, J.; Rodríguez, A.; González, E.; Embid, M.; Alvarez, F.; Reyes, S.

    2005-05-01

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, the burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important regarding both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinides burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable to assess the need of any experimental or systematic re-evaluation of some uncertainty XSs for ADS.

  3. Sensitivity and Uncertainty Analysis to Burn-up Estimates on ADS Using ACAB Code

    SciTech Connect

    Cabellos, O; Sanz, J; Rodriguez, A; Gonzalez, E; Embid, M; Alvarez, F; Reyes, S

    2005-02-11

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, the burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important regarding both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinides burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable to assess the need of any experimental or systematic reevaluation of some uncertainty XSs for ADS.

  4. Sensitivity and Uncertainty Analysis to Burnup Estimates on ADS using the ACAB Code

    SciTech Connect

    Cabellos, O.; Sanz, J.; Rodriguez, A.; Gonzalez, E.; Embid, M.; Alvarez, F.; Reyes, S.

    2005-05-24

    Within the scope of the Accelerator Driven System (ADS) concept for nuclear waste management applications, the burnup uncertainty estimates due to uncertainty in the activation cross sections (XSs) are important regarding both the safety and the efficiency of the waste burning process. We have applied both sensitivity analysis and Monte Carlo methodology to actinides burnup calculations in a lead-bismuth cooled subcritical ADS. The sensitivity analysis is used to identify the reaction XSs and the dominant chains that contribute most significantly to the uncertainty. The Monte Carlo methodology gives the burnup uncertainty estimates due to the synergetic/global effect of the complete set of XS uncertainties. These uncertainty estimates are valuable to assess the need of any experimental or systematic re-evaluation of some uncertainty XSs for ADS.

  5. Uncertainty and Sensitivity Assessments of GPS and GIS Integrated Applications for Transportation

    PubMed Central

    Hong, Sungchul; Vonderohe, Alan P.

    2014-01-01

    Uncertainty and sensitivity analysis methods are introduced, concerning the quality of spatial data as well as that of output information from Global Positioning System (GPS) and Geographic Information System (GIS) integrated applications for transportation. In the methods, an error model and an error propagation method form a basis for formulating characterization and propagation of uncertainties. They are developed in two distinct approaches: analytical and simulation. Thus, an initial evaluation is performed to compare and examine uncertainty estimations from the analytical and simulation approaches. The evaluation results show that estimated ranges of output information from the analytical and simulation approaches are compatible, but the simulation approach rather than the analytical approach is preferred for uncertainty and sensitivity analyses, due to its flexibility and capability to realize positional errors in both input data. Therefore, in a case study, uncertainty and sensitivity analyses based upon the simulation approach are conducted on a winter maintenance application. The sensitivity analysis is used to determine optimum input data qualities, and the uncertainty analysis is then applied to estimate overall qualities of output information from the application. The analysis results show that output information from the non-distance-based computation model is not sensitive to positional uncertainties in input data. However, for the distance-based computational model, output information has a different magnitude of uncertainties, depending on position uncertainties in input data. PMID:24518894
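
    A minimal version of the simulation approach for a distance-based computation: perturb both endpoints with Gaussian positional error and examine the induced spread in the computed distance. The coordinates and the 5 m error level below are invented for illustration, not taken from the study.

```python
import math
import random

# Monte Carlo propagation of positional error into a distance computation.
random.seed(3)
SIGMA = 5.0  # meters, hypothetical 1-sigma positional error per coordinate

def noisy(point):
    """Return the point perturbed by Gaussian positional error."""
    x, y = point
    return (x + random.gauss(0, SIGMA), y + random.gauss(0, SIGMA))

a, b = (0.0, 0.0), (1000.0, 0.0)
true_dist = math.dist(a, b)

draws = [math.dist(noisy(a), noisy(b)) for _ in range(20000)]
mean = sum(draws) / len(draws)
std = math.sqrt(sum((d - mean) ** 2 for d in draws) / (len(draws) - 1))
```

    For two independently perturbed endpoints the distance spread is about sqrt(2) times the per-point error, which is the kind of magnitude the distance-based model's output uncertainty reflects.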

  7. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis, in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem that deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, in which epistemic variables for refinement are not identified all at once. Instead, only one variable is identified first, and then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the methodology is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
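
Variance-based global sensitivity indices of the kind used above to rank epistemic variables can be estimated with the standard pick-freeze scheme; the three-input linear model and its coefficients below are hypothetical stand-ins, not the GTM subsystem model.

```python
import random

def model(x1, x2, x3):
    # hypothetical response: x1 dominates, x2 is moderate, x3 nearly inert
    return 4.0 * x1 + 1.0 * x2 + 0.1 * x3

def sobol_first_order(n=20000, seed=7):
    """First-order Sobol indices by the pick-freeze (Sobol/Saltelli) scheme:
    S_i = Cov(Y_A, Y_AB_i) / Var(Y), where Y_AB_i keeps column i from A."""
    random.seed(seed)
    A = [[random.random() for _ in range(3)] for _ in range(n)]
    B = [[random.random() for _ in range(3)] for _ in range(n)]
    yA = [model(*row) for row in A]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(3):
        # keep column i from A, take the other columns from B
        yABi = [model(*[a[j] if j == i else b[j] for j in range(3)])
                for a, b in zip(A, B)]
        cov = sum(ya * yb for ya, yb in zip(yA, yABi)) / n - mean * (sum(yABi) / n)
        indices.append(cov / var)
    return indices

S = sobol_first_order()
```

For this linear model with independent uniform inputs the exact indices are proportional to the squared coefficients, so the estimate should rank x1 far above x2 and x3.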

  8. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    SciTech Connect

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M; Busch, Robert D.; Emerson, Scott

    2009-01-01

    This primer presents examples in the application of the SCALE/TSUNAMI tools to generate k{sub eff} sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D and to examine uncertainties in the computed k{sub eff} values due to uncertainties in the cross-section data used in their calculation. The proper use of unit cell data and need for confirming the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
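
The propagation of cross-section covariance data through sensitivity coefficients, as performed by tools of this kind, rests on the first-order "sandwich rule", var(k) = s^T C s. The sketch below applies it to an invented two-group example; the sensitivity and covariance numbers are illustrative only, not TSUNAMI output.

```python
import math

def response_uncertainty(sens, cov):
    """Sandwich rule: var = s^T C s, with s the relative sensitivity vector
    and C the relative covariance matrix of the nuclear data."""
    var = 0.0
    for i, si in enumerate(sens):
        for j, sj in enumerate(sens):
            var += si * cov[i][j] * sj
    return math.sqrt(var)

# invented 2-group data: k-eff sensitivities (dk/k per dsigma/sigma) and a
# covariance matrix built from 3% / 5% std devs with 0.4 correlation
s = [0.35, -0.12]
C = [[0.03 ** 2, 0.4 * 0.03 * 0.05],
     [0.4 * 0.03 * 0.05, 0.05 ** 2]]
rel_sd = response_uncertainty(s, C)  # relative std dev of k-eff
```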

  9. Users manual for the FORSS sensitivity and uncertainty analysis code system

    SciTech Connect

    Lucius, J.L.; Weisbin, C.R.; Marable, J.H.; Drischler, J.D.; Wright, R.Q.; White, J.E.

    1981-01-01

    FORSS is a code system used to study relationships between nuclear reaction cross sections, integral experiments, reactor performance parameter predictions and associated uncertainties. This report describes the computing environment and the modules currently used to implement FORSS Sensitivity and Uncertainty Methodology.

  10. The Relationship between Intolerance of Uncertainty, Sensory Sensitivities, and Anxiety in Autistic and Typically Developing Children

    ERIC Educational Resources Information Center

    Neil, Louise; Olsson, Nora Choque; Pellicano, Elizabeth

    2016-01-01

    Guided by a recent theory that proposes fundamental differences in how autistic individuals deal with uncertainty, we investigated the extent to which the cognitive construct "intolerance of uncertainty" and anxiety were related to parental reports of sensory sensitivities in 64 autistic and 85 typically developing children aged…

  11. Sensitivity and uncertainty analysis of reactivities for UO2 and MOX fueled PWR cells

    NASA Astrophysics Data System (ADS)

    Foad, Basma; Takeda, Toshikazu

    2015-12-01

    The purpose of this paper is to apply our improved method for calculating sensitivities and uncertainties of reactivity responses for UO2 and MOX fueled pressurized water reactor cells. The improved method has been used to calculate sensitivity coefficients relative to infinite dilution cross-sections, where the self-shielding effect is taken into account. Two types of reactivities are considered: Doppler reactivity and coolant void reactivity; for each type, the sensitivities are calculated for small and large perturbations. The results demonstrate that the reactivity responses have larger relative uncertainty than eigenvalue responses. In addition, the uncertainty of coolant void reactivity is much greater than that of Doppler reactivity, especially for large perturbations. The sensitivity coefficients and uncertainties of both reactivities were verified by comparison with SCALE code results using the ENDF/B-VII library, and good agreement was found.

  12. Sensitivity and uncertainty analysis of reactivities for UO2 and MOX fueled PWR cells

    SciTech Connect

    Foad, Basma; Takeda, Toshikazu

    2015-12-31

    The purpose of this paper is to apply our improved method for calculating sensitivities and uncertainties of reactivity responses for UO{sub 2} and MOX fueled pressurized water reactor cells. The improved method has been used to calculate sensitivity coefficients relative to infinite dilution cross-sections, where the self-shielding effect is taken into account. Two types of reactivities are considered: Doppler reactivity and coolant void reactivity; for each type, the sensitivities are calculated for small and large perturbations. The results demonstrate that the reactivity responses have larger relative uncertainty than eigenvalue responses. In addition, the uncertainty of coolant void reactivity is much greater than that of Doppler reactivity, especially for large perturbations. The sensitivity coefficients and uncertainties of both reactivities were verified by comparison with SCALE code results using the ENDF/B-VII library, and good agreement was found.

  13. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    SciTech Connect

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  14. Soil moisture sensitivity of autotrophic and heterotrophic forest floor respiration in boreal xeric pine and mesic spruce forests

    NASA Astrophysics Data System (ADS)

    Ťupek, Boris; Launiainen, Samuli; Peltoniemi, Mikko; Heikkinen, Jukka; Lehtonen, Aleksi

    2016-04-01

    In most process-based soil carbon models, litter decomposition rates are modified by environmental conditions and linked to heterotrophic soil CO2 emissions, and they serve to estimate soil carbon sequestration. By the mass balance equation, variation in measured litter inputs and measured heterotrophic soil CO2 effluxes should therefore indicate the soil carbon stock changes needed by soil carbon management for mitigation of anthropogenic CO2 emissions, provided that the sensitivity functions of the applied model suit the environmental conditions, e.g. soil temperature and moisture. We evaluated the response forms of autotrophic and heterotrophic forest floor respiration to soil temperature and moisture at four boreal forest sites of the International Co-operative Programme on Assessment and Monitoring of Air Pollution Effects on Forests (ICP Forests) by a soil trenching experiment during 2015 in southern Finland. As expected, both autotrophic and heterotrophic forest floor respiration components were primarily controlled by soil temperature, and exponential regression models generally explained more than 90% of the variance. Soil moisture regression models on average explained less than 10% of the variance, and the response forms varied: Gaussian for the autotrophic forest floor respiration component and linear for the heterotrophic component. Although soil moisture explained only a small share of the variance in heterotrophic respiration, the observed reduction of CO2 emissions at higher moisture levels suggests that the moisture response of soil carbon models that do not account for reduction under excessive moisture should be re-evaluated in order to estimate the correct levels of soil carbon stock change. Our further study will include evaluation of process-based soil carbon models against annual heterotrophic respiration and soil carbon stocks.
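
The exponential temperature response described above can be fitted by ordinary least squares after a log-transformation. The flux values below are synthetic data constructed with Q10 = 2, not measurements from the trenching experiment, so the fit recovers the parameters exactly.

```python
import math

def fit_exponential(temps, resp):
    """Least-squares fit of R = a * exp(b*T) via log-linearisation."""
    ys = [math.log(r) for r in resp]
    n = len(temps)
    mt = sum(temps) / n
    my = sum(ys) / n
    b = (sum((t - mt) * (y - my) for t, y in zip(temps, ys))
         / sum((t - mt) ** 2 for t in temps))
    a = math.exp(my - b * mt)
    return a, b

# synthetic respiration data built with Q10 = 2, i.e. b = ln(2)/10
temps = [5.0, 10.0, 15.0, 20.0]
resp = [1.0 * 2.0 ** ((t - 10.0) / 10.0) for t in temps]
a, b = fit_exponential(temps, resp)
q10 = math.exp(10.0 * b)  # recovered temperature sensitivity
```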

  15. Calculating Sensitivities, Response and Uncertainties Within LODI for Precipitation Scavenging

    SciTech Connect

    Loosmore, G; Hsieh, H; Grant, K

    2004-01-21

    This paper describes an investigation into the uses of first-order, local sensitivity analysis in a Lagrangian dispersion code. The goal of the project is not only to gain knowledge about the sensitivity of the dispersion code predictions to the specific input parameters of interest, but also to better understand the uses and limitations of sensitivity analysis in such a context. The dispersion code of interest here is LODI, which is used for modeling emergency release scenarios at the Department of Energy's National Atmospheric Release Advisory Center (NARAC) at Lawrence Livermore National Laboratory. The NARAC system provides both real-time operational predictions and detailed assessments for atmospheric releases of hazardous materials. LODI is driven by a meteorological data assimilation model and an in-house version of COAMPS, the Naval Research Laboratory's mesoscale weather forecast model.
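
First-order local sensitivity coefficients of the kind investigated here are commonly approximated by central finite differences. The sketch below does this for a crude Gaussian-plume-like concentration scaling, which is a hypothetical stand-in for a dispersion-code output, not LODI itself.

```python
def local_sensitivities(f, x0, rel_step=1e-6):
    """First-order local sensitivities df/dx_i at x0 by central differences."""
    sens = []
    for i, xi in enumerate(x0):
        h = rel_step * max(abs(xi), 1.0)
        xp = list(x0); xp[i] = xi + h
        xm = list(x0); xm[i] = xi - h
        sens.append((f(xp) - f(xm)) / (2.0 * h))
    return sens

def conc(p):
    # hypothetical ground-level concentration scaling, C ~ Q/(u*sy*sz):
    # source strength over wind speed times the two plume spreads
    q, u, sy, sz = p
    return q / (u * sy * sz)

# sensitivities at a nominal operating point [Q, u, sigma_y, sigma_z]
s = local_sensitivities(conc, [1.0, 5.0, 50.0, 20.0])
```

The analytic derivatives of this toy function are available in closed form, which makes it easy to verify the finite-difference approximation before applying the same machinery to an expensive model.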

  16. The Role That Clouds Play in Uncertainty in the Climate Sensitivity

    NASA Astrophysics Data System (ADS)

    Dessler, A. E.

    2014-12-01

    Much of the uncertainty in evaluations of the climate sensitivity comes from the uncertainty in the cloud feedback. This comes from the unique property that clouds affect both the solar and infrared energy budgets of the planet, and these effects tend to offset. As a result, the net cloud effect is a small difference between large, offsetting terms. In addition, these estimates tend to be derived from short-term climate variations (e.g., ENSO). I will examine various estimates of the cloud feedback and investigate what they can tell us about the equilibrium climate sensitivity and its uncertainty.

  17. Uncertainty and Sensitivity Analysis in Performance Assessment for the Waste Isolation Pilot Plant

    SciTech Connect

    Helton, J.C.

    1998-12-17

    The Waste Isolation Pilot Plant (WIPP) is under development by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. This development has been supported by a sequence of performance assessments (PAs) carried out by Sandia National Laboratories (SNL) to assess what is known about the WIPP and to provide guidance for future DOE research and development activities. Uncertainty and sensitivity analysis procedures based on Latin hypercube sampling and regression techniques play an important role in these PAs by providing an assessment of the uncertainty in important analysis outcomes and identifying the sources of this uncertainty. Performance assessments for the WIPP are conceptually and computationally interesting due to regulatory requirements to assess and display the effects of both stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, where stochastic uncertainty arises from the possible disruptions that could occur over the 10,000-yr regulatory period associated with the WIPP and subjective uncertainty arises from an inability to unambiguously characterize the many models and associated parameters required in a PA for the WIPP. The interplay between uncertainty analysis, sensitivity analysis, stochastic uncertainty and subjective uncertainty is discussed and illustrated in the context of a recent PA carried out by SNL to support an application by the DOE to the U.S. Environmental Protection Agency for the certification of the WIPP for the disposal of TRU waste.
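
Latin hypercube sampling, the backbone of the uncertainty propagation described above, stratifies each input into n equal-probability intervals and draws exactly one value per interval, then pairs the strata randomly across dimensions. A minimal sketch on the unit cube:

```python
import random

def latin_hypercube(n, dims, seed=42):
    """n LHS points in [0,1)^dims: exactly one point falls in each of the
    n equal-width strata along every dimension."""
    random.seed(seed)
    sample = [[0.0] * dims for _ in range(n)]
    for d in range(dims):
        cells = [(i + random.random()) / n for i in range(n)]  # one per stratum
        random.shuffle(cells)  # decouple the stratum pairing across dimensions
        for i in range(n):
            sample[i][d] = cells[i]
    return sample

pts = latin_hypercube(10, 3)
```

Mapping each coordinate through an inverse cumulative distribution function turns these uniform strata into stratified samples of arbitrary input distributions.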

  18. Approach for Input Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Taylor, Arthur C., III; Newman, Perry A.; Green, Lawrence L.

    2002-01-01

    An implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi-3-D Euler CFD code is presented. Given uncertainties in statistically independent, random, normally distributed input variables, first- and second-order statistical moment procedures are performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. To assess the validity of the approximations, these moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
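
The first-order moment method can be contrasted with Monte Carlo on a toy output function, mirroring the validation strategy described above; the function and the input moments below are invented, not the Euler CFD code.

```python
import math
import random

def f(x1, x2):
    # invented scalar output standing in for a CFD response
    return x1 ** 2 + 3.0 * x2

mu = (2.0, 1.0)   # input means
sd = (0.1, 0.2)   # input standard deviations (independent normals)

# first-order moment propagation about the mean point
d1 = 2.0 * mu[0]  # analytic df/dx1 at the mean
d2 = 3.0          # analytic df/dx2
mean_fo = f(*mu)
sd_fo = math.sqrt((d1 * sd[0]) ** 2 + (d2 * sd[1]) ** 2)

# Monte Carlo reference, as used to assess the approximation
random.seed(3)
ys = [f(random.gauss(mu[0], sd[0]), random.gauss(mu[1], sd[1]))
      for _ in range(50000)]
mean_mc = sum(ys) / len(ys)
sd_mc = math.sqrt(sum((y - mean_mc) ** 2 for y in ys) / (len(ys) - 1))
```

For small input variances the first-order moments track the Monte Carlo results closely; the residual mean offset comes from the curvature term that a second-order moment procedure would capture.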

  19. A Comparative Review of Sensitivity and Uncertainty Analysis of Large-Scale Systems - II: Statistical Methods

    SciTech Connect

    Cacuci, Dan G.; Ionescu-Bujor, Mihaela

    2004-07-15

    Part II of this review paper highlights the salient features of the most popular statistical methods currently used for local and global sensitivity and uncertainty analysis of both large-scale computational models and indirect experimental measurements. These statistical procedures represent sampling-based methods (random sampling, stratified importance sampling, and Latin hypercube sampling), first- and second-order reliability algorithms (FORM and SORM, respectively), variance-based methods (correlation ratio-based methods, the Fourier Amplitude Sensitivity Test, and the Sobol Method), and screening design methods (classical one-at-a-time experiments, global one-at-a-time design methods, systematic fractional replicate designs, and sequential bifurcation designs). It is emphasized that all statistical uncertainty and sensitivity analysis procedures first commence with the 'uncertainty analysis' stage and only subsequently proceed to the 'sensitivity analysis' stage; this path is the exact reverse of the conceptual path underlying the methods of deterministic sensitivity and uncertainty analysis, where the sensitivities are determined prior to using them for uncertainty analysis. By comparison to deterministic methods, statistical methods for uncertainty and sensitivity analysis are relatively easier to develop and use but cannot yield exact values of the local sensitivities. Furthermore, current statistical methods have two major inherent drawbacks, as follows: 1. Since many thousands of simulations are needed to obtain reliable results, statistical methods are at best expensive (for small systems) or, at worst, impracticable (e.g., for large time-dependent systems). 2. Since the response sensitivities and parameter uncertainties are inherently and inseparably amalgamated in the results produced by these methods, improvements in parameter uncertainties cannot be directly propagated to improve response uncertainties; rather, the entire set of simulations and
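
Among the screening designs surveyed above, the Morris one-at-a-time method estimates a mean absolute elementary effect (mu*) per input at a cost of roughly one model run per input per trajectory. A sketch on a hypothetical three-input model:

```python
import random

def morris_mu_star(f, k, n_traj=50, levels=4, seed=9):
    """Morris one-at-a-time screening: mean absolute elementary effect
    (mu*) per input, a cheap global importance measure."""
    random.seed(seed)
    delta = levels / (2.0 * (levels - 1))  # standard Morris step size
    effects = [[] for _ in range(k)]
    for _ in range(n_traj):
        # random start on the grid {0, 1/(levels-1), ...}, strictly below 1
        x = [random.randrange(levels - 1) / (levels - 1) for _ in range(k)]
        order = list(range(k))
        random.shuffle(order)  # randomise the order of one-at-a-time moves
        y = f(x)
        for i in order:
            x2 = list(x)
            x2[i] = x[i] + delta if x[i] + delta <= 1.0 else x[i] - delta
            y2 = f(x2)
            effects[i].append(abs(y2 - y) / delta)
            x, y = x2, y2
    return [sum(e) / len(e) for e in effects]

def g(x):
    # hypothetical model: x[0] strong, x[1] moderate, x[2] negligible
    return 10.0 * x[0] + 2.0 * x[1] ** 2 + 0.01 * x[2]

mu_star = morris_mu_star(g, 3)
```

For the linear terms mu* reproduces the coefficient exactly, so the screening cleanly separates the influential inputs from the negligible one.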

  20. Modelling survival: exposure pattern, species sensitivity and uncertainty.

    PubMed

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight; Cedergreen, Nina; Charles, Sandrine; Ducrot, Virginie; Focks, Andreas; Gabsi, Faten; Gergs, André; Goussen, Benoit; Jager, Tjalling; Kramer, Nynke I; Nyman, Anna-Maija; Poulsen, Veronique; Reichenberger, Stefan; Schäfer, Ralf B; Van den Brink, Paul J; Veltman, Karin; Vogel, Sören; Zimmer, Elke I; Preuss, Thomas G

    2016-01-01

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test the ability of GUTS to predict survival of aquatic organisms across different pesticide exposure patterns, time scales and species. Firstly, using synthetic data, we identified experimental data requirements which allow for the estimation of all parameters of the GUTS proper model. Secondly, we assessed how well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates to build species sensitivity distributions for different exposure patterns. We find that GUTS adequately predicts survival across exposure patterns that vary over time. When toxicity is assessed for time-variable concentrations species may differ in their responses depending on the exposure profile. This can result in different species sensitivity rankings and safe levels. The interplay of exposure pattern and species sensitivity deserves systematic investigation in order to better understand how organisms respond to stress, including humans. PMID:27381500
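
A heavily simplified GUTS-style stochastic-death sketch illustrates how survival can be predicted under time-variable exposure: scaled damage tracks the external concentration through first-order kinetics, and hazard accrues only above a threshold. The rate constants, threshold, and pulsed exposure profile below are hypothetical, not calibrated values from the study.

```python
import math

def survival(conc, dt, kd, z, kk, h_b=0.0):
    """GUTS-SD-flavoured survival under a time-variable exposure C(t):
    damage D obeys dD/dt = kd*(C - D); the hazard rate is kk*max(D - z, 0)
    plus background h_b; survival is exp(-cumulative hazard)."""
    D = 0.0
    H = 0.0  # cumulative hazard
    out = []
    for c in conc:
        D += kd * (c - D) * dt             # explicit Euler damage update
        H += (kk * max(D - z, 0.0) + h_b) * dt
        out.append(math.exp(-H))
    return out

# hypothetical pulsed exposure: 2 days at 10 units, 3 days clean, twice
dt = 0.1
profile = ([10.0] * 20 + [0.0] * 30) * 2
S = survival(profile, dt, kd=0.5, z=2.0, kk=0.05)
```

Because damage lags the exposure, the same total dose delivered in different pulse patterns yields different survival curves, which is the exposure-pattern effect the abstract emphasises.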

  1. Modelling survival: exposure pattern, species sensitivity and uncertainty

    NASA Astrophysics Data System (ADS)

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight; Cedergreen, Nina; Charles, Sandrine; Ducrot, Virginie; Focks, Andreas; Gabsi, Faten; Gergs, André; Goussen, Benoit; Jager, Tjalling; Kramer, Nynke I.; Nyman, Anna-Maija; Poulsen, Veronique; Reichenberger, Stefan; Schäfer, Ralf B.; van den Brink, Paul J.; Veltman, Karin; Vogel, Sören; Zimmer, Elke I.; Preuss, Thomas G.

    2016-07-01

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test the ability of GUTS to predict survival of aquatic organisms across different pesticide exposure patterns, time scales and species. Firstly, using synthetic data, we identified experimental data requirements which allow for the estimation of all parameters of the GUTS proper model. Secondly, we assessed how well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates to build species sensitivity distributions for different exposure patterns. We find that GUTS adequately predicts survival across exposure patterns that vary over time. When toxicity is assessed for time-variable concentrations species may differ in their responses depending on the exposure profile. This can result in different species sensitivity rankings and safe levels. The interplay of exposure pattern and species sensitivity deserves systematic investigation in order to better understand how organisms respond to stress, including humans.

  2. Modelling survival: exposure pattern, species sensitivity and uncertainty

    PubMed Central

    Ashauer, Roman; Albert, Carlo; Augustine, Starrlight; Cedergreen, Nina; Charles, Sandrine; Ducrot, Virginie; Focks, Andreas; Gabsi, Faten; Gergs, André; Goussen, Benoit; Jager, Tjalling; Kramer, Nynke I.; Nyman, Anna-Maija; Poulsen, Veronique; Reichenberger, Stefan; Schäfer, Ralf B.; Van den Brink, Paul J.; Veltman, Karin; Vogel, Sören; Zimmer, Elke I.; Preuss, Thomas G.

    2016-01-01

    The General Unified Threshold model for Survival (GUTS) integrates previously published toxicokinetic-toxicodynamic models and estimates survival with explicitly defined assumptions. Importantly, GUTS accounts for time-variable exposure to the stressor. We performed three studies to test the ability of GUTS to predict survival of aquatic organisms across different pesticide exposure patterns, time scales and species. Firstly, using synthetic data, we identified experimental data requirements which allow for the estimation of all parameters of the GUTS proper model. Secondly, we assessed how well GUTS, calibrated with short-term survival data of Gammarus pulex exposed to four pesticides, can forecast effects of longer-term pulsed exposures. Thirdly, we tested the ability of GUTS to estimate 14-day median effect concentrations of malathion for a range of species and use these estimates to build species sensitivity distributions for different exposure patterns. We find that GUTS adequately predicts survival across exposure patterns that vary over time. When toxicity is assessed for time-variable concentrations species may differ in their responses depending on the exposure profile. This can result in different species sensitivity rankings and safe levels. The interplay of exposure pattern and species sensitivity deserves systematic investigation in order to better understand how organisms respond to stress, including humans. PMID:27381500

  3. Sensitivity of Airburst Damage Prediction to Asteroid Characterization Uncertainty

    NASA Astrophysics Data System (ADS)

    Mathias, Donovan; Wheeler, Lorien; Dotson, Jessie L.

    2016-10-01

    Characterizing the level of risk posed by asteroid impacts is essential to developing informed mitigation criteria, response plans, and long-term survey and characterization strategies for potentially hazardous asteroids. A physics-based impact risk (PBIR) model has been created to assess the consequences of potential asteroid strikes by combining probabilistic sampling of uncertain impact parameters with numerical simulation of the atmospheric flight, breakup, and resulting ground damage for each sampled impact case. The model includes a Monte Carlo framework that allows the uncertainties in the potential impact parameters to be described in terms of probability distributions, and produces statistical results that support inference regarding the threat level across those ranges. This work considers the PBIR model outputs in terms of potential threat characterization metrics for decision support. Several metrics are assessed, from the single estimated casualty (Ec) parameter to more descriptive distribution functions. Distributions are shown for aggregate risk, risk versus asteroid size, and risk to specific geographic regions. In addition, these results show how the uncertain properties of potential impactors can lead to different conclusions about optimal survey and characterization strategies.
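
The contrast between the single Ec metric and fuller distribution functions can be illustrated with a toy Monte Carlo in the spirit of the framework above; the input distributions, land fraction, and damage proxy are all invented for illustration, not the PBIR model's physics.

```python
import random

def sample_casualties(n=20000, seed=11):
    """Toy impact-risk Monte Carlo: sample uncertain impactor properties,
    map them to an invented damage proxy, return the casualty sample."""
    random.seed(seed)
    out = []
    for _ in range(n):
        diameter = random.lognormvariate(3.0, 0.5)  # metres, hypothetical prior
        speed = random.uniform(12.0, 30.0)          # km/s, hypothetical prior
        hits_land = random.random() < 0.29          # crude land fraction
        damage = diameter ** 3 * speed ** 2 * 1e-4  # invented energy proxy
        out.append(damage if hits_land else 0.0)
    return out

c = sample_casualties()
ec = sum(c) / len(c)                  # the single expected-casualty metric
p95 = sorted(c)[int(0.95 * len(c))]   # one distributional alternative
```

Even in this toy, most sampled cases produce zero casualties while a heavy upper tail dominates the mean, which is why a single expectation can obscure the information a full distribution conveys.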

  4. Sensitivity and uncertainty analysis for crop water footprint accounting at a basin level

    NASA Astrophysics Data System (ADS)

    Zhuo, L.; Mekonnen, M. M.; Hoekstra, A. Y.

    2013-12-01

    Water footprint has been recognized as a comprehensive indicator in water management for evaluating human pressure on water resources from either a production or a consumption perspective. The agricultural sector, in particular crop production, takes the largest share of the global water footprint. The water footprint of producing a unit mass of a crop (m3/ton) is normally expressed as a single volumetric number referring to an average value for a certain area and period. However, the divergence in crop water footprint accounts from different studies, primarily due to input data quality, may confuse water users and managers. This study investigates the sensitivity and uncertainty of the green (rainfall) and blue (irrigation water) crop water footprint with respect to key input variables (reference evapotranspiration (ETo), precipitation (PR), crop coefficient (Kc) and crop calendar (D)) at a basin level. A grid-based daily water balance model was applied to compute water footprints of four major crops - maize, rice, soybean and wheat - in the Yellow River basin for 1996-2005 at a 5 by 5 arc-minute resolution. Sensitivities of the yearly crop water footprints to individual input variability were assessed by the one-at-a-time ('sensitivity curve') method. Uncertainties in the crop water footprint due to input uncertainties were quantified through Monte Carlo simulations for the selected years 1996 (wet), 2000 (dry) and 2005 (average). Results show that the crop water footprint is most sensitive to ETo and Kc, followed by D and PR. Blue water footprints were more sensitive than green water footprints to input variability. Interestingly, the smaller the annual blue water footprint, the higher its sensitivity to PR, ETo and Kc variability. The uncertainty in total crop water footprints due to combined uncertainties in the four key input variables was less than ±30% at the 95% confidence level. The sensitivity and uncertainty level of crop water footprints also differs with
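
The one-at-a-time "sensitivity curve" method mentioned above perturbs one input by fixed fractions while holding the others at their base values. The sketch below applies it to a toy water footprint function; the scaling and the base values are hypothetical, not the grid-based daily water balance model.

```python
def crop_wf(et0, kc, yield_t):
    """Toy crop water footprint (m3/ton): seasonal crop water use over yield.
    The factor 10 is an arbitrary depth-to-volume scaling for illustration."""
    return 10.0 * et0 * kc / yield_t

def sensitivity_curve(base, param, perturbations):
    """One-at-a-time: perturb a single input by the given fractions while
    holding the others at their base values; report relative output change."""
    y0 = crop_wf(**base)
    curve = {}
    for p in perturbations:
        args = dict(base)
        args[param] = base[param] * (1.0 + p)
        curve[p] = (crop_wf(**args) - y0) / y0
    return curve

base = {"et0": 550.0, "kc": 1.05, "yield_t": 6.0}  # hypothetical base values
et0_curve = sensitivity_curve(base, "et0", [-0.2, -0.1, 0.1, 0.2])
```

Because this toy function is linear in ETo, its sensitivity curve has unit slope; a nonlinear response (such as yield in the denominator) would bend the curve, which is exactly what the method is designed to reveal.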

  5. Cross Section Sensitivity and Uncertainty Analysis Including Secondary Neutron Energy and Angular Distributions.

    1991-03-12

    Version 00 SUSD calculates sensitivity coefficients for one- and two-dimensional transport problems. Variance and standard deviation of detector responses or design parameters can be obtained using cross-section covariance matrices. In neutron transport problems, this code can perform sensitivity-uncertainty analysis for secondary angular distribution (SAD) or secondary energy distribution (SED).

  6. Amphetamine-induced sensitization and reward uncertainty similarly enhance incentive salience for conditioned cues.

    PubMed

    Robinson, Mike J F; Anselme, Patrick; Suchomel, Kristen; Berridge, Kent C

    2015-08-01

    Amphetamine and stress can sensitize mesolimbic dopamine-related systems. In Pavlovian autoshaping, repeated exposure to uncertainty of reward prediction can enhance motivated sign-tracking or attraction to a discrete reward-predicting cue (lever-conditioned stimulus; CS+), as well as produce cross-sensitization to amphetamine. However, it remains unknown how amphetamine sensitization or repeated restraint stress interact with uncertainty in controlling CS+ incentive salience attribution reflected in sign-tracking. Here rats were tested in 3 successive phases. First, different groups underwent either induction of amphetamine sensitization or repeated restraint stress, or else were not sensitized or stressed as control groups (either saline injections only, or no stress or injection at all). All next received Pavlovian autoshaping training under either certainty conditions (100% CS-UCS association) or uncertainty conditions (50% CS-UCS association and uncertain reward magnitude). During training, rats were assessed for sign-tracking to the CS+ lever versus goal-tracking to the sucrose dish. Finally, all groups were tested for psychomotor sensitization of locomotion revealed by an amphetamine challenge. Our results confirm that reward uncertainty enhanced sign-tracking attraction toward the predictive CS+ lever, at the expense of goal-tracking. We also reported that amphetamine sensitization promoted sign-tracking even in rats trained under CS-UCS certainty conditions, raising them to sign-tracking levels equivalent to the uncertainty group. Combining amphetamine sensitization and uncertainty conditions did not add together to elevate sign-tracking further above the relatively high levels induced by either manipulation alone. In contrast, repeated restraint stress enhanced subsequent amphetamine-elicited locomotion, but did not enhance CS+ attraction.

  7. Uncertainty

    USGS Publications Warehouse

    Hunt, Randall J.

    2012-01-01

    Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.

  8. Uncertainty and Sensitivity of Alternative Rn-222 Flux Density Models Used in Performance Assessment

    SciTech Connect

    Shott, Greg J.; Yucel, Vefa; Desotell, Lloyd; Pyles, G.; Carilli, Jon

    2007-06-01

    Performance assessments for the Area 5 Radioactive Waste Management Site on the Nevada Test Site have used three different mathematical models to estimate Rn-222 flux density. This study describes the performance, uncertainty, and sensitivity of the three models, which include the U.S. Nuclear Regulatory Commission Regulatory Guide 3.64 analytical method and two numerical methods. The uncertainty of each model was determined by Monte Carlo simulation using Latin hypercube sampling. The global sensitivity was investigated using the Morris one-at-a-time screening method, sample-based correlation and regression methods, the variance-based extended Fourier amplitude sensitivity test, and Sobol's sensitivity indices. The models were found to produce similar estimates of the mean and median flux density, but to have different uncertainties and sensitivities. When the Rn-222 effective diffusion coefficient was estimated using five different published predictive models, the radon flux density models were found to be most sensitive to the effective diffusion coefficient model selected, the emanation coefficient, and the radionuclide inventory. Using a site-specific measured effective diffusion coefficient significantly reduced the output uncertainty. When a site-specific effective-diffusion coefficient was used, the models were most sensitive to the emanation coefficient and the radionuclide inventory.

  9. Visualization tools for uncertainty and sensitivity analyses on thermal-hydraulic transients

    NASA Astrophysics Data System (ADS)

    Popelin, Anne-Laure; Iooss, Bertrand

    2014-06-01

    In nuclear engineering studies, uncertainty and sensitivity analyses of simulation computer codes can be complicated by the complexity of the input and/or output variables. If these variables represent a transient or a spatial phenomenon, the difficulty lies in providing tools adapted to their functional nature. In this paper, we describe useful visualization tools in the context of uncertainty analysis of model transient outputs. Our application involves thermal-hydraulic computations for safety studies of nuclear pressurized water reactors.

  10. Sensitivity of Advanced Reactor and Fuel Cycle Performance Parameters to Nuclear Data Uncertainties

    NASA Astrophysics Data System (ADS)

    Aliberti, G.; Palmiotti, G.; Salvatores, M.; Kim, T. K.; Taiwo, T. A.; Kodeli, I.; Sartori, E.; Bosq, J. C.; Tommasi, J.

    2006-04-01

    As a contribution to the feasibility assessment of Gen IV and AFCI relevant systems, a sensitivity and uncertainty study has been performed to evaluate the impact of neutron cross section uncertainty on the most significant integral parameters related to the core and fuel cycle. Results of an extensive analysis indicate only a limited number of relevant parameters and do not show any potential major problem due to nuclear data in the assessment of the systems considered. However, the results obtained depend on the uncertainty data used, and it is suggested that some future evaluation work be focused on the production of consistent, as complete as possible, and user-oriented covariance data.

  11. Sensitivity and uncertainty in the effective delayed neutron fraction (βeff)

    SciTech Connect

    Kodeli, I. I.

    2012-07-01

    Precise knowledge of the effective delayed neutron fraction (βeff) and of the corresponding uncertainty is important for reactor safety analysis. The interest in developing the methodology for estimating the uncertainty in βeff was expressed in the scope of the UAM project of the OECD/NEA. A novel approach for the calculation of the nuclear data sensitivity and uncertainty of the effective delayed neutron fraction is proposed, based on linear perturbation theory. The method allows the detailed analysis of the components of the βeff uncertainty. The procedure was implemented in the SUSD3D sensitivity and uncertainty code and applied to several fast neutron benchmark experiments from the ICSBEP and IRPhE databases. According to the JENDL-4 covariance matrices, and taking into account the uncertainty in the cross sections and in the prompt and delayed fission spectra, the total uncertainty in βeff was found to be of the order of ~2 to ~3.5% for the studied fast experiments. (authors)
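    The linear perturbation approach propagates nuclear-data covariances through a sensitivity vector via the first-order "sandwich" rule, σ² = sᵀCs. A minimal numeric sketch with made-up sensitivities and covariances (not JENDL-4 values):

    ```python
    import numpy as np

    # Illustrative relative sensitivities of beta_eff to three nuclear-data
    # parameters (invented values, for demonstration only).
    s = np.array([0.8, -0.3, 0.5])

    # Illustrative relative covariance matrix of those three parameters.
    C = np.array([
        [4.0e-4, 1.0e-4, 0.0],
        [1.0e-4, 9.0e-4, 0.0],
        [0.0,    0.0,    1.0e-4],
    ])

    var = s @ C @ s                    # first-order "sandwich" rule
    rel_uncertainty = np.sqrt(var)     # relative 1-sigma uncertainty in beta_eff
    print(f"{100 * rel_uncertainty:.2f} %")
    ```

    Codes such as SUSD3D apply the same algebra with group-wise sensitivity profiles and full covariance matrices; the per-parameter terms of sᵀCs provide the component breakdown of the total uncertainty.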

  12. Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications

    SciTech Connect

    Arbanas, G.; Williams, M.L.; Leal, L.C.; Dunn, M.E.; Khuwaileh, B.A.; Wang, C.; Abdel-Khalik, H.

    2015-01-15

    The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX cross section processing system [M.E. Dunn and N.M. Greene, “AMPX-2000: A Cross-Section Processing System for Generating Nuclear Data for Criticality Safety Applications,” Trans. Am. Nucl. Soc. 86, 118–119 (2002)]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with the uncertainties needed to yield a target response uncertainty for a given nuclear application, at minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. Therefore, we have incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method, and have implemented it in the INSURE module. We show how the IS/UQ method could be applied to systematic and statistical uncertainties in a self-consistent way and how it could be used to optimize the uncertainties of IBEs and differential cross section data simultaneously. We itemize contributions to the cost of the differential data measurements needed to define a realistic cost function.

  13. Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications

    SciTech Connect

    Arbanas, Goran; Williams, Mark L; Leal, Luiz C; Dunn, Michael E; Khuwaileh, Bassam A.; Wang, C; Abdel-Khalik, Hany

    2015-01-01

    The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX system [1]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with the uncertainties needed to yield a target response uncertainty for a given nuclear application, at minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. Therefore we have incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method, and have implemented it in the INSURE module. We show how the IS/UQ method could be applied to systematic and statistical uncertainties in a self-consistent way. We show how the IS/UQ method could be used to optimize the uncertainties of IBEs and differential cross section data simultaneously.

  14. Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications

    NASA Astrophysics Data System (ADS)

    Arbanas, G.; Williams, M. L.; Leal, L. C.; Dunn, M. E.; Khuwaileh, B. A.; Wang, C.; Abdel-Khalik, H.

    2015-01-01

    The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX cross section processing system [M.E. Dunn and N.M. Greene, "AMPX-2000: A Cross-Section Processing System for Generating Nuclear Data for Criticality Safety Applications," Trans. Am. Nucl. Soc. 86, 118-119 (2002)]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with the uncertainties needed to yield a target response uncertainty for a given nuclear application, at minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. Therefore, we have incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method, and have implemented it in the INSURE module. We show how the IS/UQ method could be applied to systematic and statistical uncertainties in a self-consistent way and how it could be used to optimize the uncertainties of IBEs and differential cross section data simultaneously. We itemize contributions to the cost of the differential data measurements needed to define a realistic cost function.

  15. Three-dimensional lake water quality modeling: sensitivity and uncertainty analyses.

    PubMed

    Missaghi, Shahram; Hondzo, Miki; Melching, Charles

    2013-11-01

    Two sensitivity and uncertainty analysis methods are applied to a three-dimensional coupled hydrodynamic-ecological model (ELCOM-CAEDYM) of a morphologically complex lake. The primary goals of the analyses are to increase confidence in the model predictions, identify influential model parameters, quantify the uncertainty of model prediction, and explore the spatial and temporal variabilities of model predictions. The influence of model parameters on four model-predicted variables (model output) and the contributions of each of the model-predicted variables to the total variations in model output are presented. Predicted water temperature, dissolved oxygen, total phosphorus, and algal biomass contributed 3, 13, 26, and 58% of the total model output variance, respectively. The fraction of variance resulting from model parameter uncertainty was calculated by two methods and used for evaluation and ranking of the most influential model parameters. Nine out of the top 10 parameters identified by each method agreed, but their ranks were different. Spatial and temporal changes of model uncertainty were investigated and visualized. Model uncertainty appeared to be concentrated around specific water depths and dates that corresponded to significant storm events. The results suggest that spatial and temporal variations in the predicted water quality variables are sensitive to the hydrodynamics of physical perturbations such as those caused by stream inflows generated by storm events. The sensitivity and uncertainty analyses identified the mineralization of dissolved organic carbon, sediment phosphorus release rate, algal metabolic loss rate, internal phosphorus concentration, and phosphorus uptake rate as the most influential model parameters.

  16. Uncertainty and Sensitivity Analyses Plan. Draft for Peer Review: Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  17. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    SciTech Connect

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J. PhD.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  18. A Methodology For Performing Global Uncertainty And Sensitivity Analysis In Systems Biology

    PubMed Central

    Marino, Simeone; Hogue, Ian B.; Ray, Christian J.; Kirschner, Denise E.

    2008-01-01

    Accuracy of results from mathematical and computer models of biological systems is often complicated by the presence of uncertainties in the experimental data that are used to estimate parameter values. Current mathematical modeling approaches typically use either single-parameter or local sensitivity analyses. However, these methods do not accurately assess uncertainty and sensitivity in the system because, by default, they hold all other parameters fixed at baseline values. Using the techniques described within, we demonstrate how a multi-dimensional parameter space can be studied globally so that all uncertainties can be identified. Further, uncertainty and sensitivity analysis techniques can help to identify and ultimately control uncertainties. In this work we develop methods for applying existing analytical tools to perform analyses on a variety of mathematical and computer models. We compare two specific types of global sensitivity analysis indexes that have proven to be among the most robust and efficient. Through familiar and new examples of mathematical and computer models, we provide a complete methodology for performing these analyses, both in deterministic and stochastic settings, and propose novel techniques to handle problems encountered during these analyses. PMID:18572196
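    One of the global indexes commonly paired with Latin hypercube sampling in this kind of methodology is the partial rank correlation coefficient (PRCC): rank-transform inputs and output, regress the other inputs out of both, and correlate the residuals. A compact sketch — the three-input test model below is invented for illustration:

    ```python
    import numpy as np

    def prcc(X, y):
        """Partial rank correlation coefficient of each column of X with y."""
        # Double argsort converts continuous samples to ranks.
        R = np.argsort(np.argsort(X, axis=0), axis=0).astype(float)
        r = np.argsort(np.argsort(y)).astype(float)
        n, k = X.shape
        out = np.empty(k)
        for j in range(k):
            # Regress the other (ranked) inputs out of both X_j and y.
            Z = np.column_stack([np.ones(n), np.delete(R, j, axis=1)])
            res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
            res_y = r - Z @ np.linalg.lstsq(Z, r, rcond=None)[0]
            out[j] = np.corrcoef(res_x, res_y)[0, 1]
        return out

    rng = np.random.default_rng(1)
    X = rng.random((500, 3))
    # Hypothetical model: strong effect of X0, weaker effect of X1, X2 inert.
    y = 5.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(0.0, 0.1, 500)
    coeffs = prcc(X, y)
    ```

    Because it works on ranks, PRCC captures monotonic (not just linear) parameter-output relationships, which is why it is favored for nonlinear biological models.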

  19. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing the N, N(+), O, and O(+) number densities in the flow field.

  20. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    NASA Astrophysics Data System (ADS)

    Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies, where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and to the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values, by observational studies or by improving parameterizations in the sea ice model.
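    The variance-based indices behind such a ranking can be estimated with a pick-freeze (Saltelli-style) scheme: evaluate the model on two independent sample matrices and on hybrids that swap one column at a time. A sketch on a toy model with a known answer (the sea ice analysis is the same idea at scale, with an emulator standing in for the expensive model):

    ```python
    import numpy as np

    def first_order_sobol(f, n, k, rng):
        """First-order Sobol' indices via the Saltelli pick-freeze estimator."""
        A, B = rng.random((n, k)), rng.random((n, k))
        fA, fB = f(A), f(B)
        var = np.var(np.concatenate([fA, fB]))
        S = np.empty(k)
        for j in range(k):
            ABj = A.copy()
            ABj[:, j] = B[:, j]                  # resample only the j-th input
            S[j] = np.mean(fB * (f(ABj) - fA)) / var
        return S

    # Toy model y = x0 + 2*x1 on [0,1]^2 has analytic indices S = (0.2, 0.8):
    # Var(y) = 1/12 + 4/12, of which x1 explains 4/5.
    rng = np.random.default_rng(7)
    S = first_order_sobol(lambda X: X[:, 0] + 2.0 * X[:, 1], 50_000, 2, rng)
    ```

    The cost is n × (k + 2) model evaluations, which is exactly why the study above runs the estimator against a fast emulator rather than CICE itself.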

  1. Sensitivity Analysis and Insights into Hydrological Processes and Uncertainty at Different Scales

    NASA Astrophysics Data System (ADS)

    Haghnegahdar, A.; Razavi, S.; Wheater, H. S.; Gupta, H. V.

    2015-12-01

    Sensitivity analysis (SA) is an essential tool for providing insight into model behavior, and for conducting model calibration and uncertainty assessment. Numerous techniques have been used in environmental modelling studies for sensitivity analysis. However, it is often overlooked that the scale of a modelling study and the choice of metric can significantly change the assessment of model sensitivity and uncertainty. In order to identify important hydrological processes across various scales, we conducted a multi-criteria sensitivity analysis using a novel and efficient technique, Variogram Analysis of Response Surfaces (VARS). The analysis was conducted using three different hydrological models, HydroGeoSphere (HGS), the Soil and Water Assessment Tool (SWAT), and Modélisation Environmentale-Surface et Hydrologie (MESH). The models were applied at various scales ranging from small (hillslope) to large (watershed) scales. In each case, the sensitivity of simulated streamflow to model processes (represented through parameters) was measured using different metrics selected based on various hydrograph characteristics such as high flows, low flows, and volume. We demonstrate how the scale of the case study and the choice of sensitivity metric(s) can change our assessment of sensitivity and uncertainty. We present some guidelines to better align the metric choice with the objective and scale of a modelling study.

  2. One-Dimensional, Multigroup Cross Section and Design Sensitivity and Uncertainty Analysis Code System - Generalized Perturbation Theory.

    1981-02-02

    Version: 00 SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections (of standard multigroup cross-section sets) and for secondary energy distributions (SED's) of multigroup scattering matrices.

  3. Sensitivity Analysis and Uncertainty Propagation in a General-Purpose Thermal Analysis Code

    SciTech Connect

    Blackwell, Bennie F.; Dowding, Kevin J.

    1999-08-04

    Methods are discussed for computing the sensitivity of field variables to changes in material properties and initial/boundary condition parameters for heat transfer problems. The method we focus on is termed the ''Sensitivity Equation Method'' (SEM). It involves deriving field equations for sensitivity coefficients by differentiating the original field equations with respect to the parameters of interest and numerically solving the resulting sensitivity field equations. Uncertainties in the model parameters are then propagated through the computational model using results derived from first-order perturbation theory; this technique is identical to the methodology typically used to propagate experimental uncertainty. Numerical results are presented for the design of an experiment to estimate the thermal conductivity of stainless steel using transient temperature measurements made on prototypical hardware of a companion contact conductance experiment. Comments are made relative to extending the SEM to conjugate heat transfer problems.
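    The SEM idea can be illustrated on a lumped-capacitance cooling problem rather than a full field equation: differentiating dT/dt = -k(T - T_inf) with respect to k gives a second ODE for the sensitivity coefficient s = dT/dk, namely ds/dt = -(T - T_inf) - k s with s(0) = 0, integrated alongside the original equation. Parameter values below are illustrative, not from the cited experiment:

    ```python
    import numpy as np

    # Illustrative parameters for lumped-capacitance cooling.
    k, T_inf, T0 = 0.5, 20.0, 100.0
    dt, nsteps = 1e-3, 4000            # integrate to t = 4 with forward Euler

    T, s = T0, 0.0
    for _ in range(nsteps):
        T_new = T + dt * (-k * (T - T_inf))
        s = s + dt * (-(T - T_inf) - k * s)   # sensitivity equation, uses old T
        T = T_new

    # Analytic check: T = T_inf + (T0 - T_inf) e^{-kt}  =>  s = -(T0 - T_inf) t e^{-kt}
    t = dt * nsteps
    s_exact = -(T0 - T_inf) * t * np.exp(-k * t)
    ```

    The same differentiate-then-solve pattern applies to PDE field equations; the payoff is that sensitivity coefficients come from one extra (linear) solve instead of repeated perturbed runs.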

  4. Use of Forward Sensitivity Analysis Method to Improve Code Scaling, Applicability, and Uncertainty (CSAU) Methodology

    SciTech Connect

    Haihua Zhao; Vincent A. Mousseau; Nam T. Dinh

    2010-10-01

    The Code Scaling, Applicability, and Uncertainty (CSAU) methodology was developed in the late 1980s by the US NRC to systematically quantify reactor simulation uncertainty. Based on the CSAU methodology, Best Estimate Plus Uncertainty (BEPU) methods have been developed and widely used for new reactor designs and power uprates of existing LWRs. In spite of these successes, several aspects of CSAU have been criticized: (1) subjective judgement in the PIRT process; (2) high cost, due to heavy reliance on a large experimental database, many expert man-years of work, and very high computational overhead; (3) mixing of numerical errors with other uncertainties; (4) grid dependence and use of the same numerical grids for both scaled experiments and real plant applications; (5) user effects. Although a large amount of effort has gone into improving the CSAU methodology, the above issues still exist. With the effort to develop next generation safety analysis codes, new opportunities appear to take advantage of new numerical methods, better physical models, and modern uncertainty quantification methods. Forward sensitivity analysis (FSA) directly solves the PDEs for parameter sensitivities (defined as the derivative of the physical solution with respect to any constant parameter). When the parameter sensitivities are available in a new advanced system analysis code, CSAU could be significantly improved: (1) Quantifying numerical errors: new codes which are fully implicit and of higher order accuracy can run much faster, with numerical errors quantified by FSA. (2) Quantitative PIRT (Q-PIRT) to reduce subjective judgement and improve efficiency: treat numerical errors as special sensitivities against other physical uncertainties; only parameters having large uncertainty effects on design criteria are considered. (3) Greatly reducing computational costs for uncertainty quantification by (a) choosing optimized time steps and spatial sizes; (b) using gradient information

  5. Sensitivity and uncertainty analyses for thermo-hydraulic calculation of research reactor

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan; Isnaeni, Muh Darwis

    2013-09-09

    A sensitivity and uncertainty analysis of input parameters for the thermohydraulic calculation of a research reactor was successfully performed in this work. The uncertainty analysis was carried out on input parameters for the thermohydraulic sub-channel analysis using the code COOLOD-N. The input parameters include the radial peaking factor, the bulk coolant temperature rise, the heat flux factor, and the temperature rise in the cladding and fuel meat for a research reactor utilizing plate fuel elements. Input uncertainties of 1%-4% were used in the nominal power calculation. The bubble detachment parameters were computed for the S ratio (the safety margin against the onset of flow instability), which was used to determine the safety level in line with the design of the 'Reactor Serba Guna-G. A. Siwabessy' (RSG-GA Siwabessy). It was concluded from the calculation results that input uncertainties greater than 3% were beyond the safety margin of reactor operation.

  6. Status of XSUSA for Sampling Based Nuclear Data Uncertainty and Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Zwermann, W.; Gallner, L.; Klein, M.; Krzykacz-Hausmann, B.; Pasichnyk, I.; Pautz, A.; Velkov, K.

    2013-03-01

    In the present contribution, an overview of the sampling based XSUSA method for sensitivity and uncertainty analysis with respect to nuclear data is given. The focus is on recent developments and applications of XSUSA. These applications include calculations for critical assemblies, fuel assembly depletion calculations, and steady-state as well as transient reactor core calculations. The analyses are partially performed in the framework of international benchmark working groups (UACSA - Uncertainty Analyses for Criticality Safety Assessment, UAM - Uncertainty Analysis in Modelling). It is demonstrated that particularly for full-scale reactor calculations the influence of the nuclear data uncertainties on the results can be substantial. For instance, for the radial fission rate distributions of mixed UO2/MOX light water reactor cores, the 2σ uncertainties in the core centre and periphery can reach values exceeding 10%. For a fast transient, the resulting time behaviour of the reactor power was covered by a wide uncertainty band. Overall, the results confirm the necessity of adding systematic uncertainty analyses to best-estimate reactor calculations.

  7. Asteroseismic inversions for radial differential rotation of Sun-like stars: Sensitivity to uncertainties

    NASA Astrophysics Data System (ADS)

    Schunker, H.; Schou, J.; Ball, W. H.

    2016-02-01

    Aims: We quantify the effect of observational spectroscopic and asteroseismic uncertainties on regularised least squares (RLS) inversions for the radial differential rotation of Sun-like and subgiant stars. Methods: We first solved the forward problem to model rotational splittings plus the observed uncertainties for models of a Sun-like star, HD 52265, and a subgiant star, KIC 7341231. We randomly perturbed the parameters of the stellar models within the uncertainties of the spectroscopic and asteroseismic constraints and used these perturbed stellar models to compute rotational splittings. We experimented with three rotation profiles: solid body rotation, a step function, and a smooth rotation profile decreasing with radius. We then solved the inverse problem to infer the radial differential rotation profile using a RLS inversion and kernels from the best-fit stellar model. We also compared RLS, optimally localised average (OLA) and direct functional fitting inversion techniques. Results: We found that the inversions for Sun-like stars with solar-like radial differential rotation profiles are insensitive to the uncertainties in the stellar models. The uncertainties in the splittings dominate the uncertainties in the inversions and solid body rotation is not excluded. We found that when the rotation rate below the convection zone is increased to six times that of the surface rotation rate the inferred rotation profile excluded solid body rotation. We showed that when we reduced the uncertainties in the splittings by a factor of about 100, the inversion is sensitive to the uncertainties in the stellar model. With the current observational uncertainties, we found that inversions of subgiant stars are sensitive to the uncertainties in the stellar model. Conclusions: Our findings suggest that inversions for the radial differential rotation of subgiant stars would benefit from more tightly constrained stellar models. 
We conclude that current observational uncertainties
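    The RLS inversion referred to above is, at its core, Tikhonov-regularised least squares: minimise ||Ax - b||² + λ||Lx||² over the rotation profile x. A minimal sketch with an invented toy kernel matrix standing in for the rotational-splitting kernels (all values illustrative):

    ```python
    import numpy as np

    def rls_invert(A, b, lam, L=None):
        """Regularised least squares: argmin ||A x - b||^2 + lam * ||L x||^2."""
        n = A.shape[1]
        if L is None:
            L = np.eye(n)                       # zeroth-order (identity) smoothing
        lhs = A.T @ A + lam * (L.T @ L)         # normal equations of the penalised fit
        return np.linalg.solve(lhs, A.T @ b)

    # Toy forward problem: splittings = K @ omega + noise, with a smooth
    # rotation profile decreasing outward (invented kernels and noise level).
    rng = np.random.default_rng(3)
    K = rng.random((40, 10))
    omega_true = np.linspace(1.0, 0.5, 10)
    d = K @ omega_true + rng.normal(0.0, 0.01, 40)

    omega_hat = rls_invert(K, d, lam=1e-3)
    ```

    The regularisation parameter λ trades fit against smoothness; in the asteroseismic setting the propagated splitting uncertainties, not the kernels, dominate how well ω can be recovered, consistent with the conclusions above.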

  8. Use of SUSA in Uncertainty and Sensitivity Analysis for INL VHTR Coupled Codes

    SciTech Connect

    Gerhard Strydom

    2010-06-01

    The need for a defendable and systematic uncertainty and sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen- und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This interim milestone report provides an overview of the current status of the implementation and testing of SUSA at the INL VHTR Project Office.

  9. Uncertainty and sensitivity analysis of the retrieved essential climate variables from remotely sensed observations

    NASA Astrophysics Data System (ADS)

    Djepa, Vera; Badii, Atta

    2016-04-01

    The sensitivity of the weather and climate system to sea ice thickness (SIT), sea ice draft (SID) and snow depth (SD) in the Arctic is recognized from various studies. A decrease in SIT will affect atmospheric circulation, temperature, precipitation and wind speed in the Arctic and beyond. Ice thermodynamic and dynamic properties depend strongly on sea ice density (ID) and SD. SIT, SID, ID and SD are sensitive to environmental changes in the polar region and impact the climate system. For accurate forecasts of climate change, sea ice mass balance, ocean circulation and sea-atmosphere interactions, long-term records of SIT, SID, SD and ID with error and uncertainty analyses are required. The SID, SIT, ID and freeboard (F) have been retrieved from the Radar Altimeter (RA) (on board ENVISAT) and the IceBridge Laser Altimeter (LA) and validated using over 10 years of collocated observations of SID and SD in the Arctic, provided by the European Space Agency (ESA CCI sea ice ECV project). Improved algorithms to retrieve SIT from LA and RA have been derived by applying statistical analysis. The snow depth is obtained from AMSR-E/Aqua and the NASA IceBridge Snow Depth radar. The sea ice properties of pancake ice have been retrieved from ENVISAT/Synthetic Aperture Radar (ASAR). The uncertainties of the retrieved climate variables have been analysed, and the impact of snow depth and sea ice density on retrieved SIT has been estimated. The sensitivity analysis illustrates the impact of uncertainties of the input climate variables (ID and SD) on the accuracy of the retrieved output variables (SIT and SID). The developed methodology of uncertainty and sensitivity analysis is essential for assessment of the impact of environmental variables on climate change and better understanding of the relationship between input and output variables. The uncertainty analysis quantifies the uncertainties of the model results and the sensitivity analysis evaluates the contribution of each input variable to

  10. Uncertainty and Sensitivity Analyses of a Two-Parameter Impedance Prediction Model

    NASA Technical Reports Server (NTRS)

    Jones, M. G.; Parrott, T. L.; Watson, W. R.

    2008-01-01

    This paper presents comparisons of predicted impedance uncertainty limits derived from Monte-Carlo-type simulations with a Two-Parameter (TP) impedance prediction model and measured impedance uncertainty limits based on multiple tests acquired in NASA Langley test rigs. These predicted and measured impedance uncertainty limits are used to evaluate the effects of simultaneous randomization of each input parameter for the impedance prediction and measurement processes. A sensitivity analysis is then used to further evaluate the TP prediction model by varying its input parameters on an individual basis. The variation imposed on the input parameters is based on measurements conducted with multiple tests in the NASA Langley normal incidence and grazing incidence impedance tubes; thus, the input parameters are assigned uncertainties commensurate with those of the measured data. These same measured data are used with the NASA Langley impedance measurement (eduction) processes to determine the corresponding measured impedance uncertainty limits, such that the predicted and measured impedance uncertainty limits (95% confidence intervals) can be compared. The measured reactance 95% confidence intervals encompass the corresponding predicted reactance confidence intervals over the frequency range of interest. The same is true for the confidence intervals of the measured and predicted resistance at near-resonance frequencies, but the predicted resistance confidence intervals are lower than the measured resistance confidence intervals (no overlap) at frequencies away from resonance. A sensitivity analysis indicates the discharge coefficient uncertainty is the major contributor to uncertainty in the predicted impedances for the perforate-over-honeycomb liner used in this study. This insight regarding the relative importance of each input parameter will be used to guide the design of experiments with test rigs currently being brought on-line at NASA Langley.
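    Monte-Carlo-type uncertainty limits of the kind described here come from randomising the model inputs simultaneously and reading off percentiles of the output. A generic sketch with a made-up two-parameter model (not the TP impedance model; distributions and values are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n = 20_000

    # Hypothetical two-parameter model standing in for the impedance prediction;
    # each input gets a measurement-like scatter about its nominal value.
    a = rng.normal(1.0, 0.05, n)              # nominal 1.0, 5% scatter (invented)
    b = rng.normal(2.0, 0.10, n)              # nominal 2.0, 5% scatter (invented)
    z = a * np.sqrt(b)                        # made-up model output

    lo, hi = np.percentile(z, [2.5, 97.5])    # 95% uncertainty limits
    ```

    Repeating the exercise while randomising one input at a time, with the rest held at nominal values, gives the per-parameter sensitivity ranking used in the study to single out the dominant contributor.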

  11. How to assess the Efficiency and "Uncertainty" of Global Sensitivity Analysis?

    NASA Astrophysics Data System (ADS)

    Haghnegahdar, Amin; Razavi, Saman

    2016-04-01

    Sensitivity analysis (SA) is an important paradigm for understanding model behavior, characterizing uncertainty, improving model calibration, etc. Conventional "global" SA (GSA) approaches are rooted in different philosophies, resulting in different, sometimes conflicting or counter-intuitive assessments of sensitivity. Moreover, most global sensitivity techniques are highly computationally demanding, because generating robust and stable sensitivity metrics over the entire model response surface requires very many model evaluations. Accordingly, a novel sensitivity analysis method called Variogram Analysis of Response Surfaces (VARS) is introduced to overcome the aforementioned issues. VARS uses the variogram concept to efficiently provide a comprehensive assessment of global sensitivity across a range of scales within the parameter space. Based on the VARS principles, in this study we present innovative ideas to assess (1) the efficiency of GSA algorithms and (2) the level of confidence we can assign to a sensitivity assessment. We use multiple hydrological models with different levels of complexity to explain the new ideas.
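    The variogram idea behind VARS can be illustrated on a one-dimensional toy response surface: the variogram gamma(h) = 0.5 * E[(y(x+h) - y(x))^2] measures response variability at lag h, so its growth with h summarizes sensitivity across scales. This is only a sketch of the concept; VARS itself works in multi-dimensional parameter spaces and integrates directional variograms across scales:

```python
import math

def response(x):
    # Illustrative 1-D response surface; a real application would run a model.
    return math.sin(3 * x) + 0.5 * x

def directional_variogram(f, h, xs):
    """Empirical variogram: gamma(h) = 0.5 * mean[(f(x+h) - f(x))^2]."""
    diffs = [(f(x + h) - f(x)) ** 2 for x in xs]
    return 0.5 * sum(diffs) / len(diffs)

xs = [i / 100 for i in range(100)]   # base points in [0, 1)
gammas = {h: directional_variogram(response, h, xs) for h in (0.05, 0.1, 0.2)}
```

    For a smooth response, gamma grows roughly quadratically at small lags; comparing variograms across parameters ranks their influence at each scale.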

  12. PC-BASED SUPERCOMPUTING FOR UNCERTAINTY AND SENSITIVITY ANALYSIS OF MODELS

    EPA Science Inventory

    Evaluating uncertainty and sensitivity of multimedia environmental models that integrate assessments of air, soil, sediments, groundwater, and surface water is a difficult task. It can be an enormous undertaking even for simple, single-medium models (i.e. groundwater only) descr...

  13. SCIENTIFIC UNCERTAINTIES IN ATMOSPHERIC MERCURY MODELS II: SENSITIVITY ANALYSIS IN THE CONUS DOMAIN

    EPA Science Inventory

    In this study, we present the response of model results to different scientific treatments in an effort to quantify the uncertainties caused by the incomplete understanding of mercury science and by model assumptions in atmospheric mercury models. Two sets of sensitivity simulati...

  14. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED, MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR FRAMES-3MRA

    EPA Science Inventory

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...

  15. The Model Optimization, Uncertainty, and SEnsitivity analysis (MOUSE) toolbox: overview and application

    Technology Transfer Automated Retrieval System (TEKTRAN)

    For several decades, optimization and sensitivity/uncertainty analysis of environmental models has been the subject of extensive research. Although much progress has been made and sophisticated methods developed, the growing complexity of environmental models to represent real-world systems makes it...

  16. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR 3MRA

    EPA Science Inventory

    Sufficiently elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The ensuing challenge of examining ever more complex, integrated, higher-ord...

  17. Sensitivity and uncertainty analysis of estimated soil hydraulic parameters for simulating soil water content

    NASA Astrophysics Data System (ADS)

    Gupta, Manika; Garg, Naveen Kumar; Srivastava, Prashant K.

    2014-05-01

    A sensitivity and uncertainty analysis was carried out for the soil hydraulic parameters (SHPs), the scalar parameters that govern the simulation of soil water content in the unsaturated soil zone. The study involves field experiments conducted under real field conditions for a wheat crop in Roorkee, India, under irrigation. Soil samples were taken over a 60 cm soil profile at 15 cm intervals in the experimental field to determine soil water retention curves (SWRCs). These experimentally determined SWRCs were used to estimate the SHPs by constrained least-squares optimization. The sensitivity of SHPs estimated by various pedotransfer functions (PTFs), which relate easily measurable soil properties such as soil texture, bulk density and organic carbon content, is compared with that of laboratory-derived parameters in simulating the respective soil water retention curves. Sensitivity analysis was carried out using Monte Carlo simulations and the one-factor-at-a-time approach. The different sets of SHPs, along with the experimentally determined saturated permeability, are then used as input parameters in a physically based root water uptake model to ascertain the uncertainties in simulating soil water content. The generalised likelihood uncertainty estimation (GLUE) procedure was subsequently used to estimate the uncertainty bounds (UB) on the model predictions. It was found that the experimentally obtained SHPs simulated the soil water contents with efficiencies of 70-80% at all depths for the three irrigation treatments. The SHPs obtained from the PTFs performed with varying degrees of uncertainty in simulating the soil water contents.
    Keywords: Sensitivity analysis, Uncertainty estimation, Pedotransfer functions, Soil hydraulic parameters, Hydrological modelling
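    The one-factor-at-a-time approach used in this record perturbs one parameter while holding the rest at base values. As a minimal sketch, a van Genuchten-style retention curve evaluated at a fixed suction stands in for the soil-water simulation; the parameter names and base values are assumptions for illustration, not the study's calibrated SHPs:

```python
def soil_water_model(params):
    # Hypothetical surrogate: van Genuchten retention at suction h = 100 cm.
    theta_r, theta_s = params["theta_r"], params["theta_s"]
    alpha, n = params["alpha"], params["n"]
    return theta_r + (theta_s - theta_r) / (1 + (alpha * 100) ** n) ** (1 - 1 / n)

def oat_sensitivity(model, base, rel_step=0.1):
    """One-factor-at-a-time: perturb each parameter by +10% and record
    the relative change in the model output."""
    y0 = model(base)
    sens = {}
    for name in base:
        perturbed = dict(base)
        perturbed[name] = base[name] * (1 + rel_step)
        sens[name] = (model(perturbed) - y0) / y0
    return sens

base = {"theta_r": 0.05, "theta_s": 0.40, "alpha": 0.02, "n": 1.5}
sens = oat_sensitivity(soil_water_model, base)
```

    Ranking the entries of `sens` by magnitude gives the OAT importance ordering; the Monte Carlo and GLUE steps then explore joint parameter variation that OAT misses.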

  18. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex, Hydrogeologic Systems

    NASA Astrophysics Data System (ADS)

    Wolfsberg, A.; Kang, Q.; Li, C.; Ruskauff, G.; Bhark, E.; Freeman, E.; Prothro, L.; Drellack, S.

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  19. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  20. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    NASA Astrophysics Data System (ADS)

    Pastore, Giovanni; Swiler, L. P.; Hales, J. D.; Novascone, S. R.; Perez, D. M.; Spencer, B. W.; Luzzi, L.; Van Uffelen, P.; Williamson, R. L.

    2015-01-01

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code with a recently implemented physics-based model for fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information in the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior predictions with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, significantly higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  1. An Example Uncertainty and Sensitivity Analysis at the Horonobe Site for Performance Assessment Calculations

    NASA Astrophysics Data System (ADS)

    James, S. C.; Makino, H.

    2004-12-01

    Given pre-existing Groundwater Modeling System (GMS) models of the Horonobe Underground Research Laboratory (URL) at both the regional and site scales, this work performs an example uncertainty analysis for performance assessment (PA) applications. After a general overview of uncertainty and sensitivity analysis techniques, the existing GMS site-scale model is converted to a PA model of the steady-state conditions expected after URL closure. This is done to examine the impact of uncertainty in site-specific data in conjunction with conceptual model uncertainty regarding the location of the Oomagari Fault. In addition, a quantitative analysis of the ratio of dispersive to advective forces, the F-ratio, is performed for stochastic realizations of each conceptual model. All analyses indicate that accurate characterization of the Oomagari Fault with respect to both location and hydraulic conductivity is critical to PA calculations. This work defines and outlines typical uncertainty and sensitivity analysis procedures and demonstrates them with example PA calculations relevant to the Horonobe URL. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  2. An example uncertainty and sensitivity analysis at the Horonobe site for performance assessment calculations.

    SciTech Connect

    James, Scott Carlton

    2004-08-01

    Given pre-existing Groundwater Modeling System (GMS) models of the Horonobe Underground Research Laboratory (URL) at both the regional and site scales, this work performs an example uncertainty analysis for performance assessment (PA) applications. After a general overview of uncertainty and sensitivity analysis techniques, the existing GMS site-scale model is converted to a PA model of the steady-state conditions expected after URL closure. This is done to examine the impact of uncertainty in site-specific data in conjunction with conceptual model uncertainty regarding the location of the Oomagari Fault. In addition, a quantitative analysis of the ratio of dispersive to advective forces, the F-ratio, is performed for stochastic realizations of each conceptual model. All analyses indicate that accurate characterization of the Oomagari Fault with respect to both location and hydraulic conductivity is critical to PA calculations. This work defines and outlines typical uncertainty and sensitivity analysis procedures and demonstrates them with example PA calculations relevant to the Horonobe URL.

  3. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    SciTech Connect

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  4. Preliminary uncertainty and sensitivity analysis for basic transport parameters at the Horonobe Site, Hokkaido, Japan.

    SciTech Connect

    James, Scott Carlton; Zimmerman, Dean Anthony

    2003-10-01

    Incorporating results from a previously developed finite element model, an uncertainty and parameter sensitivity analysis was conducted using preliminary site-specific data from Horonobe, Japan (data available from five boreholes as of 2003). Latin Hypercube Sampling was used to draw random parameter values from the site-specific measured, or approximated, physicochemical uncertainty distributions. Using pathlengths and groundwater velocities extracted from the three-dimensional, finite element flow and particle tracking model, breakthrough curves for multiple realizations were calculated with the semi-analytical, one-dimensional, multirate transport code, STAMMT-L. A stepwise linear regression analysis using the 5, 50, and 95% breakthrough times as the dependent variables and LHS sampled site physicochemical parameters as the independent variables was used to perform a sensitivity analysis. Results indicate that the distribution coefficients and hydraulic conductivities are the parameters responsible for most of the variation among simulated breakthrough times. This suggests that researchers and data collectors at the Horonobe site should focus on accurately assessing these parameters and quantifying their uncertainty. Because the Horonobe Underground Research Laboratory is in an early phase of its development, this work should be considered as a first step toward an integration of uncertainty and sensitivity analyses with decision analysis.
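    The sampling-plus-regression workflow in this record can be sketched compactly. Latin Hypercube Sampling places exactly one sample in each of n equal-probability strata per dimension; a correlation (standing in here for the abstract's stepwise linear regression, for brevity) then scores each parameter against the response. The breakthrough-time surrogate, parameter names, and ranges below are illustrative assumptions, not STAMMT-L or the Horonobe data:

```python
import math
import random

def latin_hypercube(n, ranges, seed=1):
    """Latin Hypercube Sampling: each dimension is split into n equal-probability
    strata, one sample per stratum, with strata shuffled across dimensions."""
    rng = random.Random(seed)
    samples = [[0.0] * len(ranges) for _ in range(n)]
    for j, (lo, hi) in enumerate(ranges):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)
        for i in range(n):
            samples[i][j] = lo + strata[i] * (hi - lo)
    return samples

def pearson(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a) *
                           sum((y - mb) ** 2 for y in b))

# Hypothetical response standing in for a simulated breakthrough time:
# retardation grows with the distribution coefficient Kd, and transport
# is faster at higher hydraulic conductivity K.
def log_breakthrough(kd, log10_k):
    return math.log10(1 + 5.0 * kd) - log10_k

ranges = [(0.0, 2.0), (-7.0, -5.0)]   # Kd and log10 K, illustrative ranges
x = latin_hypercube(200, ranges)
y = [log_breakthrough(kd, lk) for kd, lk in x]
r_kd = pearson([s[0] for s in x], y)
r_k = pearson([s[1] for s in x], y)
```

    Large-magnitude coefficients flag the parameters (here Kd and conductivity, echoing the abstract's finding) whose characterization most reduces predictive uncertainty.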

  5. Approach for Uncertainty Propagation and Robust Design in CFD Using Sensitivity Derivatives

    NASA Technical Reports Server (NTRS)

    Putko, Michele M.; Newman, Perry A.; Taylor, Arthur C., III; Green, Lawrence L.

    2001-01-01

    This paper presents an implementation of the approximate statistical moment method for uncertainty propagation and robust optimization for a quasi 1-D Euler CFD (computational fluid dynamics) code. Given uncertainties in statistically independent, random, normally distributed input variables, a first- and second-order statistical moment matching procedure is performed to approximate the uncertainty in the CFD output. Efficient calculation of both first- and second-order sensitivity derivatives is required. In order to assess the validity of the approximations, the moments are compared with statistical moments generated through Monte Carlo simulations. The uncertainties in the CFD input variables are also incorporated into a robust optimization procedure. For this optimization, statistical moments involving first-order sensitivity derivatives appear in the objective function and system constraints. Second-order sensitivity derivatives are used in a gradient-based search to successfully execute a robust optimization. The approximate methods used throughout the analyses are found to be valid when considering robustness about input parameter mean values.
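    The first-order part of the moment-matching method amounts to var(y) ≈ Σᵢ (∂f/∂xᵢ)² σᵢ², with derivatives evaluated at the input means. A minimal sketch, using a made-up smooth response in place of the quasi 1-D Euler code and checking the approximation against Monte Carlo:

```python
import random
import statistics

def f(x1, x2):
    # Illustrative smooth response standing in for a CFD output.
    return x1 ** 2 + 3.0 * x1 * x2

def first_order_moments(func, means, sigmas, h=1e-5):
    """First-order statistical moment matching: output mean taken at the input
    means, and var(y) ~= sum_i (df/dx_i)^2 * sigma_i^2 from central
    differences (standing in for the paper's sensitivity derivatives)."""
    m1, m2 = means
    d1 = (func(m1 + h, m2) - func(m1 - h, m2)) / (2 * h)
    d2 = (func(m1, m2 + h) - func(m1, m2 - h)) / (2 * h)
    return func(m1, m2), (d1 * sigmas[0]) ** 2 + (d2 * sigmas[1]) ** 2

mean_y, var_y = first_order_moments(f, (1.0, 2.0), (0.05, 0.05))

# Monte Carlo check of the moment approximation, as in the paper's validation
rng = random.Random(4)
samples = [f(rng.gauss(1.0, 0.05), rng.gauss(2.0, 0.05)) for _ in range(20_000)]
mc_var = statistics.pvariance(samples)
```

    As the abstract notes, the approximation holds near the input means; second-order terms (not sketched here) correct the output mean and variance when curvature matters.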

  6. Uncertainty and sensitivity analyses of ballast life-cycle cost and payback period

    SciTech Connect

    McMahon, James E.; Liu, Xiaomin; Turiel, Ike; Hakim, Sajid; Fisher, Diane

    2000-06-01

    The paper introduces an innovative methodology for evaluating the relative significance of energy-efficient technologies applied to fluorescent lamp ballasts. The method involves replacing the point estimates of life cycle cost of the ballasts with uncertainty distributions reflecting the whole spectrum of possible costs, and the assessed probability associated with each value. The results of uncertainty and sensitivity analyses will help analysts reduce effort in data collection and carry on analysis more efficiently. These methods also enable policy makers to gain an insightful understanding of which efficient technology alternatives benefit or cost what fraction of consumers, given the explicit assumptions of the analysis.
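    Replacing a point estimate of life-cycle cost with a sampled distribution can be sketched as follows. All figures (ballast prices, energy use, the electricity-price range) are invented for illustration and are not the paper's data:

```python
import random

def life_cycle_cost(purchase, annual_kwh, elec_price, years=10):
    # Simple illustrative LCC: purchase price plus undiscounted energy cost.
    return purchase + annual_kwh * elec_price * years

def lcc_savings_distribution(n=5000, seed=2):
    """Replace a point estimate with a sampled electricity-price distribution
    and return the spread of life-cycle savings of an efficient ballast
    relative to a baseline unit."""
    rng = random.Random(seed)
    savings = []
    for _ in range(n):
        elec = rng.uniform(0.06, 0.14)              # $/kWh, assumed range
        base = life_cycle_cost(10.0, 120.0, elec)   # baseline ballast
        eff = life_cycle_cost(18.0, 110.0, elec)    # pricier, saves energy
        savings.append(base - eff)                  # > 0: efficient unit wins
    return savings

savings = lcc_savings_distribution()
frac_benefit = sum(s > 0 for s in savings) / len(savings)
```

    The fraction of sampled scenarios with positive savings is exactly the kind of "which alternatives benefit what fraction of consumers" statement the abstract describes.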

  7. Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model

    DOE PAGES

    Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; Turner, Adrian Keith; Jeffery, Nicole

    2016-04-01

    Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
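    The variance-based index behind this analysis is the first-order Sobol' index Sᵢ = Var(E[Y|Xᵢ]) / Var(Y). The sketch below estimates it with a crude binning estimator on a made-up additive stand-in for the sea ice response; production analyses, as in this record, use Sobol'/Saltelli sampling schemes and emulators instead:

```python
import random
import statistics

def ice_area(snow_cond, grain_size, drainage):
    # Toy additive stand-in, dominated by the first input, echoing the
    # abstract's ranking; not the CICE model.
    return 4.0 * snow_cond + 1.0 * grain_size + 0.2 * drainage

def first_order_sobol(model, n=20_000, bins=20, seed=3):
    """Estimate S_i = Var(E[Y|X_i]) / Var(Y): bin each input, average Y
    within bins, and take the variance of those conditional means."""
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(3)] for _ in range(n)]
    ys = [model(*x) for x in xs]
    var_y = statistics.pvariance(ys)
    indices = []
    for i in range(3):
        bucket = [[] for _ in range(bins)]
        for x, y in zip(xs, ys):
            bucket[min(int(x[i] * bins), bins - 1)].append(y)
        cond_means = [statistics.fmean(b) for b in bucket if b]
        indices.append(statistics.pvariance(cond_means) / var_y)
    return indices

s = first_order_sobol(ice_area)
```

    For an additive model the first-order indices sum to roughly one; interactions, which the record estimates via generalized additive models, show up as a shortfall in that sum.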

  8. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  9. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    SciTech Connect

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; Tong, Charles H.; Sahinidis, Nikolaos V.; Miller, David C.

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  10. Inverse Sensitivity/Uncertainty Methods Development for Nuclear Fuel Cycle Applications

    NASA Astrophysics Data System (ADS)

    Arbanas, G.; Dunn, M. E.; Williams, M. L.

    2014-04-01

    The Standardized Computer Analyses for Licensing Evaluation (SCALE) software package developed at the Oak Ridge National Laboratory includes codes that propagate uncertainties available in the nuclear data libraries to compute uncertainties in nuclear application performance parameters. We report on our recent efforts to extend this capability to develop an inverse sensitivity/uncertainty (IS/U) methodology that identifies the improvements in nuclear data that are needed to compute application responses within prescribed tolerances, while minimizing the cost of such data improvements. We report on our progress to date and present a simple test case for our method. Our methodology is directly applicable to thermal and intermediate neutron energy systems because it addresses the implicit neutron resonance self-shielding effects that are essential to accurate modeling of thermal and intermediate systems. This methodology is likely to increase the efficiency of nuclear data efforts.

  11. Sensitivities and Uncertainties Related to Numerics and Building Features in Urban Modeling

    SciTech Connect

    Joseph III, Robert Anthony; Slater, Charles O; Evans, Thomas M; Mosher, Scott W; Johnson, Jeffrey O

    2011-01-01

    Oak Ridge National Laboratory (ORNL) has been engaged in the development and testing of a computational system that would use a grid of activation foil detectors to provide postdetonation forensic information from a nuclear device detonation. ORNL has developed a high-performance, three-dimensional (3-D) deterministic radiation transport code called Denovo. Denovo solves the multigroup discrete ordinates (SN) equations and can output 3-D data in a platform-independent format that can be efficiently analyzed using parallel, high-performance visualization tools. To evaluate the sensitivities and uncertainties associated with the deterministic computational method numerics, a numerical study on the New York City Times Square model was conducted using Denovo. In particular, the sensitivities and uncertainties associated with various components of the calculational method were systematically investigated, including (a) the Legendre polynomial expansion order of the scattering cross sections, (b) the angular quadrature, (c) multigroup energy binning, (d) spatial mesh sizes, (e) the material compositions of the building models, (f) the composition of the foundations upon which the buildings rest (e.g., ground, concrete, or asphalt), and (g) the amount of detail included in the building models. Although Denovo may calculate the idealized model well, there may be uncertainty in the results because of slight departures of the above-named parameters from those used in the idealized calculations. Fluxes and activities at selected locations from perturbed calculations are compared with corresponding values from the idealized or base case to determine the sensitivities associated with specified parameter changes. Results indicate that uncertainties related to numerics can be controlled by using higher fidelity models, but more work is needed to control the uncertainties related to the model.

  12. Sensitivity and first-step uncertainty analyses for the preferential flow model MACRO.

    PubMed

    Dubus, Igor G; Brown, Colin D

    2002-01-01

    Sensitivity analyses for the preferential flow model MACRO were carried out using one-at-a-time and Monte Carlo sampling approaches. Four different scenarios were generated by simulating leaching to depth of two hypothetical pesticides in a sandy loam and a more structured clay loam soil. Sensitivity of the model was assessed using the predictions for accumulated water percolated at a 1-m depth and accumulated pesticide losses in percolation. Results for simulated percolation were similar for the two soils. Predictions of water volumes percolated were found to be only marginally affected by changes in input parameters and the most influential parameter was the water content defining the boundary between micropores and macropores in this dual-porosity model. In contrast, predictions of pesticide losses were found to be dependent on the scenarios considered and to be significantly affected by variations in input parameters. In most scenarios, predictions for pesticide losses by MACRO were most influenced by parameters related to sorption and degradation. Under specific circumstances, pesticide losses can be largely affected by changes in hydrological properties of the soil. Since parameters were varied within ranges that approximated their uncertainty, a first-step assessment of uncertainty for the predictions of pesticide losses was possible. Large uncertainties in the predictions were reported, although these are likely to have been overestimated by considering a large number of input parameters in the exercise. It appears desirable that a probabilistic framework accounting for uncertainty is integrated into the estimation of pesticide exposure for regulatory purposes.

  13. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE PAGES

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; Novascone, Stephen R.; Perez, Danielle M.; Spencer, Benjamin W.; Luzzi, Lelio; Uffelen, Paul Van; Williamson, Richard L.

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  14. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    DOE PAGES

    Strydom, Gerhard

    2013-01-01

    The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameter variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC) transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS) and Latin Hypercube Sampling (LHS) data sets, and the use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.
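
Tolerance statements of the "95%/95%" kind used in studies like this are commonly based on Wilks' order-statistic formula, which fixes the number of random samples needed independently of the code being run. A sketch, with an invented linearised temperature response standing in for the PEBBED-THERMIX transient:

```python
import random

random.seed(1)

# Wilks' formula (first order, one-sided): the probability that the sample
# maximum exceeds the `coverage` quantile of the output distribution.
def wilks_confidence(n, coverage=0.95):
    return 1.0 - coverage ** n

# Smallest n giving 95% confidence for 95% coverage (classically n = 59).
n = 1
while wilks_confidence(n) < 0.95:
    n += 1
print("samples needed for one-sided 95/95:", n)

# Invented stand-in for a DLOFC transient: peak fuel temperature as a linear
# function of two sampled input factors (decay heat and conductivity).
def peak_fuel_temperature(decay_heat_factor, conductivity_factor):
    return 1450.0 + 300.0 * (decay_heat_factor - 1.0) \
                  - 200.0 * (conductivity_factor - 1.0)

samples = [peak_fuel_temperature(random.gauss(1.0, 0.05), random.gauss(1.0, 0.1))
           for _ in range(n)]
print("95/95 upper tolerance bound on peak fuel temperature:", round(max(samples), 1))
```

With 59 runs the sample maximum is, with at least 95% confidence, an upper bound on the 95th percentile of the output, which is why sample sizes near 59 (or multiples of it) recur in nuclear uncertainty analyses.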

  15. pH Sensitive Polymers for Improving Reservoir Sweep and Conformance Control in Chemical Flooding

    SciTech Connect

    Mukul Sharma; Steven Bryant; Chun Huh

    2008-03-31

    viscoelastic behavior as functions of pH; shear rate; polymer concentration; salinity, including divalent ion effects; polymer molecular weight; and degree of hydrolysis. A comprehensive rheological model was developed for HPAM solution rheology in terms of shear rate, pH, polymer concentration, and salinity, so that the spatial and temporal changes in viscosity during polymer flow in the reservoir can be accurately modeled. A series of acid coreflood experiments was conducted to understand the geochemical reactions relevant for both near-wellbore injection profile control and conformance control applications. These experiments showed that the use of hydrochloric acid as a pre-flush is not viable because of its high reaction rate with the rock. The use of citric acid as a pre-flush was found to be quite effective. This weak acid has a slow rate of reaction with the rock and can buffer the pH below 3.5 for extended periods of time. With the citric acid pre-flush, the polymer could be efficiently propagated through the core in a low-pH environment, i.e., at a low viscosity. The transport of various HPAM solutions was studied in sandstones, in terms of permeability reduction, mobility reduction, adsorption, and inaccessible pore volume with different process variables: injection pH, polymer concentration, polymer molecular weight, salinity, degree of hydrolysis, and flow rate. Measurements of polymer effluent profiles and tracer tests show that polymer retention increases at lower pH. A new simulation capability to model deep-penetrating mobility control or conformance control using pH-sensitive polymers was developed. The coreflood acid injection experiments were history matched to estimate geochemical reaction rates. Preliminary scale-up simulations employing linear and radial geometry floods in 2-layer reservoir models were conducted.
It is clearly shown that the injection rate of pH-sensitive polymer solutions can be significantly increased by injecting

  16. A Variance Decomposition Approach to Uncertainty Quantification and Sensitivity Analysis of the J&E Model

    PubMed Central

    Moradi, Ali; Tootkaboni, Mazdak; Pennell, Kelly G.

    2015-01-01

    The Johnson and Ettinger (J&E) model is the most widely used vapor intrusion model in the United States. It is routinely used as part of hazardous waste site assessments to evaluate the potential for vapor intrusion exposure risks. This study incorporates mathematical approaches that allow sensitivity and uncertainty of the J&E model to be evaluated. In addition to performing Monte Carlo simulations to examine the uncertainty in the J&E model output, a powerful global sensitivity analysis technique based on Sobol indices is used to evaluate J&E model sensitivity to variations in the input parameters. The results suggest that the J&E model is most sensitive to the building air exchange rate, regardless of soil type and source depth. Building air exchange rate is not routinely measured during vapor intrusion investigations, but clearly improved estimates and/or measurements of the air exchange rate would lead to improved model predictions. It is also found that the J&E model is more sensitive to effective diffusivity than to effective permeability. Field measurements of effective diffusivity are not commonly collected during vapor intrusion investigations; however, consideration of this parameter warrants additional attention. Finally, the effects of input uncertainties on model predictions for different scenarios (e.g., sandy soil as compared to clayey soil, and "shallow" sources as compared to "deep" sources) are evaluated. Our results not only identify the range of variability to be expected depending on the scenario at hand, but also mark the important cases where special care is needed when estimating the input parameters to which the J&E model is most sensitive. PMID:25947051
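
A first-order Sobol index of the kind used in this study can be estimated with a simple pick-freeze Monte Carlo scheme. The two-parameter "model" below is a caricature of vapor intrusion (a source term divided by an air exchange rate), invented purely to exercise the estimator; only the estimator itself is the point.

```python
import random

random.seed(2)

# Caricature of a vapor intrusion response: indoor impact ~ source strength
# divided by building air exchange rate. Invented for illustration only.
def model(x):
    source, exchange_rate = x
    return source / exchange_rate

def sample():
    return [random.uniform(0.5, 1.5),   # source strength (arbitrary units)
            random.uniform(0.2, 2.0)]   # air exchange rate (1/h)

# Pick-freeze estimator of the first-order Sobol index for input i:
# S_i ~= Cov(Y, Y_i) / Var(Y), where Y_i reuses only the i-th input of Y.
def sobol_first_order(i, n=20000):
    ya, yb = [], []
    for _ in range(n):
        a, b = sample(), sample()
        b[i] = a[i]                     # freeze coordinate i
        ya.append(model(a))
        yb.append(model(b))
    mean = sum(ya) / n
    var = sum((y - mean) ** 2 for y in ya) / n
    cov = sum((p - mean) * (q - mean) for p, q in zip(ya, yb)) / n
    return cov / var

s_source = sobol_first_order(0)
s_exchange = sobol_first_order(1)
print(f"S_source   = {s_source:.2f}")
print(f"S_exchange = {s_exchange:.2f}")   # the air exchange rate dominates
```

In this toy setup the exchange rate carries most of the output variance, mirroring the paper's finding that the J&E model is most sensitive to the building air exchange rate.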

  17. A variance decomposition approach to uncertainty quantification and sensitivity analysis of the Johnson and Ettinger model.

    PubMed

    Moradi, Ali; Tootkaboni, Mazdak; Pennell, Kelly G

    2015-02-01

    The Johnson and Ettinger (J&E) model is the most widely used vapor intrusion model in the United States. It is routinely used as part of hazardous waste site assessments to evaluate the potential for vapor intrusion exposure risks. This study incorporates mathematical approaches that allow sensitivity and uncertainty of the J&E model to be evaluated. In addition to performing Monte Carlo simulations to examine the uncertainty in the J&E model output, a powerful global sensitivity analysis technique based on Sobol indices is used to evaluate J&E model sensitivity to variations in the input parameters. The results suggest that the J&E model is most sensitive to the building air exchange rate, regardless of soil type and source depth. Building air exchange rate is not routinely measured during vapor intrusion investigations, but clearly improved estimates and/or measurements of the air exchange rate would lead to improved model predictions. It is also found that the J&E model is more sensitive to effective diffusivity than to effective permeability. Field measurements of effective diffusivity are not commonly collected during vapor intrusion investigations; however, consideration of this parameter warrants additional attention. Finally, the effects of input uncertainties on model predictions for different scenarios (e.g., sandy soil as compared to clayey soil, and "shallow" sources as compared to "deep" sources) are evaluated. Our results not only identify the range of variability to be expected depending on the scenario at hand, but also mark the important cases where special care is needed when estimating the input parameters to which the J&E model is most sensitive.

  19. Wavelet-Monte Carlo Hybrid System for HLW Nuclide Migration Modeling and Sensitivity and Uncertainty Analysis

    SciTech Connect

    Nasif, Hesham; Neyama, Atsushi

    2003-02-26

    This paper presents results of an uncertainty and sensitivity analysis for the performance of the different barriers of high level radioactive waste repositories. SUA is a tool to perform uncertainty and sensitivity analysis on the output of the Wavelet Integrated Repository System model (WIRS), which is developed to solve a system of nonlinear partial differential equations arising from the model formulation of radionuclide transport through the repository. SUA performs sensitivity analysis (SA) and uncertainty analysis (UA) on a sample output from Monte Carlo simulation. The sample is generated by WIRS and contains the output values of the maximum release rate in the form of time series, together with the values of the input variables for a set of different simulations (runs), which are realized by varying the model input parameters. The Monte Carlo sample is generated with SUA as a pure random sample or using the Latin hypercube sampling technique. Tchebycheff and Kolmogorov confidence bounds are computed on the maximum release rate for UA, and effective non-parametric statistics are used to rank the influence of the model input parameters for SA. Based on the results, we point out parameters that have primary influences on the performance of the engineered barrier system of a repository. The parameters found to be key contributors to the release rate are the selenium and cesium distribution coefficients in both the geosphere and the major water conducting fault (MWCF), the diffusion depth, and the water flow rate in the excavation-disturbed zone (EDZ).
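
The Tchebycheff (Chebyshev) bound used here for UA is attractive precisely because it is distribution-free: it holds for any output distribution with finite variance. A sketch on a synthetic Monte Carlo sample of release rates (the sample itself is invented):

```python
import math
import random
import statistics

random.seed(3)

# Distribution-free (Tchebycheff) bound: at least 1 - 1/k**2 of any
# distribution lies within k standard deviations of the mean.
def chebyshev_upper_bound(sample, confidence=0.95):
    k = math.sqrt(1.0 / (1.0 - confidence))   # solve 1 - 1/k^2 = confidence
    return statistics.mean(sample) + k * statistics.stdev(sample)

# Synthetic Monte Carlo sample of maximum release rates (lognormal-ish,
# purely illustrative -- not WIRS output).
release = [math.exp(random.gauss(-2.0, 0.5)) for _ in range(500)]

bound = chebyshev_upper_bound(release)
frac_below = sum(r <= bound for r in release) / len(release)
print(f"95% Tchebycheff upper bound: {bound:.3f}")
print(f"fraction of sample below bound: {frac_below:.3f}")
```

The price of making no distributional assumption is conservatism: for a well-behaved sample the Chebyshev bound sits well above the empirical 95th percentile.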

  20. Uncertainty and sensitivity analysis of biokinetic models for radiopharmaceuticals used in nuclear medicine.

    PubMed

    Li, W B; Hoeschen, C

    2010-01-01

    Mathematical models for the kinetics of radiopharmaceuticals in humans were developed and are used to estimate the radiation absorbed dose for patients in nuclear medicine by the International Commission on Radiological Protection and the Medical Internal Radiation Dose (MIRD) Committee. However, because the residence times used were derived from different subjects, partly even with different ethnic backgrounds, a large variation in the model parameters propagates to a high uncertainty in the dose estimation. In this work, a method was developed for analysing the uncertainty and sensitivity of the biokinetic models that are used to calculate the residence times. The biokinetic model of (18)F-FDG (FDG) developed by the MIRD Committee was analysed by this method. The sources of uncertainty of all model parameters were evaluated based on the experiments. The Latin hypercube sampling technique was used to sample the parameters for model input. Kinetic modelling of FDG in humans was performed. Sensitivity of the model parameters was indicated by combining the model input and output, using regression and partial correlation analysis. The transfer rate parameter from plasma to the fast compartment of other tissue is the parameter with the greatest influence on the residence time of plasma. Optimisation of biokinetic data acquisition in clinical practice by exploitation of the parameter sensitivities obtained in this study is discussed. PMID:20185457
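
Ranking parameters by combining sampled model inputs with outputs through correlation statistics, as done here, can be illustrated with a toy kinetic surrogate. The compartment structure and rate constants below are hypothetical stand-ins, not the MIRD FDG model:

```python
import random
import statistics

random.seed(4)

def pearson(xs, ys):
    mx, my = statistics.mean(xs), statistics.mean(ys)
    sx, sy = statistics.stdev(xs), statistics.stdev(ys)
    n = len(xs)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / ((n - 1) * sx * sy)

# Hypothetical surrogate: plasma residence time shrinks as the plasma-to-
# tissue transfer rate k_pt grows; the elimination rate matters less here.
def plasma_residence_time(k_pt, k_elim):
    return 1.0 / (k_pt + 0.2 * k_elim)

n = 2000
k_pt = [random.uniform(0.5, 2.0) for _ in range(n)]
k_elim = [random.uniform(0.5, 2.0) for _ in range(n)]
rt = [plasma_residence_time(a, b) for a, b in zip(k_pt, k_elim)]

# Rank inputs by |correlation| with the output -- a simple stand-in for the
# regression / partial-correlation analysis described in the abstract.
for name, xs in [("k_plasma_to_tissue", k_pt), ("k_elimination", k_elim)]:
    print(f"{name:>20s}: r = {pearson(xs, rt):+.2f}")
```

Because the surrogate is monotone in both rates, plain Pearson correlation already yields the right ranking; for non-monotone models, rank transforms or partial correlation coefficients (as in the paper) are the usual refinement.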

  1. Physical characterization of explosive volcanic eruptions based on tephra deposits: Propagation of uncertainties and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Bonadonna, Costanza; Biass, Sébastien; Costa, Antonio

    2015-04-01

    Despite the recent advances in geophysical monitoring and real-time quantitative observations of explosive volcanic eruptions, the characterization of tephra deposits remains one of the largest sources of information on Eruption Source Parameters (ESPs) (i.e. plume height, erupted volume/mass, Mass Eruption Rate - MER, eruption duration, Total Grain-Size Distribution - TGSD). ESPs are crucial for the characterization of volcanic systems and for the compilation of comprehensive hazard scenarios but are naturally associated with various degrees of uncertainty that are traditionally not well quantified. Recent studies have highlighted the uncertainties associated with the estimation of ESPs, mostly related to: i) the intrinsic variability of the natural system, ii) the observational error, and iii) the strategies used to determine physical parameters. Here we review recent studies focused on the characterization of these uncertainties, present a sensitivity analysis for the determination of ESPs, and systematically investigate the propagation of uncertainty in two case studies. In particular, we highlight the dependence of ESPs on specific observations used as input parameters (i.e. diameter of the largest clasts, thickness measurements, area of isopach contours, deposit density, downwind and crosswind range of isopleth maps, and empirical constants and wind speed for the determination of MER). The highest uncertainty is associated with the estimation of MER and eruption duration and is related to the determination of the crosswind range of isopleth maps and the empirical constants used in the parameterization relating MER and plume height. Given the exponential nature of the relation between MER and plume height, the propagation of uncertainty is not symmetrical, and both an underestimation of the empirical constant and an overestimation of plume height have the highest impact on the final outcome.
A ± 20% uncertainty on thickness

  2. Incorporating uncertainty of management costs in sensitivity analyses of matrix population models.

    PubMed

    Salomon, Yacov; McCarthy, Michael A; Taylor, Peter; Wintle, Brendan A

    2013-02-01

    The importance of accounting for economic costs when making environmental-management decisions subject to resource constraints has been increasingly recognized in recent years. In contrast, uncertainty associated with such costs has often been ignored. We developed a method, on the basis of economic theory, that accounts for the uncertainty in population-management decisions. We considered the case where, rather than taking fixed values, model parameters are random variables that represent the situation when parameters are not precisely known. Hence, the outcome is not precisely known either. Instead of maximizing the expected outcome, we maximized the probability of obtaining an outcome above a threshold of acceptability. We derived explicit analytical expressions for the optimal allocation and its associated probability, as a function of the threshold of acceptability, where the model parameters were distributed according to normal and uniform distributions. To illustrate our approach we revisited a previous study that incorporated cost-efficiency analyses in management decisions that were based on perturbation analyses of matrix population models. Incorporating derivations from this study into our framework, we extended the model to address potential uncertainties. We then applied these results to 2 case studies: management of a Koala (Phascolarctos cinereus) population and conservation of an olive ridley sea turtle (Lepidochelys olivacea) population. For low aspirations, that is, when the threshold of acceptability is relatively low, the optimal strategy was obtained by diversifying the allocation of funds. Conversely, for high aspirations, the budget was directed toward management actions with the highest potential effect on the population. The exact optimal allocation was sensitive to the choice of uncertainty model. Our results highlight the importance of accounting for uncertainty when making decisions and suggest that more effort should be placed on

  3. Is the Smagorinsky coefficient sensitive to uncertainty in the form of the energy spectrum?

    NASA Astrophysics Data System (ADS)

    Meldi, M.; Lucor, D.; Sagaut, P.

    2011-12-01

    We investigate the influence of uncertainties in the shape of the energy spectrum on the Smagorinsky ["General circulation experiments with the primitive equations. I: The basic experiment," Mon. Weather Rev. 91(3), 99 (1963)] subgrid scale model constant CS: the analysis is carried out by a stochastic approach based on generalized polynomial chaos. The free parameters in the considered energy spectrum functional forms are modeled as random variables over bounded supports: two models of the energy spectrum are investigated, namely, the functional forms proposed by Pope [Turbulent Flows (Cambridge University Press, Cambridge, 2000)] and by Meyers and Meneveau ["A functional form for the energy spectrum parametrizing bottleneck and intermittency effects," Phys. Fluids 20(6), 065109 (2008)]. The Smagorinsky model coefficient, computed from the algebraic relation presented in a recent work by Meyers and Sagaut ["On the model coefficients for the standard and the variational multi-scale Smagorinsky model," J. Fluid Mech. 569, 287 (2006)], is considered as a stochastic process and is described by numerical tools stemming from probability theory. The uncertainties are introduced in the free parameters shaping the energy spectrum at the large and the small scales, respectively. The predicted model constant is weakly sensitive to the shape of the energy spectrum when large-scale uncertainty is considered: if the large-eddy simulation (LES) filter cut is performed in the inertial range, there is a significant probability of recovering values lower in magnitude than the asymptotic Lilly-Smagorinsky model constant. Furthermore, the predicted model constant occurrences cluster in a compact range of values: the corresponding probability density function rapidly drops to zero approaching the extreme values of the range, which show a significant sensitivity to the LES filter width.
The sensitivity of the model constant to uncertainties propagated in the
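
For a single uniform random parameter, generalized polynomial chaos reduces to projection onto Legendre polynomials. A minimal sketch, with an invented smooth response standing in for the model-constant calculation (the coefficients 0.17, 0.03, 0.01 are arbitrary):

```python
import math

# 5-point Gauss-Legendre nodes/weights on [-1, 1] (standard tabulated values).
nodes = [-0.9061798459, -0.5384693101, 0.0, 0.5384693101, 0.9061798459]
weights = [0.2369268851, 0.4786286705, 0.5688888889, 0.4786286705, 0.2369268851]

def legendre(n, x):
    # Three-term recurrence: (k+1) P_{k+1} = (2k+1) x P_k - k P_{k-1}.
    if n == 0:
        return 1.0
    p0, p1 = 1.0, x
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

# Invented response: a model constant as a smooth function of one uncertain
# spectrum parameter xi ~ U(-1, 1). Illustrative only.
def response(xi):
    return 0.17 + 0.03 * xi + 0.01 * xi ** 2

# Spectral projection: c_n = (2n+1)/2 * integral of response * P_n over [-1,1].
coeffs = []
for order in range(4):
    integral = sum(w * response(x) * legendre(order, x)
                   for x, w in zip(nodes, weights))
    coeffs.append((2 * order + 1) / 2.0 * integral)

# Mean and variance follow directly from the chaos coefficients.
mean = coeffs[0]
variance = sum(c ** 2 / (2 * k + 1) for k, c in enumerate(coeffs) if k > 0)
print(f"PCE mean = {mean:.5f}, std = {math.sqrt(variance):.5f}")
```

Since the response is a quadratic, the degree-3 coefficient vanishes and the 5-point quadrature is exact, so the chaos expansion reproduces the mean and variance without any Monte Carlo sampling.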

  4. Uncertainty and sensitivity assessments of an agricultural-hydrological model (RZWQM2) using the GLUE method

    NASA Astrophysics Data System (ADS)

    Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin

    2016-03-01

    Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models, owing to the many uncertainties in their inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model output responses and the sensitivity of 25 parameters related to soil properties, nutrient transport, and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, a combined likelihood (CL) function integrating information on water, nitrogen, and crop production was introduced in the GLUE analysis for the predictions of the following four model output responses: the total soil water content (T-SWC) and nitrate nitrogen (T-NIT) within the 1-m soil profile, and the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). In the process of evaluating RZWQM2, measurements and meteorological data were obtained from a field experiment involving a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that the RZWQM2 model can be used to simulate crop growth and water-nitrogen migration and transformation in a wheat-maize rotation system. The results of the uncertainty analysis using the GLUE method showed that T-NIT was sensitive to parameters related to the nitrification coefficient, maize growth characteristics during the seedling period, the wheat vernalization period, and the wheat photoperiod. Parameters for soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in the crop yield components. The prediction errors for RZWQM2 outputs with the CL function were relatively lower and more uniform compared with other likelihood functions composed of individual calibration criteria.
This
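
The GLUE procedure referenced above has a simple core: sample parameter sets, score each set with a likelihood measure, discard "non-behavioural" sets below a threshold, and form likelihood-weighted predictions. A minimal sketch with an invented linear model and a crude likelihood score (RZWQM2 itself is far more complex):

```python
import random

random.seed(5)

observed = 10.0   # a single invented observation

# Invented two-parameter "model" standing in for a simulator run.
def simulate(theta):
    return 4.0 * theta[0] + 2.0 * theta[1]

# GLUE loop: sample, score, keep behavioural sets above the threshold.
behavioural = []
for _ in range(5000):
    theta = [random.uniform(0.0, 3.0), random.uniform(0.0, 3.0)]
    pred = simulate(theta)
    likelihood = max(0.0, 1.0 - abs(pred - observed) / observed)  # crude score
    if likelihood > 0.7:                                          # threshold
        behavioural.append((likelihood, pred))

# Likelihood-weighted prediction from the retained (behavioural) sets.
total = sum(w for w, _ in behavioural)
weighted_mean = sum(w * p for w, p in behavioural) / total
print(f"behavioural sets: {len(behavioural)}, weighted prediction: {weighted_mean:.2f}")
```

Both the likelihood measure and the behavioural threshold are subjective choices in GLUE, which is exactly why the paper's combined likelihood function (merging water, nitrogen, and yield criteria) matters for avoiding one-sided calibration.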

  5. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    NASA Technical Reports Server (NTRS)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.; Boote, Kenneth J.; Thorburn, Peter J.; Kersebaum, Kurt Christian; Aggarwal, Pramod K.; Angulo, Carlos; Basso, Bruno; Bertuzzi, Patrick; Biernath, Christian; Brisson, Nadine; Challinor, Andrew J.; Doltra, Jordi; Gayler, Sebastian; Goldberg, Richie; Heng, Lee; Steduto, Pasquale

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency in using water to ensure crop productivity. Quantifying crop water use (WU; i.e. actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration, and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration, and soil evaporation was due to differences in how crop transpiration was modelled, and accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to reduced growing seasons. The uncertainties in simulated crop WU, and in particular those due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperatures and with high temperatures in combination with elevated atmospheric [CO2] concentrations. Hence the simulation of crop WU, and in particular crop transpiration under higher temperatures, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.

  6. Sensitivity Analysis and Uncertainty Characterization of Subnational Building Energy Demand in an Integrated Assessment Model

    NASA Astrophysics Data System (ADS)

    Scott, M. J.; Daly, D.; McJeon, H.; Zhou, Y.; Clarke, L.; Rice, J.; Whitney, P.; Kim, S.

    2012-12-01

    Residential and commercial buildings are a major source of energy consumption and carbon dioxide emissions in the United States, accounting for 41% of energy consumption and 40% of carbon emissions in 2011. Integrated assessment models (IAMs) historically have been used to estimate the impact of energy consumption on greenhouse gas emissions at the national and international level. Increasingly they are being asked to evaluate mitigation and adaptation policies that have a subnational dimension. In the United States, for example, building energy codes are adopted and enforced at the state and local level. Adoption of more efficient appliances and building equipment is sometimes directed or actively promoted by subnational governmental entities for mitigation or adaptation to climate change. The presentation reports on new example results from the Global Change Assessment Model (GCAM) IAM, one of a flexibly-coupled suite of models of human and earth system interactions known as the integrated Regional Earth System Model (iRESM) system. iRESM can evaluate subnational climate policy in the context of the important uncertainties represented by national policy and the earth system. We have added a 50-state detailed U.S. building energy demand capability to GCAM that is sensitive to national climate policy, technology, regional population and economic growth, and climate. We are currently using GCAM in a prototype stakeholder-driven uncertainty characterization process to evaluate regional climate mitigation and adaptation options in a 14-state pilot region in the U.S. upper Midwest. The stakeholder-driven decision process involves several steps, beginning with identifying policy alternatives and decision criteria based on stakeholder outreach, identifying relevant potential uncertainties, then performing sensitivity analysis, characterizing the key uncertainties from the sensitivity analysis, and propagating and quantifying their impact on the relevant decisions. 
In the

  7. The Lower Uncertainty Bound of Climate Sensitivity in Gcms: How Low Can We Go?...

    NASA Astrophysics Data System (ADS)

    Millar, R.; Sparrow, S.; Sexton, D.; Lowe, J. A.; Ingram, W.; Allen, M. R.

    2014-12-01

    The equilibrium climate sensitivity (ECS) is one of the most important metrics of climate change. As such, constraining the uncertainties of its magnitude, and the magnitude of its transient counterpart (TCR), is one of the primary goals of global climate science. General circulation models (GCMs) from modelling centres around the world have consistently failed to produce a model with a sensitivity of less than 2 degrees. However, as the CMIP5 multi-model ensemble is an ensemble of opportunity, it is unclear whether this fact is sufficient to rule out climate sensitivity of less than 2 degrees, or whether the ensemble is simply not diverse enough to sample low values of climate sensitivity. We present analysis based on the observed planetary energy budget and simple energy-balance models. When viewed in terms of the TCR:ECS ratio (RWF, the Realised Warming Fraction), we find a region of climate response space with low RWF and low TCR that is robust to the structure of the simple climate model and is not sampled by the CMIP5 ensemble. We show that this region is better sampled by a perturbed physics ensemble of the HadCM3 GCM constrained solely on top-of-atmosphere radiative fluxes than by the CMIP5 ensemble, raising the question of the physical plausibility of low climate sensitivity GCMs. Based on these results, we have set out to systematically probe the ability to create GCMs with low climate sensitivity in the HadCM3 GCM. We train a statistical emulator on our perturbed physics ensemble and use it to identify regions of HadCM3 parameter space that are consistent with both a low climate sensitivity and a low RWF. We then run this "low sensitivity" ensemble to test our predictions and understand the combination of feedbacks needed to produce a sensible GCM with a sensitivity of less than 2 degrees.
Here we hope to demonstrate our results from this systematic probing of the low climate sensitivity uncertainty bound and add further understanding to the physical plausibility

  8. Parameter uncertainty, sensitivity, and sediment coupling in bioenergetics-based food web models

    SciTech Connect

    Barron, M.G.; Cacela, D.; Beltman, D.

    1995-12-31

    A bioenergetics-based food web model was developed and calibrated using measured PCB water and sediment concentrations in two Great Lakes food webs: Green Bay, Michigan and Lake Ontario. The model incorporated functionally based trophic levels and sediment, water, and food chain exposures of PCBs to aquatic biota. Sensitivity analysis indicated that the parameters with the greatest influence on PCBs in top predators were the lipid content of plankton and benthos, planktivore assimilation efficiency, Kow, prey selection, and ambient temperature. Sediment-associated PCBs were estimated to contribute over 90% of PCBs in benthivores and less than 50% in piscivores. Ranges of PCB concentrations in top predators estimated by Monte Carlo simulation incorporating parameter uncertainty were within one order of magnitude of modal values. Model applications include estimation of exceedences of human and ecological thresholds. The results indicate that point estimates from bioenergetics-based food web models have substantial uncertainty that should be considered in regulatory and scientific applications.
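
Monte Carlo propagation of parameter uncertainty of the kind reported here can be sketched with a drastically simplified bioaccumulation chain. The factors, medians, and geometric standard deviations below are invented for illustration, not calibrated values from the study:

```python
import math
import random
import statistics

random.seed(6)

# Draw from a lognormal specified by median and geometric standard deviation.
def draw_lognormal(median, gsd):
    return median * math.exp(random.gauss(0.0, math.log(gsd)))

# Hypothetical chain: predator concentration = water concentration
# * plankton bioaccumulation factor * food-chain magnification factor.
water_pcb = 0.5e-6          # mg/L, treated as known here
predator = []
for _ in range(10000):
    plankton_baf = draw_lognormal(2.0e5, 1.5)    # L/kg (invented)
    magnification = draw_lognormal(10.0, 1.3)    # dimensionless (invented)
    predator.append(water_pcb * plankton_baf * magnification)

qs = statistics.quantiles(predator, n=20)        # 19 cut points: 5%, 10%, ...
q05, q50, q95 = qs[0], qs[9], qs[18]
print(f"predator PCB mg/kg: median {q50:.2f}, 90% interval [{q05:.2f}, {q95:.2f}]")
print("95th/5th percentile ratio:", round(q95 / q05, 1))
```

Multiplying two modestly uncertain lognormal factors already spreads the prediction over roughly a factor of five, consistent with the abstract's point that Monte Carlo ranges stay within about one order of magnitude of modal values.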

  9. Influences of parameter uncertainties within the ICRP-66 respiratory tract model: a parameter sensitivity analysis.

    PubMed

    Huston, Thomas E; Farfán, Eduardo B; Bolch, W Emmett; Bolch, Wesley E

    2003-11-01

    An important aspect in model uncertainty analysis is the evaluation of input parameter sensitivities with respect to model outcomes. In previous publications, parameter uncertainties were examined for the ICRP-66 respiratory tract model. The studies were aided by the development and use of a computer code, LUDUC (Lung Dose Uncertainty Code), which allows probability density functions to be specified for all ICRP-66 model input parameters. These density functions are sampled using Latin hypercube techniques, with values subsequently propagated through the ICRP-66 model. In the present study, LUDUC has been used to perform a detailed parameter sensitivity analysis of the ICRP-66 model using input parameter density functions specified in previously published articles. The results suggest that most of the variability in the dose to a given target region is explained by only a few input parameters. For example, for particle diameters between 0.1 and 50 microm, about 50% of the variability in the total lung dose (weighted sum of target tissue doses) for 239PuO2 is due to variability in the dose to the alveolar-interstitial (AI) region. In turn, almost 90% of the variability in the dose to the AI region is attributable to uncertainties in only four parameters of the model: the ventilation rate, the AI deposition fraction, the clearance rate constant for slow-phase absorption of deposited material to the blood, and the clearance rate constant for particle transport from the AI2 to the bb1 compartment. A general conclusion is that many input parameters do not significantly influence variability in final doses. As a result, future research can focus on improving density functions for those input variables that contribute the most to variability in final dose values. PMID:14571988
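
Latin hypercube sampling, the sampling scheme used by LUDUC, is straightforward to implement: stratify each dimension into n equal-probability bins, draw one point per bin, and shuffle the bin assignments independently across dimensions. A minimal sketch on the unit hypercube:

```python
import random

random.seed(7)

# Minimal Latin hypercube sampler on [0, 1)^d: one stratum per sample in each
# dimension, with strata shuffled independently across dimensions.
def latin_hypercube(n_samples, n_dims):
    sample = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        random.shuffle(strata)
        for i in range(n_samples):
            lo = strata[i] / n_samples                 # lower edge of stratum
            sample[i][d] = lo + random.random() / n_samples
    return sample

pts = latin_hypercube(10, 4)

# The defining property: each dimension has exactly one point per decile.
for d in range(4):
    deciles = sorted(int(p[d] * 10) for p in pts)
    assert deciles == list(range(10))
print("10-sample LHS in 4 dimensions, all points in [0, 1):",
      all(0.0 <= v < 1.0 for p in pts for v in p))
```

In practice each uniform coordinate is then mapped through the inverse CDF of the parameter's density function, which is how stratification in probability space carries over to arbitrary input distributions.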

  10. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    USGS Publications Warehouse

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    adopted in the loss calculations. This is a sensitivity study aimed at future regional earthquake source modelers, so that they may be informed of the effects on loss introduced by modeling assumptions and epistemic uncertainty in the WG02 earthquake source model.

  11. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
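
The Monte Carlo weight-perturbation step at the core of such an analysis can be sketched as follows. The criteria scores, baseline weights, and perturbation scale are hypothetical, and a weighted linear combination stands in for the full AHP/OWA machinery:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical normalized criteria scores: 5 map units (rows) x 3 criteria (columns),
# with AHP-style baseline weights summing to 1.
scores = rng.random((5, 3))
w0 = np.array([0.5, 0.3, 0.2])

# Monte Carlo: perturb the weights, renormalize, recompute the weighted-sum
# susceptibility, and record the resulting rank of each map unit.
n = 2000
ranks = np.empty((n, 5), dtype=int)
for i in range(n):
    w = np.clip(w0 + rng.normal(0.0, 0.05, 3), 1e-9, None)
    w /= w.sum()
    s = scores @ w
    ranks[i] = np.argsort(np.argsort(-s))   # rank 0 = most susceptible

# Spatially-explicit uncertainty: how often each unit keeps its baseline rank.
base = np.argsort(np.argsort(-(scores @ w0)))
stability = (ranks == base).mean(axis=0)
print(stability)
```

Units with low rank stability are the ones whose susceptibility classification is most sensitive to the criteria weights.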

  12. Methods in Use for Sensitivity Analysis, Uncertainty Evaluation, and Target Accuracy Assessment

    SciTech Connect

    G. Palmiotti; M. Salvatores; G. Aliberti

    2007-10-01

    Sensitivity coefficients can be used for different objectives, such as uncertainty estimates, design optimization, determination of target accuracy requirements, adjustment of input parameters, and evaluations of the representativity of an experiment with respect to a reference design configuration. In this paper the theory, based on the adjoint approach, that is implemented in the ERANOS fast reactor code system is presented, along with some unique tools and features related to specific types of problems, as is the case for nuclide transmutation, reactivity loss during the cycle, decay heat, the neutron source associated with fuel fabrication, and experiment representativity.
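
For the uncertainty-estimation objective, sensitivity coefficients are combined with a covariance matrix through the standard "sandwich rule". A minimal numerical sketch, with illustrative values rather than ERANOS output:

```python
import numpy as np

# Relative sensitivity coefficients S_i = (p_i / R) dR/dp_i of a response R to
# three input parameters (illustrative values), and a relative covariance
# matrix describing the parameter uncertainties and their correlations.
S = np.array([0.8, -0.3, 0.1])
cov_rel = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.09, 0.00],
                    [0.00, 0.00, 0.01]])

# Sandwich rule: relative variance of the response = S . Cov . S^T.
var_rel = S @ cov_rel @ S
print(np.sqrt(var_rel))   # relative standard deviation of R
```

The same machinery, with the covariance replaced by a target accuracy, drives target accuracy assessment and data adjustment.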

  13. Advanced Simulation Capability for Environmental Management (ASCEM): Developments in Uncertainty Quantification and Sensitivity Analysis.

    NASA Astrophysics Data System (ADS)

    McKinney, S. W.

    2015-12-01

    Effectiveness of uncertainty quantification (UQ) and sensitivity analysis (SA) has been improved in ASCEM by choosing from a variety of methods to best suit each model. Previously, ASCEM had a small toolset for UQ and SA, leaving out the benefits of the many methods it did not include. Many UQ and SA methods are useful only for models with specific characteristics; therefore, programming all of these methods into ASCEM directly would have been inefficient. Embedding the R programming language into ASCEM grants access to a plethora of UQ and SA methods. As a result, the programming effort required is drastically decreased, and runtime efficiency and analysis effectiveness are increased relative to each unique model.

  14. Evaluation of Uncertainty and Sensitivity in Environmental Modeling at a Radioactive Waste Management Site

    NASA Astrophysics Data System (ADS)

    Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.

    2002-05-01

    Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. For example, sensitivity analysis provides a guide for analyzing the value of collecting more

  15. Evaluating the Hydrologic Sensitivities of Three Land Surface Models to Bound Uncertainties in Runoff Projections

    NASA Astrophysics Data System (ADS)

    Chiao, T.; Nijssen, B.; Stickel, L.; Lettenmaier, D. P.

    2013-12-01

    Hydrologic modeling is often used to assess the potential impacts of climate change on water availability and quality. A common approach in these studies is to calibrate the selected model(s) to reproduce historic stream flows prior to the application of future climate projections. This approach relies on the implicit assumptions that the sensitivities of these models to meteorological fluctuations will remain relatively constant under climate change and that these sensitivities are similar among models if all models are calibrated to the same historic record. However, even if the models are able to capture the historic variability in hydrological variables, differences in model structure and parameter estimation contribute to the uncertainties in projected runoff, which confounds the incorporation of these results into water resource management decision-making. A better understanding of the variability in hydrologic sensitivities between different models can aid in bounding this uncertainty. In this research, we characterized the hydrologic sensitivities of three watershed-scale land surface models through a case study of the Bull Run watershed in Northern Oregon. The Distributed Hydrology Soil Vegetation Model (DHSVM), Precipitation-Runoff Modeling System (PRMS), and Variable Infiltration Capacity model (VIC) were implemented and calibrated individually to historic streamflow using a common set of long-term, gridded forcings. In addition to analyzing model performances for a historic period, we quantified the temperature sensitivity (defined as change in runoff in response to change in temperature) and precipitation elasticity (defined as change in runoff in response to change in precipitation) of these three models via perturbation of the historic climate record using synthetic experiments. By comparing how these three models respond to changes in climate forcings, this research aims to test the assumption of constant and similar hydrologic sensitivities. Our
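
The two sensitivity measures defined above can be computed from any runoff model by direct perturbation of the climate forcings. The toy runoff function here is a hypothetical stand-in for DHSVM, PRMS, or VIC:

```python
import numpy as np

def precipitation_elasticity(runoff, p, t, dp=0.01):
    """Elasticity (dQ/Q) / (dP/P): fractional runoff change per fractional precip change."""
    q0, q1 = runoff(p, t), runoff(p * (1.0 + dp), t)
    return (q1 - q0) / q0 / dp

def temperature_sensitivity(runoff, p, t, dt=1.0):
    """Fractional runoff change per degree of warming."""
    q0, q1 = runoff(p, t), runoff(p, t + dt)
    return (q1 - q0) / q0 / dt

# Toy runoff model: runoff scales with precipitation and shrinks as warming
# increases evaporative losses (a stand-in for a calibrated land surface model).
toy = lambda p, t: p * np.exp(-0.05 * t)

e = precipitation_elasticity(toy, 2000.0, 10.0)
s = temperature_sensitivity(toy, 2000.0, 10.0)
print(e, s)
```

Comparing `e` and `s` across models calibrated to the same record is exactly the test of the "similar sensitivities" assumption the abstract describes.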

  16. Uncertainty and sensitivity analysis of early exposure results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; McKay, M.D.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the early health effects associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 34 imprecisely known input variables on the following reactor accident consequences are studied: number of early fatalities, number of cases of prodromal vomiting, population dose within 10 mi of the reactor, population dose within 1000 mi of the reactor, individual early fatality probability within 1 mi of the reactor, and maximum early fatality distance. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: scaling factor for horizontal dispersion, dry deposition velocity, inhalation protection factor for nonevacuees, groundshine shielding factor for nonevacuees, early fatality hazard function alpha value for bone marrow exposure, and scaling factor for vertical dispersion.
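
The partial correlation step used in such analyses can be sketched directly: regress the other inputs out of both the input of interest and the output, then correlate the residuals. The data here are synthetic, not the MACCS variables:

```python
import numpy as np

def partial_corr(x, y, others):
    """Correlation of x and y after regressing the columns of `others` out of both."""
    A = np.column_stack([np.ones(len(x)), others])
    resid = lambda v: v - A @ np.linalg.lstsq(A, v, rcond=None)[0]
    return np.corrcoef(resid(x), resid(y))[0, 1]

rng = np.random.default_rng(1)
n = 500
x1, x2, x3 = rng.random(n), rng.random(n), rng.random(n)
y = 5.0 * x1 + 0.5 * x2 + rng.normal(0.0, 0.1, n)   # x3 is deliberately inert

pcc = {
    "x1": partial_corr(x1, y, np.column_stack([x2, x3])),
    "x2": partial_corr(x2, y, np.column_stack([x1, x3])),
    "x3": partial_corr(x3, y, np.column_stack([x1, x2])),
}
print({k: round(v, 3) for k, v in pcc.items()})
```

Dominant contributors show partial correlations near ±1, while inert inputs fall near zero, which is how a ranking like the one in the abstract is obtained.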

  17. Parameter sensitivity and uncertainty analysis for a storm surge and wave model

    NASA Astrophysics Data System (ADS)

    Bastidas, Luis A.; Knighton, James; Kline, Shaun W.

    2016-09-01

    Development and simulation of synthetic hurricane tracks is a common methodology used to estimate hurricane hazards in the absence of empirical coastal surge and wave observations. Such methods typically rely on numerical models to translate stochastically generated hurricane wind and pressure forcing into coastal surge and wave estimates. The model output uncertainty associated with selection of appropriate model parameters must therefore be addressed. The computational overburden of probabilistic surge hazard estimates is exacerbated by the high dimensionality of numerical surge and wave models. We present a model parameter sensitivity analysis of the Delft3D model for the simulation of hazards posed by Hurricane Bob (1991) utilizing three theoretical wind distributions (NWS23, modified Rankine, and Holland). The sensitive model parameters (of 11 total considered) include wind drag, the depth-induced breaking γB, and the bottom roughness. Several parameters show no sensitivity (threshold depth, eddy viscosity, wave triad parameters, and depth-induced breaking αB) and can therefore be excluded to reduce the computational overburden of probabilistic surge hazard estimates. The sensitive model parameters also demonstrate a large number of interactions between parameters and a nonlinear model response. While model outputs showed sensitivity to several parameters, the ability of these parameters to act as tuning parameters for calibration is somewhat limited as proper model calibration is strongly reliant on accurate wind and pressure forcing data. A comparison of the model performance with forcings from the different wind models is also presented.

  18. Parameter sensitivity and uncertainty analysis for a storm surge and wave model

    NASA Astrophysics Data System (ADS)

    Bastidas, L. A.; Knighton, J.; Kline, S. W.

    2015-10-01

    Development and simulation of synthetic hurricane tracks is a common methodology used to estimate hurricane hazards in the absence of empirical coastal surge and wave observations. Such methods typically rely on numerical models to translate stochastically generated hurricane wind and pressure forcing into coastal surge and wave estimates. The model output uncertainty associated with selection of appropriate model parameters must therefore be addressed. The computational overburden of probabilistic surge hazard estimates is exacerbated by the high dimensionality of numerical surge and wave models. We present a model parameter sensitivity analysis of the Delft3D model for the simulation of hazards posed by Hurricane Bob (1991) utilizing three theoretical wind distributions (NWS23, modified Rankine, and Holland). The sensitive model parameters (of eleven total considered) include wind drag, the depth-induced breaking γB, and the bottom roughness. Several parameters show no sensitivity (threshold depth, eddy viscosity, wave triad parameters, and depth-induced breaking αB) and can therefore be excluded to reduce the computational overburden of probabilistic surge hazard estimates. The sensitive model parameters also demonstrate a large number of interactions between parameters and a nonlinear model response. While model outputs showed sensitivity to several parameters, the ability of these parameters to act as tuning parameters for calibration is somewhat limited as proper model calibration is strongly reliant on accurate wind and pressure forcing data. A comparison of the model performance with forcings from the different wind models is also presented.

  19. Multi-Dimensional, Discrete-Ordinates Based Cross Section Sensitivity and Uncertainty Analysis Code System.

    SciTech Connect

    KODELI, IVAN-ALEXANDER

    2008-05-22

    Version 01 SUSD3D 2008 calculates sensitivity coefficients and standard deviation in the calculated detector responses or design parameters of interest due to input cross sections and their uncertainties. One-, two- and three-dimensional transport problems can be studied. Several types of uncertainties can be considered, i.e. those due to (1) neutron/gamma multi-group cross sections, (2) energy-dependent response functions, (3) secondary angular distribution (SAD) or secondary energy distribution (SED) uncertainties. SUSD3D, initially released in 2000, is loosely based on the SUSD code by K. Furuta, Y. Oka and S. Kondo from the University of Tokyo in Japan. SUSD 2008 modifications are primarily relevant for the sensitivity calculations of critical systems and include: o Correction of the sensitivity calculation for prompt fission and number of delayed neutrons per fission (MT=18 and MT=455). o An option allows the re-normalization of the prompt fission spectra covariance matrices to be applied via the "normalization" of the sensitivity profiles. This option is useful in cases where the fission spectra covariances (MF=35) used do not comply with the ENDF-6 Format Manual rules. o For the criticality calculations the normalization can be calculated by the code SUSD3D internally. Parameter NORM should be set to 0 in this case. Total number of neutrons per fission (MT=452) sensitivities for all the fissile materials must be requested in the SUSD3D OVERLAY-2 input deck in order to allow the correct normalization. o The cross section data format reading was updated, mostly for critical systems (e.g. MT18 reaction). o Fission spectra uncertainties can be calculated using the file MF35 data processed by the ERROR-J code. o Cross sections can be input directly using input card "xs" (vector data only). o k-eff card was added for subcritical systems.
o This version of SUSD3D code is compatible with the single precision DANTSYS code package (CCC-0547/07 and /08, which are the

  1. Sensitivity of CO2 migration estimation on reservoir temperature and pressure uncertainty

    SciTech Connect

    Jordan, Preston; Doughty, Christine

    2008-11-01

    The density and viscosity of supercritical CO2 are sensitive to pressure and temperature (PT) while the viscosity of brine is sensitive primarily to temperature. Oil field PT data in the vicinity of WESTCARB's Phase III injection pilot test site in the southern San Joaquin Valley, California, show a range of PT values, indicating either PT uncertainty or variability. Numerical simulation results across the range of likely PT indicate brine viscosity variation causes virtually no difference in plume evolution and final size, but CO2 density variation causes a large difference. Relative ultimate plume size is almost directly proportional to the relative difference in brine and CO2 density (buoyancy flow). The majority of the difference in plume size occurs during and shortly after the cessation of injection.

  2. Climate Change Impact Uncertainties for Maize in Panama: Farm Information, Climate Projections, and Yield Sensitivities

    NASA Technical Reports Server (NTRS)

    Ruane, Alex C.; Cecil, L. Dewayne; Horton, Radley M.; Gordon, Roman; McCollum, Raymond; Brown, Douglas; Killough, Brian; Goldberg, Richard; Greeley, Adam P.; Rosenzweig, Cynthia

    2011-01-01

    We present results from a pilot project to characterize and bound multi-disciplinary uncertainties around the assessment of maize (Zea mays) production impacts using the CERES-Maize crop model in a climate-sensitive region with a variety of farming systems (Panama). Segunda coa (autumn) maize yield in Panama currently suffers occasionally from high water stress at the end of the growing season; under future climate conditions, however, warmer temperatures accelerate crop maturation and elevated CO2 concentrations improve water retention. This combination reduces end-of-season water stresses and eventually leads to small mean yield gains according to median projections, although accelerated maturation reduces yields in seasons with low water stresses. Calibrations of cultivar traits, soil profile, and fertilizer amounts are most important for representing baseline yields; however, sensitivity to all management factors is reduced in an assessment of future yield changes (most dramatically for fertilizers), suggesting that yield changes may be more generalizable than absolute yields. Uncertainties around General Circulation Model (GCM) projections of rainfall change gain in importance throughout the century, with yield changes strongly correlated with growing season rainfall totals. Climate changes are expected to be obscured by the large inter-annual variations in Panamanian climate that will continue to be the dominant influence on seasonal maize yield into the coming decades. The relatively high (A2) and low (B1) emissions scenarios show little difference in their impact on future maize yields until the end of the century. Uncertainties related to the sensitivity of CERES-Maize to carbon dioxide concentrations have a substantial influence on projected changes, and remain a significant obstacle to climate change impacts assessment.
Finally, an investigation into the potential of simple statistical yield emulators based upon key climate variables characterizes the

  3. Grid and basis adaptive polynomial chaos techniques for sensitivity and uncertainty analysis

    SciTech Connect

    Perkó, Zoltán; Gilli, Luca; Lathouwers, Danny; Kloosterman, Jan Leen

    2014-03-01

    The demand for accurate and computationally affordable sensitivity and uncertainty techniques is constantly on the rise and has become especially pressing in the nuclear field with the shift to Best Estimate Plus Uncertainty methodologies in the licensing of nuclear installations. Besides traditional, already well developed methods – such as first order perturbation theory or Monte Carlo sampling – Polynomial Chaos Expansion (PCE) has been given growing emphasis in recent years due to its simple application and good performance. This paper presents new developments of the research done at TU Delft on such Polynomial Chaos (PC) techniques. Our work is focused on the Non-Intrusive Spectral Projection (NISP) approach and adaptive methods for building the PCE of responses of interest. Recent efforts resulted in a new adaptive sparse grid algorithm designed for estimating the PC coefficients. The algorithm is based on Gerstner's procedure for calculating multi-dimensional integrals but proves to be significantly cheaper computationally, while at the same time retaining a similar accuracy to the original method. More importantly, the issue of basis adaptivity has been investigated and two techniques have been implemented for constructing the sparse PCE of quantities of interest. Not using the traditional full PC basis set leads to further reduction in computational time since the high order grids necessary for accurately estimating the near zero expansion coefficients of polynomial basis vectors not needed in the PCE can be excluded from the calculation. Moreover the sparse PC representation of the response is easier to handle when used for sensitivity analysis or uncertainty propagation due to the smaller number of basis vectors. The developed grid and basis adaptive methods have been implemented in Matlab as the Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm and were tested on four analytical problems. These show consistently good performance both
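
The NISP idea can be illustrated on a one-dimensional toy problem: project the response y(ξ) = exp(ξ), ξ ~ N(0,1), onto probabilists' Hermite polynomials using Gauss-Hermite quadrature. This is a minimal sketch of non-intrusive spectral projection, not the FANISP algorithm:

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Quadrature rule for the weight exp(-x^2/2); renormalize to the N(0,1) pdf.
nodes, weights = hermegauss(20)
weights = weights / np.sqrt(2 * np.pi)

response = np.exp(nodes)              # toy model evaluated at the quadrature nodes
order = 5

# NISP: c_k = E[y He_k] / ||He_k||^2, with ||He_k||^2 = k! for probabilists' Hermite.
coeffs = [np.sum(weights * response * hermeval(nodes, [0] * k + [1])) / factorial(k)
          for k in range(order + 1)]

# Output statistics follow directly from the PCE coefficients.
mean = coeffs[0]
variance = sum(c * c * factorial(k) for k, c in enumerate(coeffs) if k > 0)
print(mean, variance)   # exact mean is sqrt(e) ~ 1.6487
```

In higher dimensions the tensor quadrature grid explodes combinatorially, which is exactly the cost that the sparse grid and basis adaptivity described above are designed to avoid.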

  4. Sensitivity study and uncertainties assessment of the permafrost model for the Swiss Alps

    NASA Astrophysics Data System (ADS)

    Marmy, Antoine; Hauck, Christian; Scherler, Martin

    2013-04-01

    Modeling the evolution and the sensitivity of permafrost in the European Alps in the context of climate change is one of the most relevant and challenging tasks of ongoing permafrost research. The one-dimensional soil-snow-atmosphere model CoupModel (Jansson & Karlberg 2001) has already been applied successfully for permafrost modeling in the Swiss Alps (Engelhardt et al. 2010, Scherler et al. 2010). Two sites in the Swiss Alps have been studied with a particular focus: the active rock glacier Murtèl (Upper Engadine), and the Schilthorn massif (Bernese Alps). In order to evaluate the sensitivity of the model to changes in air temperature and precipitation, a sensitivity study of the model has been carried out using a delta change approach. Annual and seasonal deltas were applied to air temperature and precipitation input series until the end of the century using a large parameter range in equidistant steps. The resulting ground thermal regimes and active layer thicknesses of rock glacier Murtèl and the Schilthorn massif are analysed and presented in this contribution. In addition, the General Likelihood Uncertainty Estimation (GLUE) method is used to assess the uncertainty of the simulations within the CoupModel (Jansson 2012). This method is based on an unbiased sampling of parameter values during simulation considering all combinations of prescribed parameter values, such as thermal conductivities or snow parameters. Statistical performance indicators such as the Root Mean Square Error or the Coefficient of Determination are used to define the acceptance of values and to assess the uncertainty. By this, not only the most appropriate parameter values for consistent subsurface modeling for the two permafrost sites can be determined, but model-based uncertainty ranges of the resulting ground temperatures and active layer thicknesses can be estimated. References: Engelhardt, M., Hauck, C., and Salzmann, N. (2010) Influence of atmospheric forcing parameters on modelled
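
The GLUE acceptance-and-weighting loop can be sketched with a one-parameter toy model standing in for CoupModel; the observations, prior range, and RMSE threshold below are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "observed" ground-temperature profile and a one-parameter toy model
# T(z) = T_surf + g * z; the gradient g stands in for an uncertain model parameter.
z = np.linspace(0.0, 10.0, 21)
obs = 1.0 + 0.03 * z + rng.normal(0.0, 0.05, z.size)

# GLUE: sample the parameter from its prior range, keep "behavioural" runs whose
# RMSE beats a subjective acceptance threshold, weight them by informal likelihood.
g = rng.uniform(0.0, 0.1, 5000)
sims = 1.0 + g[:, None] * z
rmse = np.sqrt(((sims - obs) ** 2).mean(axis=1))

behavioural = rmse < 0.1
weights = 1.0 / rmse[behavioural]
weights /= weights.sum()

# 5-95% uncertainty band on the simulated profile from the behavioural ensemble
# (a full GLUE analysis would use likelihood-weighted quantiles; plain percentiles
# keep the sketch short).
lo, hi = np.percentile(sims[behavioural], [5, 95], axis=0)
print(behavioural.sum(), round(lo[-1], 2), round(hi[-1], 2))
```

The spread of the behavioural parameter values gives the model-based uncertainty range; the band `lo`-`hi` is the corresponding range on the predicted ground temperatures.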

  5. Assessing model sensitivity and uncertainty across multiple Free-Air CO2 Enrichment experiments.

    NASA Astrophysics Data System (ADS)

    Cowdery, E.; Dietze, M.

    2015-12-01

    As atmospheric carbon dioxide levels continue to increase, it is critical that terrestrial ecosystem models can accurately predict ecological responses to the changing environment. Current predictions of net primary productivity (NPP) in response to elevated atmospheric CO2 concentrations are highly variable and contain a considerable amount of uncertainty. It is necessary that we understand which factors are driving this uncertainty. The Free-Air CO2 Enrichment (FACE) experiments have equipped us with a rich data source that can be used to calibrate and validate these model predictions. To identify and evaluate the assumptions causing inter-model differences we performed model sensitivity and uncertainty analysis across ambient and elevated CO2 treatments using the Data Assimilation Linked Ecosystem Carbon (DALEC) model and the Ecosystem Demography Model (ED2), two process-based models of low and high complexity, respectively. These modeled process responses were compared to experimental data from the Kennedy Space Center Open Top Chamber Experiment, the Nevada Desert Free Air CO2 Enrichment Facility, the Rhinelander FACE experiment, the Wyoming Prairie Heating and CO2 Enrichment Experiment, the Duke Forest FACE experiment and the Oak Ridge Experiment on CO2 Enrichment. By leveraging data access proxy and data tilling services provided by the BrownDog data curation project alongside analysis modules available in the Predictive Ecosystem Analyzer (PEcAn), we produced automated, repeatable benchmarking workflows that are generalized to incorporate different sites and ecological models. Combining the observed patterns of uncertainty between the two models with results of the recent FACE-model data synthesis project (FACE-MDS) can help identify which processes need further study and additional data constraints. These findings can be used to inform future experimental design and in turn can provide an informative starting point for data assimilation.

  6. Sensitivity and uncertainty analysis within a methodology for evaluating environmental restoration technologies

    NASA Astrophysics Data System (ADS)

    Zio, Enrico; Apostolakis, George E.

    1999-03-01

    This paper illustrates an application of sensitivity and uncertainty analysis techniques within a methodology for evaluating environmental restoration technologies. The methodology consists of two main parts: the first part ("analysis") integrates a wide range of decision criteria and impact evaluation techniques in a framework that emphasizes and incorporates input from stakeholders in all aspects of the process. Its products are the rankings of the alternative options for each stakeholder using, essentially, expected utility theory. The second part ("deliberation") utilizes the analytical results of the "analysis" and attempts to develop consensus among the stakeholders in a session in which the stakeholders discuss and evaluate the analytical results. This paper deals with the analytical part of the approach and the uncertainty and sensitivity analyses that were carried out in preparation for the deliberative process. The objective of these investigations was that of testing the robustness of the assessments and of pointing out possible existing sources of disagreements among the participating stakeholders, thus providing insights for the successive deliberative process. Standard techniques, such as differential analysis, Monte Carlo sampling and a two-dimensional policy region analysis proved sufficient for the task.

  7. Uncertainty and sensitivity of flood risk calculations for a dike ring in the south of the Netherlands.

    PubMed

    de Moel, Hans; Bouwer, Laurens M; Aerts, Jeroen C J H

    2014-03-01

    A central tool in risk management is the exceedance-probability loss (EPL) curve, which denotes the probabilities of damages being exceeded or equalled. These curves are used for a number of purposes, including the calculation of the expected annual damage (EAD), a common indicator for risk. The model calculations that are used to create such a curve contain uncertainties that accumulate in the end result. As a result, EPL curves and EAD calculations are also surrounded by uncertainties. Knowledge of the magnitude and source of these uncertainties helps to improve assessments and leads to better informed decisions. This study, therefore, performs uncertainty and sensitivity analyses for a dike-ring area in the Netherlands, on the south bank of the river Meuse. In this study, a Monte Carlo framework is used that combines hydraulic boundary conditions, a breach growth model, an inundation model, and a damage model. It encompasses the modelling of thirteen potential breach locations and uncertainties related to probability, duration of the flood wave, height of the flood wave, erodibility of the embankment, damage curves, and the value of assets at risk. The assessment includes uncertainty and sensitivity of risk estimates for each individual location, as well as the dike-ring area as a whole. The results show that for the dike ring in question, EAD estimates exhibit a 90% percentile range from about 8 times lower than the median, up to 4.5 times higher than the median. This level of uncertainty can mainly be attributed to uncertainty in depth-damage curves, uncertainty in the probability of a flood event and the duration of the flood wave. There are considerable differences between breach locations, both in the magnitude of the uncertainty, and in its source. This indicates that local characteristics have a considerable impact on uncertainty and sensitivity of flood damage and risk calculations.
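
The EAD computation from an EPL curve is a small numerical integration; the return periods and damages below are illustrative, not the Meuse dike-ring results:

```python
import numpy as np

# Exceedance-probability loss (EPL) curve: damage associated with each annual
# exceedance probability (illustrative values, damages in million EUR).
return_periods = np.array([10.0, 50.0, 100.0, 500.0, 1250.0])   # years
damages = np.array([0.0, 50.0, 120.0, 400.0, 900.0])

prob = 1.0 / return_periods          # annual exceedance probabilities

# Expected annual damage = area under the EPL curve (trapezoidal rule,
# integrating damage over exceedance probability).
idx = np.argsort(prob)
p, d = prob[idx], damages[idx]
ead = np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p))
print(ead)   # ~5.7 million EUR per year for these numbers
```

Repeating this calculation for each Monte Carlo realization of the breach, inundation, and damage models yields the distribution of EAD whose percentile range the study reports.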

  8. Brief Report: Effects of Sensory Sensitivity and Intolerance of Uncertainty on Anxiety in Mothers of Children with Autism Spectrum Disorder

    ERIC Educational Resources Information Center

    Uljarevic, Mirko; Carrington, Sarah; Leekam, Susan

    2016-01-01

    This study examined the relations between anxiety and individual characteristics of sensory sensitivity (SS) and intolerance of uncertainty (IU) in mothers of children with ASD. The mothers of 50 children completed the Hospital Anxiety and Depression Scale, the Highly Sensitive Person Scale and the IU Scale. Anxiety was associated with both SS and…

  9. Volcano deformation source parameters estimated from InSAR: Sensitivities to uncertainties in seismic tomography

    NASA Astrophysics Data System (ADS)

    Masterlark, Timothy; Donovan, Theodore; Feigl, Kurt L.; Haney, Matthew; Thurber, Clifford H.; Tung, Sui

    2016-04-01

    The eruption cycle of a volcano is controlled in part by the upward migration of magma. The characteristics of the magma flux produce a deformation signature at the Earth's surface. Inverse analyses use geodetic data to estimate strategic controlling parameters that describe the position and pressurization of a magma chamber at depth. The specific distribution of material properties controls how observed surface deformation translates to source parameter estimates. Seismic tomography models describe the spatial distributions of material properties that are necessary for accurate models of volcano deformation. This study investigates how uncertainties in seismic tomography models propagate into variations in the estimates of volcano deformation source parameters inverted from geodetic data. We conduct finite element model-based nonlinear inverse analyses of interferometric synthetic aperture radar (InSAR) data for Okmok volcano, Alaska, as an example. We then analyze the estimated parameters and their uncertainties to characterize the magma chamber. Analyses are performed separately for models simulating a pressurized chamber embedded in a homogeneous domain as well as for a domain having a heterogeneous distribution of material properties according to seismic tomography. The estimated depth of the source is sensitive to the distribution of material properties. The estimated depths for the homogeneous and heterogeneous domains are 2666 ± 42 and 3527 ± 56 m below mean sea level, respectively (99% confidence). A Monte Carlo analysis indicates that uncertainties of the seismic tomography cannot account for this discrepancy at the 99% confidence level. Accounting for the spatial distribution of elastic properties according to seismic tomography significantly improves the fit of the deformation model predictions and significantly influences estimates for parameters that describe the location of a pressurized magma chamber.

  10. Uncertainty and sensitivity analysis of food pathway results with the MACCS Reactor Accident Consequence Model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the food pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 87 imprecisely-known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, milk growing season dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, area dependent cost, crop disposal cost, milk disposal cost, condemnation area, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: fraction of cesium deposition on grain fields that is retained on plant surfaces and transferred directly to grain, maximum allowable ground concentrations of Cs-137 and Sr-90 for production of crops, ground concentrations of Cs-134, Cs-137 and I-131 at which the disposal of milk will be initiated due to accidents that occur during the growing season, ground concentrations of Cs-134, I-131 and Sr-90 at which the disposal of crops will be initiated due to accidents that occur during the growing season, rate of depletion of Cs-137 and Sr-90 from the root zone, transfer of Sr-90 from soil to legumes, transfer of Cs-137 from soil to pasture, transfer of cesium from animal feed to meat, and the transfer of cesium, iodine and strontium from animal feed to milk.
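The Latin hypercube sampling plus regression-based ranking used in this and the following MACCS study can be sketched generically. The three-parameter model below is an invented toy response (not MACCS), and rank regression stands in for the paper's stepwise regression / partial correlation step.

```python
import numpy as np
from scipy.stats import qmc

# Latin hypercube sample of 3 uncertain inputs over assumed ranges
sampler = qmc.LatinHypercube(d=3, seed=7)
u = sampler.random(n=200)                                   # unit-cube sample
x = qmc.scale(u, l_bounds=[0.1, 1.0, 0.0], u_bounds=[0.5, 5.0, 2.0])

def toy_consequence(x):
    """Invented consequence model standing in for a MACCS output."""
    return 10.0 * x[:, 0] + x[:, 1] ** 2 + 0.1 * x[:, 2]

y = toy_consequence(x)

# Rank-transform inputs and output (as in rank regression), standardize,
# and fit a linear model; |coefficient| indicates variable importance.
rx = np.argsort(np.argsort(x, axis=0), axis=0).astype(float)
ry = np.argsort(np.argsort(y)).astype(float)
A = np.column_stack([np.ones(len(ry)), (rx - rx.mean(0)) / rx.std(0)])
beta, *_ = np.linalg.lstsq(A, (ry - ry.mean()) / ry.std(), rcond=None)
print(np.abs(beta[1:]))         # importance ranking of the 3 inputs
```

With these invented ranges the quadratic second input dominates, the linear first input comes next, and the weakly weighted third input ranks last.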

  11. Uncertainty and sensitivity analysis of chronic exposure results with the MACCS reactor accident consequence model

    SciTech Connect

    Helton, J.C.; Johnson, J.D.; Rollstin, J.A.; Shiver, A.W.; Sprung, J.L.

    1995-01-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis and stepwise regression analysis are used in an investigation with the MACCS model of the chronic exposure pathways associated with a severe accident at a nuclear power station. The primary purpose of this study is to provide guidance on the variables to be considered in future review work to reduce the uncertainty in the important variables used in the calculation of reactor accident consequences. The effects of 75 imprecisely known input variables on the following reactor accident consequences are studied: crop growing season dose, crop long-term dose, water ingestion dose, milk growing season dose, long-term groundshine dose, long-term inhalation dose, total food pathways dose, total ingestion pathways dose, total long-term pathways dose, total latent cancer fatalities, area-dependent cost, crop disposal cost, milk disposal cost, population-dependent cost, total economic cost, condemnation area, condemnation population, crop disposal area and milk disposal area. When the predicted variables are considered collectively, the following input variables were found to be the dominant contributors to uncertainty: dry deposition velocity, transfer of cesium from animal feed to milk, transfer of cesium from animal feed to meat, ground concentration of Cs-134 at which the disposal of milk products will be initiated, transfer of Sr-90 from soil to legumes, maximum allowable ground concentration of Sr-90 for production of crops, fraction of cesium entering surface water that is consumed in drinking water, groundshine shielding factor, scale factor defining resuspension, dose reduction associated with decontamination, and ground concentration of 1-131 at which disposal of crops will be initiated due to accidents that occur during the growing season.

  12. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. The WLC
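Of the three MCDA methods named above, weighted linear combination is the simplest to sketch. Below, Monte Carlo perturbation of invented criterion weights (loosely following the MCS step described in the abstract) propagates weight uncertainty into per-cell susceptibility scores; the criteria values and weight distribution are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
criteria = rng.random((1000, 4))            # 1000 cells, 4 standardized criteria
w_mean = np.array([0.4, 0.3, 0.2, 0.1])     # assumed mean criterion weights

scores = []
for _ in range(500):
    w = np.abs(rng.normal(w_mean, 0.05))    # perturb the weights
    w /= w.sum()                            # renormalize to sum to 1
    scores.append(criteria @ w)             # WLC: weighted sum per cell
scores = np.array(scores)

# Per-cell spread of the susceptibility score quantifies weight uncertainty
cell_std = scores.std(axis=0)
print(scores.mean(), cell_std.mean())
```

Cells whose criteria values disagree most across layers show the largest `cell_std`, which is the per-pixel uncertainty map the abstract alludes to.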

  13. Determination of protection zones for Dutch groundwater wells against virus contamination--uncertainty and sensitivity analysis.

    PubMed

    Schijven, J F; Mülschlegel, J H C; Hassanizadeh, S M; Teunis, P F M; de Roda Husman, A M

    2006-09-01

    Protection zones of shallow unconfined aquifers in The Netherlands were calculated that allow protection against virus contamination to the level that the infection risk of 10^-4 per person per year is not exceeded with 95% certainty. An uncertainty and a sensitivity analysis of the calculated protection zones were included. It was concluded that protection zones of 1 to 2 years' travel time (206-418 m) are needed (6 to 12 times the currently applied travel time of 60 days). This will lead to enlargement of protection zones, encompassing 110 unconfined groundwater well systems that produce 3 × 10^8 m^3 y^-1 of drinking water (38% of total Dutch production from groundwater). A smaller protection zone is possible if it can be shown that an aquifer has properties that lead to greater reduction of virus contamination, such as more attachment. Deeper aquifers beneath aquitards of at least 2 years of vertical travel time are adequately protected because vertical flow in the aquitards is only 0.7 m per year. The most sensitive parameters are virus attachment and inactivation. The next most sensitive parameters are the grain size of the sand, the abstraction rate of groundwater, virus concentrations in raw sewage, and consumption of unboiled drinking water. Research is recommended on additional protection by attachment and under unsaturated conditions.
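As a rough, hypothetical illustration of how a travel-time protection zone scales with abstraction rate and aquifer properties (not the paper's transport model), a common fixed-radius approximation treats the zone as the cylinder containing t years of abstraction in a fully penetrating well: r = sqrt(Q·t / (π·n·H)). All parameter values below are invented.

```python
import math

def protection_radius(Q_m3_per_year, t_years, porosity, thickness_m):
    """Radius (m) of the cylinder holding t years of abstracted water:
    r = sqrt(Q * t / (pi * n * H)). A simplified capture-zone estimate."""
    return math.sqrt(Q_m3_per_year * t_years / (math.pi * porosity * thickness_m))

# e.g. 1e6 m^3/yr abstraction, 2-year travel time, porosity 0.35, 30 m aquifer
print(round(protection_radius(1e6, 2.0, 0.35, 30.0), 1))
```

With these assumed values the radius lands in the low hundreds of metres, the same order as the 206-418 m zones reported above; the paper's analysis additionally accounts for virus attachment and inactivation along the flow path.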

  14. A comparison of five forest interception models using global sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Linhoss, Anna C.; Siegert, Courtney M.

    2016-07-01

    Interception by the forest canopy plays a critical role in the hydrologic cycle by removing a significant portion of incoming precipitation from the terrestrial component. While there are a number of existing physical models of forest interception, few studies have summarized or compared these models. The objective of this work is to use global sensitivity and uncertainty analysis to compare five mechanistic interception models including the Rutter, Rutter Sparse, Gash, Sparse Gash, and Liu models. Using parameter probability distribution functions of values from the literature, our results show that on average storm duration [Dur], gross precipitation [PG], canopy storage [S] and solar radiation [Rn] are the most important model parameters. On the other hand, empirical parameters used in calculating evaporation and drip (i.e. trunk evaporation as a proportion of evaporation from the saturated canopy [ɛ], the empirical drainage parameter [b], the drainage partitioning coefficient [pd], and the rate of water dripping from the canopy when canopy storage has been reached [Ds]) have relatively low levels of importance in interception modeling. As such, future modeling efforts should aim to decompose parameters that are the most influential in determining model outputs into easily measurable physical components. Because this study compares models, the choices regarding the parameter probability distribution functions are applied across models, which enables a more definitive ranking of model uncertainty.

  15. Uncertainty analysis and global sensitivity analysis of techno-economic assessments for biodiesel production.

    PubMed

    Tang, Zhang-Chun; Zhenzhou, Lu; Zhiwen, Liu; Ningcong, Xiao

    2015-01-01

    There are various uncertain parameters in the techno-economic assessments (TEAs) of biodiesel production, including capital cost, interest rate, feedstock price, maintenance rate, biodiesel conversion efficiency, glycerol price and operating cost. However, few studies have focused on the influence of these parameters on TEAs. This paper investigated the effects of these parameters on the life cycle cost (LCC) and the unit cost (UC) in the TEAs of biodiesel production. The results show that LCC and UC vary considerably when these uncertain parameters are taken into account. Based on the uncertainty analysis, three global sensitivity analysis (GSA) methods are utilized to quantify the contribution of each individual uncertain parameter to LCC and UC. The GSA results reveal that the feedstock price and the interest rate have considerable effects on the TEAs. These results can provide a useful guide for entrepreneurs when planning plants.

  16. Perspectives Gained in an Evaluation of Uncertainty, Sensitivity, and Decision Analysis Software

    SciTech Connect

    Davis, F.J.; Helton, J.C.

    1999-02-24

    The following software packages for uncertainty, sensitivity, and decision analysis were reviewed and also tested with several simple analysis problems: Crystal Ball, RiskQ, SUSA-PC, Analytica, PRISM, Ithink, Stella, LHS, STEPWISE, and JMP. Results from the review and test problems are presented. The study resulted in the recognition of the importance of four considerations in the selection of a software package: (1) the availability of an appropriate selection of distributions, (2) the ease with which data flows through the input sampling, model evaluation, and output analysis process, (3) the type of models that can be incorporated into the analysis process, and (4) the level of confidence in the software modeling and results.

  18. Third Floor Plan, Second Floor Plan, First Floor Plan, Ground ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Third Floor Plan, Second Floor Plan, First Floor Plan, Ground Floor Plan, West Bunkhouse - Kennecott Copper Corporation, On Copper River & Northwestern Railroad, Kennicott, Valdez-Cordova Census Area, AK

  19. Sensitivity of low energy brachytherapy Monte Carlo dose calculations to uncertainties in human tissue composition

    SciTech Connect

    Landry, Guillaume; Reniers, Brigitte; Murrer, Lars; Lutgens, Ludy; Bloemen-Van Gurp, Esther; Pignol, Jean-Philippe; Keller, Brian; Beaulieu, Luc; Verhaegen, Frank

    2010-10-15

    Purpose: The objective of this work is to assess the sensitivity of Monte Carlo (MC) dose calculations to uncertainties in human tissue composition for a range of low photon energy brachytherapy sources: I-125, Pd-103, Cs-131, and an electronic brachytherapy source (EBS). The low energy photons emitted by these sources make the dosimetry sensitive to variations in tissue atomic number due to the dominance of the photoelectric effect. This work reports dose to a small mass of water in medium D_(w,m) as opposed to dose to a small mass of medium in medium D_(m,m). Methods: Mean adipose, mammary gland, and breast tissues (as a uniform mixture of the aforementioned tissues) are investigated, as well as compositions corresponding to one standard deviation from the mean. Prostate mean compositions from three different literature sources are also investigated. Three sets of MC simulations are performed with the GEANT4 code: (1) Dose calculations for idealized TG-43-like spherical geometries using point sources. Radial dose profiles obtained in different media are compared to assess the influence of compositional uncertainties. (2) Dose calculations for four clinical prostate LDR brachytherapy permanent seed implants using I-125 seeds (Model 2301, Best Medical, Springfield, VA). The effect of varying the prostate composition in the planning target volume (PTV) is investigated by comparing PTV D90 values. (3) Dose calculations for four clinical breast LDR brachytherapy permanent seed implants using Pd-103 seeds (Model 2335, Best Medical). The effects of varying the adipose/gland ratio in the PTV and of varying the elemental composition of adipose and gland within one standard deviation of the assumed mean composition are investigated by comparing PTV D90 values. For (2) and (3), the influence of using the mass density from CT scans instead of unit mass density is also assessed. Results from simulation (1) show that variations

  20. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust

  2. Sensitivity of an atmospheric photochemistry model to chlorine perturbations including consideration of uncertainty propagation

    NASA Technical Reports Server (NTRS)

    Stolarski, R. S.; Douglass, A. R.

    1986-01-01

    Models of stratospheric photochemistry are generally tested by comparing their predictions for the composition of the present atmosphere with measurements of species concentrations. These models are then used to make predictions of the atmospheric sensitivity to perturbations. Here the problem of the sensitivity of such a model to chlorine perturbations ranging from the present influx of chlorine-containing compounds to several times that influx is addressed. The effects of uncertainties in input parameters, including reaction rate coefficients, cross sections, solar fluxes, and boundary conditions, are evaluated using a Monte Carlo method in which the values of the input parameters are randomly selected. The results are probability distributions for present atmospheric concentrations and for calculated perturbations due to chlorine from fluorocarbons. For more than 300 Monte Carlo runs the calculated ozone perturbation for continued emission of fluorocarbons at today's rates had a mean value of -6.2 percent, with a 1-sigma width of 5.5 percent. Using the same runs but only allowing the cases in which the calculated present atmosphere values of NO, NO2, and ClO at 25 km altitude fell within the range of measurements yielded a mean ozone depletion of -3 percent, with a 1-sigma deviation of 2.2 percent. The model showed a nonlinear behavior as a function of added fluorocarbons. The mean of the Monte Carlo runs was less nonlinear than the model run using the mean values of the input parameters.
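The constraining step described above (keeping only Monte Carlo runs whose simulated present-day observables fall within measured ranges, then summarizing the predicted perturbation) can be sketched generically. The "model", parameter names, and acceptance range below are invented placeholders, not the photochemistry model of the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
k_rate = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # uncertain rate coefficient
flux = rng.normal(1.0, 0.2, size=n)                   # uncertain flux factor

present_obs = k_rate * flux                 # stand-in "present atmosphere" value
perturbation = -5.0 * k_rate / flux         # stand-in predicted ozone change (%)

# Unconstrained distribution of the predicted perturbation
print(perturbation.mean(), perturbation.std())

# Constrain to runs consistent with a hypothetical measured range, as the
# paper does with NO, NO2 and ClO observations at 25 km
ok = (present_obs > 0.8) & (present_obs < 1.2)
print(perturbation[ok].mean(), perturbation[ok].std())
```

Filtering on present-day observables typically narrows the predictive distribution, which is the effect the abstract reports (a 1-sigma spread of 2.2% versus 5.5%).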

  3. Sensitivity-Informed De Novo Programming for Many-Objective Water Portfolio Planning Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Kasprzyk, J. R.; Reed, P. M.; Kirsch, B. R.; Characklis, G. W.

    2009-12-01

    Risk-based water supply management presents severe cognitive, computational, and social challenges to planning in a changing world. Decision aiding frameworks must confront the cognitive biases implicit to risk, the severe uncertainties associated with long term planning horizons, and the consequent ambiguities that shape how we define and solve water resources planning and management problems. This paper proposes and demonstrates a new interactive framework for sensitivity-informed de novo programming. The theoretical focus of our many-objective de novo programming is to promote learning and evolving problem formulations to enhance risk-based decision making. We have demonstrated our proposed de novo programming framework using a case study for a single city’s water supply in the Lower Rio Grande Valley (LRGV) in Texas. Key decisions in this case study include the purchase of permanent rights to reservoir inflows and anticipatory thresholds for acquiring transfers of water through optioning and spot leases. A 10-year Monte Carlo simulation driven by historical data is used to provide performance metrics for the supply portfolios. The three major components of our methodology are Sobol global sensitivity analysis, many-objective evolutionary optimization and interactive tradeoff visualization. The interplay between these components allows us to evaluate alternative design metrics, their decision variable controls and the consequent system vulnerabilities. Our LRGV case study measures water supply portfolios’ efficiency, reliability, and utilization of transfers in the water supply market. The sensitivity analysis is used interactively over interannual, annual, and monthly time scales to indicate how the problem controls change as a function of the timescale of interest. These results have then been used to improve our exploration and understanding of LRGV costs, vulnerabilities, and the water portfolios’ critical reliability constraints. These results
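A first-order Sobol index of the kind used in the sensitivity-analysis component above can be estimated with a pick-freeze (Saltelli-style) scheme. The sketch below uses the standard Ishigami test function rather than the LRGV portfolio model; sample sizes are illustrative.

```python
import numpy as np

def ishigami(x, a=7.0, b=0.1):
    """Standard 3-input sensitivity-analysis test function."""
    return np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2 \
        + b * x[:, 2] ** 4 * np.sin(x[:, 0])

rng = np.random.default_rng(3)
n, d = 100_000, 3
A = rng.uniform(-np.pi, np.pi, (n, d))      # two independent sample matrices
B = rng.uniform(-np.pi, np.pi, (n, d))
yA, yB = ishigami(A), ishigami(B)
var = np.var(np.concatenate([yA, yB]))

S = []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                     # "freeze" input i from B
    S.append(np.mean(yB * (ishigami(ABi) - yA)) / var)
print(np.round(S, 2))   # analytic first-order indices are ≈ 0.31, 0.44, 0.00
```

Each S_i is the fraction of output variance attributable to input i alone; totals below 1 indicate interaction effects, which the Ishigami function has by construction.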

  4. COMPUTATIONAL METHODS FOR SENSITIVITY AND UNCERTAINTY ANALYSIS FOR ENVIRONMENTAL AND BIOLOGICAL MODELS

    EPA Science Inventory

    This work introduces a computationally efficient alternative method for uncertainty propagation, the Stochastic Response Surface Method (SRSM). The SRSM approximates uncertainties in model outputs through a series expansion in normal random variables (polynomial chaos expansion)...
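A minimal non-intrusive sketch of the polynomial chaos idea behind the SRSM, assuming a single standard normal input and an invented model response: expand the output in probabilists' Hermite polynomials, fit the coefficients by least squares, and read the output mean and variance directly off the coefficients.

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(0)
xi = rng.standard_normal(5000)          # standard normal input samples
y = np.exp(0.3 * xi)                    # invented nonlinear model response

order = 4
# Design matrix of probabilists' Hermite polynomials He_0 .. He_order
Phi = np.column_stack([hermeval(xi, np.eye(order + 1)[k])
                       for k in range(order + 1)])
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Orthogonality (E[He_j He_k] = k! delta_jk under N(0,1)) gives the moments:
mean_pce = coef[0]
var_pce = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, order + 1))
print(mean_pce, var_pce)
```

For this test response the exact mean is exp(0.045) ≈ 1.046 and the exact variance ≈ 0.103, which the order-4 expansion recovers without any further sampling of the surrogate.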

  5. A sensitivity study of s-process: the impact of uncertainties from nuclear reaction rates

    NASA Astrophysics Data System (ADS)

    Vinyoles, N.; Serenelli, A.

    2016-01-01

    The slow neutron capture process (s-process) is responsible for the production of about half the elements beyond the Fe peak. The production sites and the conditions under which the different components of the s-process occur are relatively well established. A detailed quantitative understanding of s-process nucleosynthesis may shed light on physical processes, e.g. convection and mixing, taking place in the production sites. For this, it is important that the impact of uncertainties in the nuclear physics is well understood. In this work we study the sensitivity of s-process nucleosynthesis to the nuclear reaction rates, with particular emphasis on the main component. Our aims are to quantify the current uncertainties in the production factors of s-process elements originating from nuclear physics and to identify key nuclear reactions that require more precise experimental determinations. We studied two production sites in which the s-process occurs with very different neutron exposures: 1) a low-mass, extremely metal-poor star during the He-core flash (neutron densities n_n reaching values of ∼10^14 cm^-3); 2) the TP-AGB phase of a M⊙, Z=0.01 model, the typical site of the main s-process component (n_n up to 10^8-10^9 cm^-3). In the first case, the main variation in the production of s-process elements comes from the neutron poisons, with relative variations of around 30%-50%. In the second, the neutron poisons are less important because the higher metallicity of the star also acts as a seed; the final errors in the abundances are therefore much lower, around 10%-25%.

  6. Fractionated Lung IMPT Treatments: Sensitivity to Setup Uncertainties and Motion Effects Based on Single-Field Homogeneity.

    PubMed

    Dowdell, Stephen; Grassberger, Clemens; Sharp, Greg; Paganetti, Harald

    2016-10-01

    Treatment uncertainties in radiotherapy are either systematic or random. This study evaluates the sensitivity of fractionated intensity-modulated proton therapy (IMPT) lung treatments to systematic and random setup uncertainties. Treatments in which single-field homogeneity was restricted to within ±20% (IMPT20%) were compared to full IMPT (IMPTfull) for 10 patients with lung cancer. Four-dimensional Monte Carlo calculations were performed using patient computed tomography geometries with ±5 mm systematic or random setup uncertainties applied over a 35 × 2.5 Gy(RBE) treatment course. Fifty fractionated courses were simulated for each patient using both IMPT delivery methods with random setup uncertainties applied each fraction and for 3 energy-dependent spot sizes (big spots, σ≈18-9 mm; intermediate spots, σ≈11-5 mm; and small spots, σ≈4-2 mm). These results were compared to Monte Carlo recalculations of the original treatment plan assuming zero setup uncertainty. Results are presented as the difference in equivalent uniform dose (ΔEUD), V95 (ΔV95), and target dose homogeneity (ΔD1-D99). Over the whole patient cohort, the ΔEUD was 2.0 ± 0.5 (big spots), 1.9 ± 0.7 (intermediate spots), and 1.3 ± 0.4 (small spots) times more sensitive to ±5 mm systematic setup uncertainties in IMPTfull compared to IMPT20%. IMPTfull is 1.9 ± 0.9 (big spots), 2.1 ± 1.1 (intermediate spots), and 1.5 ± 0.6 (small spots) times more sensitive to random setup uncertainties than IMPT20% over a fractionated treatment course. The ΔV95 is at least 1.4 times more sensitive to systematic and random setup uncertainties for IMPTfull for all spot sizes considered. The ΔD1-D99 values coincided within uncertainty limits for both IMPT delivery methods for the 3 spot sizes considered, with higher mean values always observed for IMPTfull. The paired t-test indicated that variations observed between IMPTfull and IMPT20% were significantly different for the majority of scenarios
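The EUD metric behind the reported ΔEUD values reduces a voxel dose distribution to a single effective uniform dose. A generic generalized-EUD sketch follows; the voxel doses and the tissue exponent `a` are invented, and the study's exact EUD definition may differ.

```python
import numpy as np

def gEUD(voxel_doses, a):
    """Generalized EUD: the uniform dose with the same assumed effect,
    (mean(d_i^a))^(1/a). Negative a emphasizes cold spots in targets."""
    d = np.asarray(voxel_doses, dtype=float)
    return float(np.mean(d ** a) ** (1.0 / a))

planned = np.full(1000, 60.0)       # perfectly uniform 60 Gy target dose
# Setup-error perturbation modeled here as simple Gaussian dose noise
perturbed = planned + np.random.default_rng(0).normal(0.0, 3.0, 1000)

print(gEUD(planned, a=-10), gEUD(perturbed, a=-10))
```

A uniform distribution returns its own dose, while any spread pulls the negative-exponent gEUD below the mean dose, which is why setup uncertainties degrade EUD even when the mean target dose is preserved.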

  7. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    NASA Astrophysics Data System (ADS)

    Ratto, M.; Young, P. C.; Romanowicz, R.; Pappenberger, F.; Saltelli, A.; Pagano, A.

    2007-05-01

    In this paper, we discuss a joint approach to calibration and uncertainty estimation for hydrologic systems that combines a top-down, data-based mechanistic (DBM) modelling methodology and a bottom-up, reductionist modelling methodology. The combined approach is applied to the modelling of the River Hodder catchment in North-West England. The top-down DBM model provides a well identified, statistically sound yet physically meaningful description of the rainfall-flow data, revealing important characteristics of the catchment-scale response, such as the nature of the effective rainfall nonlinearity and the partitioning of the effective rainfall into different flow pathways. These characteristics are defined inductively from the data without prior assumptions about the model structure, other than that it is within the generic class of nonlinear differential-delay equations. The bottom-up modelling is developed using the TOPMODEL, whose structure is assumed a priori and is evaluated by global sensitivity analysis (GSA) in order to specify the most sensitive and important parameters. The subsequent exercises in calibration and validation, performed with Generalized Likelihood Uncertainty Estimation (GLUE), are carried out in the light of the GSA and DBM analyses. This allows for the pre-calibration of the priors used for GLUE, in order to eliminate dynamical features of the TOPMODEL that have little effect on the model output and would be rejected at the structure identification phase of the DBM modelling analysis. In this way, the elements of meaningful subjectivity in the GLUE approach, which allow the modeler to interact in the modelling process by constraining the model to have a specific form prior to calibration, are combined with other more objective, data-based benchmarks for the final uncertainty estimation. GSA plays a major role in building a bridge between the hypothetico-deductive (bottom-up) and inductive (top-down) approaches and helps to improve the
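The GLUE procedure referenced above can be sketched generically: sample parameter sets from priors, score each with an informal likelihood, reject "non-behavioural" sets below a threshold, and weight the survivors to form prediction bounds. The toy recession model, synthetic data, and Nash-Sutcliffe threshold below are invented, not the paper's TOPMODEL setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(k, t):
    """Toy linear-store recession model standing in for a hydrologic model."""
    return np.exp(-k * t)

t = np.linspace(0.0, 5.0, 50)
obs = model(0.7, t) + rng.normal(0.0, 0.02, t.size)   # synthetic observations

k_prior = rng.uniform(0.1, 2.0, 5000)                 # sample the prior
sims = model(k_prior[:, None], t[None, :])            # one simulation per sample
nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()

behavioural = nse > 0.9                               # informal acceptance threshold
w = nse[behavioural] / nse[behavioural].sum()         # likelihood weights
pred = sims[behavioural]
pred_mean = w @ pred                                  # likelihood-weighted prediction

# Simple (unweighted) 5-95% prediction band at each time step
band = np.percentile(pred, [5, 95], axis=0)
print(behavioural.sum(), band.shape)
```

The "meaningful subjectivity" the abstract mentions lives in the choice of likelihood measure and threshold; the GSA/DBM pre-analysis described above is what constrains those choices before this sampling step.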

  8. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations

    NASA Astrophysics Data System (ADS)

    Solomon, Gemma C.; Reimers, Jeffrey R.; Hush, Noel S.

    2005-06-01

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally concerning the orientations at each binding site are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  9. Sensitivity of power functions to aggregation: Bias and uncertainty in radar rainfall retrieval

    NASA Astrophysics Data System (ADS)

    Sassi, M. G.; Leijnse, H.; Uijlenhoet, R.

    2014-10-01

    Rainfall retrieval using weather radar relies on power functions between radar reflectivity Z and rain rate R. The nonlinear nature of these relations complicates the comparison of rainfall estimates employing reflectivities measured at different scales. Transforming Z into R using relations that have been derived for other scales results in a bias and added uncertainty. We investigate the sensitivity of Z-R relations to spatial and temporal aggregation using high-resolution reflectivity fields for five rainfall events. Existing Z-R relations were employed to investigate the behavior of aggregated Z-R relations with scale, the aggregation bias, and the variability of the estimated rain rate. The prefactor and the exponent of aggregated Z-R relations systematically diverge with scale, showing a break that is event-dependent in the temporal domain and nearly constant in space. The systematic error associated with the aggregation bias at a given scale can become of the same order as the corresponding random error associated with intermittent sampling. The bias can be constrained by including information about the variability of Z within a certain scale of aggregation, and is largely captured by simple functions of the coefficient of variation of Z. Several descriptors of spatial and temporal variability of the reflectivity field are presented, to establish the links between variability descriptors and resulting aggregation bias. Prefactors in Z-R relations can be related to multifractal properties of the rainfall field. We find evidence of scaling breaks in the structural analysis of spatial rainfall with aggregation.
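
    The aggregation bias can be illustrated with a minimal sketch (all values are illustrative: a Marshall-Palmer-type relation Z = 200 R^1.6 and a synthetic lognormal reflectivity field, not the paper's data). Averaging Z before applying a point-scale Z-R relation overestimates the mean rain rate whenever the inverted relation is concave in Z:

```python
import numpy as np

# Illustrative Z-R relation: Z = a * R^b (Marshall-Palmer-type prefactor and exponent)
a, b = 200.0, 1.6

def rain_rate(Z):
    """Invert Z = a * R^b to get rain rate R (mm/h) from reflectivity Z (mm^6/m^3)."""
    return (Z / a) ** (1.0 / b)

rng = np.random.default_rng(0)
# Synthetic high-resolution reflectivity field (lognormal, spatially variable)
Z_hi = rng.lognormal(mean=np.log(300.0), sigma=0.8, size=10_000)

# "True" aggregate rain rate: convert at high resolution, then average
R_true = rain_rate(Z_hi).mean()

# Biased estimate: average Z first, then apply the point-scale relation
R_agg = rain_rate(Z_hi.mean())

# Because R(Z) is concave for b > 1, Jensen's inequality gives R_agg >= R_true
bias = R_agg / R_true
print(f"true mean R = {R_true:.3f} mm/h, aggregated R = {R_agg:.3f} mm/h, bias = {bias:.3f}")
```

    As the abstract notes, this systematic bias is largely captured by the coefficient of variation of Z within the aggregation scale, which here is set by the lognormal sigma.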

  10. Overcoming computational uncertainties to reveal chemical sensitivity in single molecule conduction calculations.

    PubMed

    Solomon, Gemma C; Reimers, Jeffrey R; Hush, Noel S

    2005-06-01

    In the calculation of conduction through single molecules, approximations about the geometry and electronic structure of the system are usually made in order to simplify the problem. Previously [G. C. Solomon, J. R. Reimers, and N. S. Hush, J. Chem. Phys. 121, 6615 (2004)], we have shown that, in calculations employing cluster models for the electrodes, proper treatment of the open-shell nature of the clusters is the most important computational feature required to make the results sensitive to variations in the structural and chemical features of the system. Here, we expand this and establish a general hierarchy of requirements involving treatment of geometrical approximations. These approximations are categorized into two classes: those associated with finite-dimensional methods for representing the semi-infinite electrodes, and those associated with the chemisorption topology. We show that ca. 100 unique atoms are required in order to properly characterize each electrode: using fewer atoms leads to nonsystematic variations in conductivity that can overwhelm the subtler changes. The choice of binding site is shown to be the next most important feature, while some effects that are difficult to control experimentally concerning the orientations at each binding site are actually shown to be insignificant. Verification of this result provides a general test for the precision of computational procedures for molecular conductivity. Predictions concerning the dependence of conduction on substituent and other effects on the central molecule are found to be meaningful only when they exceed the uncertainties of the effects associated with binding-site variation.

  11. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    Technology Transfer Automated Retrieval System (TEKTRAN)

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  12. Sensitivity and uncertainty analysis of atmospheric ozone photochemistry models. Final report, September 30, 1993--December 31, 1998

    SciTech Connect

    Smith, G.P.

    1999-03-01

    The author has examined the kinetic reliability of ozone model predictions by computing direct first-order sensitivities of model species concentrations to input parameters: S{sub ij} = [dC{sub i}/C{sub i}]/[dk{sub j}/k{sub j}], where C{sub i} is the abundance of species i (e.g., ozone) and k{sub j} is the rate constant of step j (reaction, photolysis, or transport), for localized boxes from the LLNL 2-D diurnally averaged atmospheric model. An ozone sensitivity survey of boxes at altitudes of 10--55 km, 2--62N latitude, for spring, equinox, and winter is presented. Ozone sensitivities are used to evaluate the response of model predictions of ozone to input rate coefficient changes, to propagate laboratory rate uncertainties through the model, and to select processes and regions suited to more precise measurements. By including the local chemical feedbacks, the sensitivities quantify the important roles of oxygen and ozone photolysis, transport from the tropics, and the relation of key catalytic steps and cycles in regulating stratospheric ozone as a function of altitude, latitude, and season. A sensitivity-uncertainty analysis uses the sensitivity coefficients to propagate laboratory error bars in input photochemical parameters and estimate the net model uncertainties of predicted ozone in isolated boxes; it was applied to potential problems in the upper stratospheric ozone budget, and also highlights superior regions for model validation.
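
    The normalized first-order sensitivities S = (dC/C)/(dk/k) used here can be sketched with a central finite difference on a hypothetical toy model (`ozone_box` and its rate constants are illustrative stand-ins, not the LLNL 2-D model):

```python
import numpy as np

def ozone_box(k):
    """Toy steady-state ozone abundance for illustration only:
    a production/loss balance of the form C = k_src * sqrt(k_prod / k_loss)."""
    k_src, k_prod, k_loss = k
    return k_src * np.sqrt(k_prod / k_loss)

def normalized_sensitivity(model, k, j, rel_step=1e-4):
    """S_j = (dC/C) / (dk_j/k_j), estimated by a central finite difference."""
    k_up, k_dn = k.copy(), k.copy()
    k_up[j] *= 1.0 + rel_step
    k_dn[j] *= 1.0 - rel_step
    C0 = model(k)
    return (model(k_up) - model(k_dn)) / (2.0 * rel_step * C0)

k0 = np.array([1.0, 2.0, 0.5])
S = [normalized_sensitivity(ozone_box, k0, j) for j in range(3)]
# For this toy model the analytic values are +1, +0.5, and -0.5
print(S)
```

    Squaring these coefficients and weighting them by laboratory rate uncertainties (in log space) then propagates the error bars through the model, as in the sensitivity-uncertainty analysis described above.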

  13. WE-D-BRE-07: Variance-Based Sensitivity Analysis to Quantify the Impact of Biological Uncertainties in Particle Therapy

    SciTech Connect

    Kamp, F.; Brueningk, S.C.; Wilkens, J.J.

    2014-06-15

    Purpose: In particle therapy, treatment planning and evaluation are frequently based on biological models to estimate the relative biological effectiveness (RBE) or the equivalent dose in 2 Gy fractions (EQD2). In the context of the linear-quadratic model, these quantities depend on biological parameters (α, β) for ions as well as for the reference radiation and on the dose per fraction. The needed biological parameters as well as their dependency on ion species and ion energy typically are subject to large (relative) uncertainties of up to 20–40% or even more. Therefore it is necessary to estimate the resulting uncertainties in e.g. RBE or EQD2 caused by the uncertainties of the relevant input parameters. Methods: We use a variance-based sensitivity analysis (SA) approach, in which uncertainties in input parameters are modeled by random number distributions. The evaluated function is executed 10{sup 4} to 10{sup 6} times, each run with a different set of input parameters, randomly varied according to their assigned distribution. The sensitivity S is a variance-based ranking (from S = 0, no impact, to S = 1, the only influential input) of the impact of input uncertainties. The SA approach is implemented for carbon ion treatment plans on 3D patient data, providing information about variations (and their origin) in RBE and EQD2. Results: The quantification enables 3D sensitivity maps, showing dependencies of RBE and EQD2 on different input uncertainties. The high number of runs allows displaying the interplay between different input uncertainties. The SA identifies input parameter combinations which result in extreme deviations of the result and the input parameter for which an uncertainty reduction is the most rewarding. Conclusion: The presented variance-based SA provides advantageous properties in terms of visualization and quantification of (biological) uncertainties and their impact. The method is very flexible, model independent, and enables a broad assessment
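
    The variance-based index S = Var(E[Y|X])/Var(Y) can be sketched on a toy linear-quadratic effect with invented α and β distributions (a simple binning estimator stands in for the Monte Carlo machinery described in the abstract; all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# Toy linear-quadratic effect with two uncertain biological inputs (illustrative values)
alpha = rng.normal(0.15, 0.03, N)   # alpha with ~20% relative uncertainty
beta = rng.normal(0.05, 0.02, N)    # beta with ~40% relative uncertainty
dose = 2.0
Y = alpha * dose + beta * dose**2   # linear-quadratic biological effect

def first_order_index(X, Y, bins=50):
    """Estimate S = Var(E[Y|X]) / Var(Y) by binning X into equal-probability strata."""
    edges = np.quantile(X, np.linspace(0.0, 1.0, bins + 1))
    idx = np.clip(np.searchsorted(edges, X, side="right") - 1, 0, bins - 1)
    cond_means = np.array([Y[idx == b].mean() for b in range(bins)])
    counts = np.bincount(idx, minlength=bins)
    var_cond = np.average((cond_means - Y.mean()) ** 2, weights=counts)
    return var_cond / Y.var()

S_alpha = first_order_index(alpha, Y)
S_beta = first_order_index(beta, Y)
print(S_alpha, S_beta)  # each S lies in [0, 1]; they nearly sum to 1 for this additive model
```

    Ranking the inputs by S identifies the parameter for which an uncertainty reduction is the most rewarding, which is the use made of the indices above.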

  14. PEBBED Uncertainty and Sensitivity Analysis of the CRP-5 PBMR DLOFC Transient Benchmark with the SUSA Code

    SciTech Connect

    Gerhard Strydom

    2011-01-01

    The need for a defendable and systematic uncertainty and sensitivity approach that conforms to the Code Scaling, Applicability, and Uncertainty (CSAU) process, and that could be used for a wide variety of software codes, was defined in 2008. The GRS (Gesellschaft für Anlagen und Reaktorsicherheit) company of Germany has developed one type of CSAU approach that is particularly well suited for legacy coupled core analysis codes, and a trial version of their commercial software product SUSA (Software for Uncertainty and Sensitivity Analyses) was acquired on May 12, 2010. This report summarizes the results of the initial investigations performed with SUSA, utilizing a typical High Temperature Reactor benchmark (the IAEA CRP-5 PBMR 400MW Exercise 2) and the PEBBED-THERMIX suite of codes. The following steps were performed as part of the uncertainty and sensitivity analysis: 1. Eight PEBBED-THERMIX model input parameters were selected for inclusion in the uncertainty study: the total reactor power, inlet gas temperature, decay heat, and the specific heat capacity and thermal conductivity of the fuel, pebble bed and reflector graphite. 2. The input parameter variations and probability density functions were specified, and a total of 800 PEBBED-THERMIX model calculations were performed, divided into 4 sets of 100 and 2 sets of 200 Steady State and Depressurized Loss of Forced Cooling (DLOFC) transient calculations each. 3. The steady state and DLOFC maximum fuel temperature, as well as the daily pebble fuel load rate data, were supplied to SUSA as model output parameters of interest. The 6 data sets were statistically analyzed to determine the 5% and 95% percentile values for each of the 3 output parameters with a 95% confidence level, and typical statistical indicators were also generated (e.g. Kendall, Pearson and Spearman coefficients). 4. A SUSA sensitivity study was performed to obtain correlation data between the input and output parameters, and to identify the
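
    The percentile and rank-correlation steps can be sketched as follows; the input distributions and the linear stand-in for the PEBBED-THERMIX response are hypothetical, chosen only to mimic the structure of the study (a run count comparable to the 100/200-run sets above):

```python
import numpy as np

rng = np.random.default_rng(2)
n_runs = 200

# Hypothetical uncertain inputs (distributions and values invented for illustration)
power = rng.normal(400.0, 8.0, n_runs)        # total reactor power, MW
inlet_T = rng.normal(500.0, 10.0, n_runs)     # inlet gas temperature, deg C
k_graphite = rng.uniform(20.0, 40.0, n_runs)  # graphite thermal conductivity, W/m/K

# Stand-in "model": max fuel temperature rises with power and inlet temperature,
# falls with conductivity, plus a small residual scatter
T_max = (900.0 + 1.2 * (power - 400.0) + 0.8 * (inlet_T - 500.0)
         - 4.0 * (k_graphite - 30.0) + rng.normal(0.0, 5.0, n_runs))

# 5%/95% percentile values of the output of interest
lo, hi = np.percentile(T_max, [5, 95])

def spearman(x, y):
    """Spearman rank correlation (pure-NumPy stand-in for scipy.stats.spearmanr)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

print(f"5%/95% percentiles of T_max: {lo:.1f} / {hi:.1f}")
for name, x in [("power", power), ("inlet_T", inlet_T), ("k_graphite", k_graphite)]:
    print(name, spearman(x, T_max))
```

    The signs and magnitudes of the rank correlations play the role of the Kendall/Pearson/Spearman indicators mentioned in the report, ranking which inputs drive the output spread.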

  15. Uncertainty, sensitivity analysis and the role of data based mechanistic modeling in hydrology

    NASA Astrophysics Data System (ADS)

    Ratto, M.; Young, P. C.; Romanowicz, R.; Pappenberger, F.; Saltelli, A.; Pagano, A.

    2006-09-01

    In this paper, we discuss the problem of calibration and uncertainty estimation for hydrologic systems from two points of view: a bottom-up, reductionist approach; and a top-down, data-based mechanistic (DBM) approach. The two approaches are applied to the modelling of the River Hodder catchment in North-West England. The bottom-up approach is developed using the TOPMODEL, whose structure is evaluated by global sensitivity analysis (GSA) in order to specify the most sensitive and important parameters; and the subsequent exercises in calibration and validation are carried out in the light of this sensitivity analysis. GSA helps to improve the calibration of hydrological models, making their properties more transparent and highlighting mis-specification problems. The DBM model provides a quick and efficient analysis of the rainfall-flow data, revealing important characteristics of the catchment-scale response, such as the nature of the effective rainfall nonlinearity and the partitioning of the effective rainfall into different flow pathways. TOPMODEL calibration takes more time and it explains the flow data a little less well than the DBM model. The main differences in the modelling results are in the nature of the models and the flow decomposition they suggest. The "quick" (63%) and "slow" (37%) components of the decomposed flow identified in the DBM model show a clear partitioning of the flow, with the quick component apparently accounting for the effects of surface and near surface processes; and the slow component arising from the displacement of groundwater into the river channel (base flow). On the other hand, the two output flow components in TOPMODEL have a different physical interpretation, with a single flow component (95%) accounting for both slow (subsurface) and fast (surface) dynamics, while the other, very small component (5%) is interpreted as an instantaneous surface runoff generated by rainfall falling on areas of saturated soil. The results of

  16. OECD/NEA expert group on uncertainty analysis for criticality safety assessment: Results of benchmark on sensitivity calculation (phase III)

    SciTech Connect

    Ivanova, T.; Laville, C.; Dyrda, J.; Mennerdahl, D.; Golovko, Y.; Raskach, K.; Tsiboulia, A.; Lee, G. S.; Woo, S. W.; Bidaud, A.; Sabouri, P.; Bledsoe, K.; Rearden, B.; Gulliford, J.; Michel-Sendis, F.

    2012-07-01

    The sensitivities of the k{sub eff} eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods. (authors)

  17. Valley Floor

    NASA Technical Reports Server (NTRS)

    2003-01-01

    MGS MOC Release No. MOC2-529, 30 October 2003

    This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows the floor of an ancient valley located near the Pyrrhae Chaos region of Mars. This valley might have been carved by liquid water, but today no evidence remains that a fluid ever flowed through it. Long after the valley formed, its floor was covered by large, windblown, ripple-like dunes. This picture is located near 13.0oS, 31.2oW. The image is illuminated by sunlight from the upper left and covers an area 3 km (1.9 mi) wide.

  18. Sensitivity and uncertainty analysis for the annual P loss estimator (APLE) model

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...

  19. Two-dimensional cross-section sensitivity and uncertainty analysis of the LBM (Lithium Blanket Module) experiments at LOTUS

    SciTech Connect

    Davidson, J.W.; Dudziak, D.J.; Pelloni, S.; Stepanek, J.

    1988-01-01

    In a recent common Los Alamos/PSI effort, a sensitivity and nuclear data uncertainty path for the modular code system AARE (Advanced Analysis for Reactor Engineering) was developed. This path includes the cross-section code TRAMIX, the one-dimensional finite difference S/sub N/-transport code ONEDANT, the two-dimensional finite element S/sub N/-transport code TRISM, and the one- and two-dimensional sensitivity and nuclear data uncertainty code SENSIBL. Within the framework of the present work a complete set of forward and adjoint two-dimensional TRISM calculations were performed both for the bare LBM, as well as for the Pb- and Be-preceded LBM, using MATXS8 libraries. Then a two-dimensional sensitivity and uncertainty analysis for all cases was performed. The goal of this analysis was the determination of the uncertainties of a calculated tritium production per source neutron from lithium along the central Li/sub 2/O rod in the LBM. Considered were the contributions from /sup 1/H, /sup 6/Li, /sup 7/Li, /sup 9/Be, /sup nat/C, /sup 14/N, /sup 16/O, /sup 23/Na, /sup 27/Al, /sup nat/Si, /sup nat/Cr, /sup nat/Fe, /sup nat/Ni, and /sup nat/Pb. 22 refs., 1 fig., 3 tabs.

  20. Sensitivity of the remote sensing reflectance of ocean and coastal waters to uncertainties in aerosol characteristics

    NASA Astrophysics Data System (ADS)

    Seidel, F. C.; Garay, M. J.; Zhai, P.; Kalashnikova, O. V.; Diner, D. J.

    2015-12-01

    Remote sensing is a powerful tool for optical oceanography and limnology to monitor and study ocean, coastal, and inland water ecosystems. However, the highly spatially and temporally variable nature of water conditions and constituents, as well as atmospheric conditions, presents challenging factors, especially for spaceborne observations. Here, we study the quantitative impact of uncertainties in the spectral aerosol optical and microphysical properties, namely aerosol optical depth (AOD), spectral absorption, and particle size, on the remote sensing reflectance (Rrs) of simulated typical open ocean and coastal waters. Rrs is related to the inherent optical properties of the water column and is a fundamental parameter in ocean optics retrievals. We use the successive order of scattering (SOS) method to perform radiative transfer calculations of the coupled system of atmosphere and water. The optics of typical open ocean and coastal waters are simulated with bio-optical models. We derive sensitivities by comparing spectral SOS calculations of Rrs with a reference aerosol model against similar calculations performed using a different aerosol model. One particular focus of this study is the impact on Rrs of the spectral absorption of dust, brown carbon, and similar particles with greater absorption at short wavelengths. The results are presented in terms of the minimum expected error in Rrs due to the choice of an incorrect aerosol model during the atmospheric correction of ocean color remote sensing data from space. This study is independent of errors related to observational data or retrieval techniques. The results are relevant for quantifying requirements of aerosol retrievals to derive accurate Rrs from spaceborne observations, such as NASA's future Pre-Aerosol, Clouds, and ocean Ecosystem (PACE) mission.

  1. Sensitivity analysis and quantification of uncertainty for isotopic mixing relationships in carbon cycle research

    NASA Astrophysics Data System (ADS)

    Zobitz, J. M.; Keener, J. P.; Bowling, D. R.

    2004-12-01

    Quantifying and understanding the uncertainty in isotopic mixing relationships is critical to isotopic applications in carbon cycle studies at all spatial and temporal scales. Studies associated with the North American Carbon Program will depend on stable isotope approaches and quantification of isotopic uncertainty. An important application of isotopic mixing relationships is determination of the isotopic content of large-scale respiration (δ 13CR) via an inverse relationship (a Keeling plot) between atmospheric CO2 concentrations ([CO2]) and carbon isotope ratios of CO2 (δ 13C). Alternatively, a linear relationship between [CO2] and the product of [CO2] and δ 13C (a Miller/Tans plot) can also be applied. We used an extensive dataset from the Niwot Ridge Ameriflux Site of [CO2] and δ 13C in forest air to examine contrasting approaches to determine δ 13CR and its uncertainty. These included Keeling plots, Miller/Tans plots, Model I, and Model II regressions. Our analysis confirms previous observations that increasing the range of measurements ([CO2] range) reduces the uncertainty associated with δ 13CR. For carbon isotope studies, uncertainty in the isotopic measurements has a greater effect on the uncertainty of δ 13CR than the uncertainty in [CO2]. Reducing the uncertainty of isotopic measurements reduces the uncertainty of δ 13CR even when the [CO2] range of samples is small (< 20 ppm). As a result, improvement in isotope (rather than CO2) measuring capability is needed to substantially reduce uncertainty in δ 13CR. We also find for carbon isotope studies no inherent advantage to using either a Keeling or a Miller/Tans approach to determine δ 13CR.
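
    Both regressions can be sketched on synthetic forest-air samples (the background and respiration values below are illustrative, not the Niwot Ridge data). In a Keeling plot the intercept of δ13C vs 1/[CO2] estimates δ13CR; in a Miller/Tans plot the slope of [CO2]·δ13C vs [CO2] estimates the same quantity:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic nighttime forest-air samples: background CO2 mixed with respired CO2
d13C_bg, C_bg = -8.0, 380.0      # background isotope ratio (permil) and [CO2] (ppm)
d13C_R_true = -26.0              # "true" isotopic signature of respiration
C_resp = rng.uniform(5.0, 40.0, 40)          # respired CO2 added to each sample (ppm)
C = C_bg + C_resp
# Isotopic mass balance: C * d13C = C_bg * d13C_bg + C_resp * d13C_R
d13C = (C_bg * d13C_bg + C_resp * d13C_R_true) / C
d13C += rng.normal(0.0, 0.05, C.size)        # isotopic measurement error (permil)

# Keeling plot: d13C vs 1/C; the intercept estimates d13C_R
slope_k, intercept_k = np.polyfit(1.0 / C, d13C, 1)

# Miller/Tans plot: C*d13C vs C; the slope estimates d13C_R
slope_mt, intercept_mt = np.polyfit(C, C * d13C, 1)

print(f"Keeling intercept: {intercept_k:.2f} permil, Miller/Tans slope: {slope_mt:.2f} permil")
```

    Shrinking the `C_resp` range or inflating the 0.05 permil isotopic noise in this sketch reproduces the abstract's finding that the [CO2] range and the isotope measurement precision control the uncertainty of δ13CR.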

  2. Uncertainty and Sensitivity Analysis Results Obtained in the 1996 Performance Assessment for the Waste Isolation Pilot Plant

    SciTech Connect

    Bean, J.E.; Berglund, J.W.; Davis, F.J.; Economy, K.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; MacKinnon, R.J.; Miller, J.; O'Brien, D.G.; Ramsey, J.L.; Schreiber, J.D.; Shinta, A.; Smith, L.N.; Stockman, C.; Stoelzel, D.M.; Vaughn, P.

    1998-09-01

    The Waste Isolation Pilot Plant (WIPP) is located in southeastern New Mexico and is being developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. A detailed performance assessment (PA) for the WIPP was carried out in 1996 and supports an application by the DOE to the U.S. Environmental Protection Agency (EPA) for the certification of the WIPP for the disposal of TRU waste. The 1996 WIPP PA uses a computational structure that maintains a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000 yr regulatory period that applies to the WIPP and subjective uncertainty arising from the imprecision with which many of the quantities required in the PA are known. Important parts of this structure are (1) the use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (2) the use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (3) the efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The use of Latin hypercube sampling generates a mapping from imprecisely known analysis inputs to analysis outcomes of interest that provides both a display of the uncertainty in analysis outcomes (i.e., uncertainty analysis) and a basis for investigating the effects of individual inputs on these outcomes (i.e., sensitivity analysis). The sensitivity analysis procedures used in the PA include examination of scatterplots, stepwise regression analysis, and partial correlation analysis. Uncertainty and sensitivity analysis results obtained as part of the 1996 WIPP PA are presented and discussed. Specific topics considered include two phase flow in the vicinity of the repository, radionuclide release from the repository, fluid flow and radionuclide
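
    Latin hypercube sampling, the first ingredient listed above, can be sketched in a few lines; this is a generic textbook implementation on the unit hypercube, not the WIPP PA code. Each parameter's range is split into n equal-probability strata, and each stratum is sampled exactly once:

```python
import numpy as np

def latin_hypercube(n_samples, n_params, rng):
    """Latin hypercube sample on [0,1]^d: one point in each of n equal-probability
    strata per parameter, with the strata randomly permuted per column."""
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        u[:, j] = rng.permutation(u[:, j])
    return u

rng = np.random.default_rng(4)
lhs = latin_hypercube(10, 3, rng)

# Each column has exactly one sample in each of the 10 strata [k/10, (k+1)/10)
strata = np.sort(np.floor(lhs * 10).astype(int), axis=0)
print(strata[:, 0])
```

    In a PA-style workflow the unit-cube samples are mapped through each parameter's inverse CDF, the mechanistic model is run once per row, and the resulting input/output mapping feeds the scatterplot, stepwise regression, and partial correlation analyses described above.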

  3. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  4. Kiwi: An Evaluated Library of Uncertainties in Nuclear Data and Package for Nuclear Sensitivity Studies

    SciTech Connect

    Pruet, J

    2007-06-23

    This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. As well, Kiwi provides access to calculations of k eigenvalues for critical assemblies. This allows the user to check implications of data modifications against integral experiments for multiplying systems. Kiwi is written in python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B division.

  5. SENSITIVITY OF THE NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION MULTILAYER MODEL TO INSTRUMENT ERROR AND PARAMETERIZATION UNCERTAINTY

    EPA Science Inventory

    The response of the National Oceanic and Atmospheric Administration multilayer inferential dry deposition velocity model (NOAA-MLM) to error in meteorological inputs and model parameterization is reported. Monte Carlo simulations were performed to assess the uncertainty in NOA...

  6. Uncertainty Analysis of Ozone Formation and Response to Emission Controls Using Higher-Order Sensitivities

    EPA Science Inventory

    Understanding ozone response to its precursor emissions is crucial for effective air quality management practices. This nonlinear response is usually simulated using chemical transport models, and the modeling results are affected by uncertainties in emissions inputs. In this stu...

  7. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    NASA Astrophysics Data System (ADS)

    Wang, Shitao; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Winokur, Justin; Knio, Omar M.

    2016-05-01

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel heights uncertainties are then mainly due to uncertainties in the 95% percentile of the droplet size and in the entrainment parameters.
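
    The non-intrusive surrogate step can be sketched with a hypothetical one-parameter stand-in for the plume model (`plume_height`, the flow-rate distribution, and all numbers are invented for illustration). Model outputs are regressed on probabilists' Hermite polynomials of the standard-normal germ, and the surrogate's mean and variance then follow from orthogonality:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

rng = np.random.default_rng(9)

def plume_height(q):
    """Toy stand-in for the plume model's trap height as a function of flow rate q."""
    return 60.0 * q ** 0.25

# Uncertain flow rate: lognormal around a nominal value (illustrative range)
xi = rng.standard_normal(2000)        # standard normal "germ" of the expansion
q = 5000.0 * np.exp(0.3 * xi)         # flow rate realizations

# Non-intrusive PCE: regress model outputs on probabilists' Hermite polynomials of xi
deg = 4
Psi = hermevander(xi, deg)            # columns [He_0(xi), ..., He_4(xi)]
coef, *_ = np.linalg.lstsq(Psi, plume_height(q), rcond=None)

# Surrogate statistics from orthogonality: mean = c_0, var = sum_{k>0} c_k^2 * k!
mean_pce = coef[0]
var_pce = sum(coef[k] ** 2 * factorial(k) for k in range(1, deg + 1))
print(mean_pce, var_pce)
```

    The per-term variances c_k^2 k! (and, with several inputs, the per-input groupings of such terms) provide the analysis of variance used above to rank the flow rate against the droplet-size and entrainment parameters.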

  8. Sensitivity and uncertainty analysis of the Variable Infiltration Capacity model in the upstream of Heihe River basin

    NASA Astrophysics Data System (ADS)

    He, R.; Pang, B.

    2015-05-01

    The increasing water problems and eco-environmental issues of Heihe River basin have attracted widespread attention. In this research, the VIC (Variable Infiltration Capacity) model was selected to simulate the water cycle of the upstream in Heihe River basin. The GLUE (Generalized Likelihood Uncertainty Estimation) method was used to study the sensitivity of the model parameters and the uncertainty of model outputs. The results showed that the Nash-Sutcliffe efficiency coefficient was 0.62 in the calibration period and 0.64 in the validation period. Of the seven selected parameters, Dm (maximum baseflow that can occur from the third soil layer), Ws (fraction of the maximum soil moisture of the third soil layer where non-linear baseflow occurs), and d1 (soil depth of the first soil layer), were very sensitive, especially d1. Observed discharges were almost in the range of the 95% predicted confidence range.
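
    The GLUE procedure can be sketched on a synthetic hydrograph (the toy two-parameter model, the uniform priors, and the 0.6 NSE threshold are illustrative, not the VIC setup): sample parameter sets from the priors, keep the "behavioural" runs whose Nash-Sutcliffe efficiency exceeds the threshold, and form prediction bounds from that ensemble:

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency between simulated and observed series."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

t = np.arange(100.0)
obs = 5.0 + 3.0 * np.exp(-(((t - 40.0) / 12.0) ** 2))   # synthetic observed hydrograph

def model(amp, width):
    """Toy rainfall-runoff stand-in with two uncertain parameters."""
    return 5.0 + amp * np.exp(-(((t - 40.0) / width) ** 2))

rng = np.random.default_rng(5)
n = 2000
amps = rng.uniform(1.0, 5.0, n)       # uniform priors over plausible ranges
widths = rng.uniform(5.0, 25.0, n)
sims = np.array([model(a, w) for a, w in zip(amps, widths)])
scores = np.array([nse(s, obs) for s in sims])

# GLUE: retain "behavioural" parameter sets above the likelihood threshold
behavioural = scores > 0.6
band_lo = np.percentile(sims[behavioural], 2.5, axis=0)
band_hi = np.percentile(sims[behavioural], 97.5, axis=0)
coverage = ((obs >= band_lo) & (obs <= band_hi)).mean()
print(behavioural.sum(), "behavioural runs; fraction of obs inside band:", coverage)
```

    The coverage printed at the end is the analogue of the abstract's check that observed discharges fall almost entirely within the 95% prediction range, and the spread of behavioural parameter values indicates which parameters (here `amp` vs `width`) are well constrained, i.e. sensitive.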

  9. Thoughts on Sensitivity Analysis and Uncertainty Propagation Methods with Respect to the Prompt Fission Neutron Spectrum Impact on Critical Assemblies

    SciTech Connect

    Rising, M.E.

    2015-01-15

    The prompt fission neutron spectrum (PFNS) uncertainties in the n+{sup 239}Pu fission reaction are used to study the impact on several fast critical assemblies modeled in the MCNP6.1 code. The newly developed sensitivity capability in MCNP6.1 is used to compute the k{sub eff} sensitivity coefficients with respect to the PFNS. In comparison, the covariance matrix given in the ENDF/B-VII.1 library is decomposed and randomly sampled realizations of the PFNS are propagated through the criticality calculation, preserving the PFNS covariance matrix. The information gathered from both approaches, including the overall k{sub eff} uncertainty, is statistically analyzed. Overall, the forward and backward approaches agree as expected. The results from the new method appear to be limited by the process used to evaluate the PFNS, which is not necessarily a flaw of the method itself. Final thoughts and directions for future work are suggested.
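
    The decompose-and-sample step can be sketched as follows, with a hypothetical five-group spectrum and covariance in place of the ENDF/B-VII.1 evaluation (all numbers are illustrative). A Cholesky factor of the covariance turns independent standard normals into correlated spectrum realizations whose sample covariance reproduces the input covariance:

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical nominal spectrum over 5 coarse energy groups (illustrative, not evaluated data)
chi = np.array([0.10, 0.30, 0.35, 0.20, 0.05])

# Hypothetical relative covariance: 5% per-group uncertainty, correlation decaying with distance
rel_sd = 0.05
g = np.arange(chi.size)
corr = 0.5 ** np.abs(g[:, None] - g[None, :])
cov = np.outer(rel_sd * chi, rel_sd * chi) * corr

# Decompose the covariance and draw correlated realizations, preserving cov
L = np.linalg.cholesky(cov)
n = 20_000
raw = chi + rng.standard_normal((n, chi.size)) @ L.T

# Each realization is renormalized to unit sum before use as a spectrum
realizations = raw / raw.sum(axis=1, keepdims=True)

emp_cov = np.cov(raw, rowvar=False)
print(np.abs(emp_cov - cov).max())   # sample covariance reproduces the input covariance
```

    Propagating each realization through the criticality calculation and collecting the spread in k_eff gives the "forward" uncertainty that is compared against the sensitivity-coefficient ("backward") estimate in the abstract.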

  10. Antoniadi's Floor

    NASA Technical Reports Server (NTRS)

    2005-01-01

    16 February 2005 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows landforms on the floor of Antoniadi Crater. The circular features were once meteor impact craters that have been almost completely eroded away.

    Location near: 21.6oN, 297.4oW Image width: 3.0 km (1.9 mi) Illumination from: upper left Season: Northern Summer

  11. Trough Floor

    NASA Technical Reports Server (NTRS)

    2005-01-01

    3 March 2005 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows boulders on the floor of a wide trough in Memnonia Fossae.

    Location near: 18.8oS, 150.3oW Image width: 3 km (1.9 mi) Illumination from: upper left Season: Southern Winter

  12. Brief Report: Effects of Sensory Sensitivity and Intolerance of Uncertainty on Anxiety in Mothers of Children with Autism Spectrum Disorder.

    PubMed

    Uljarević, Mirko; Carrington, Sarah; Leekam, Susan

    2016-01-01

    This study examined the relations between anxiety and individual characteristics of sensory sensitivity (SS) and intolerance of uncertainty (IU) in mothers of children with ASD. The mothers of 50 children completed the Hospital Anxiety and Depression Scale, the Highly Sensitive Person Scale and the IU Scale. Anxiety was associated with both SS and IU, and IU was also associated with SS. Mediation analyses showed direct effects between anxiety and both IU and SS, but a significant indirect effect was found only in the model in which IU mediated the relation between SS and anxiety. This is the first study to characterize the nature of the IU and SS interrelation in predicting levels of anxiety.
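
    The product-of-coefficients form of such a mediation analysis can be sketched on synthetic scores (the effect sizes and sample below are invented for illustration, not the study's data): the indirect effect of SS on anxiety through IU is the product of the SS-to-IU path and the IU-to-anxiety path controlling for SS:

```python
import numpy as np

rng = np.random.default_rng(8)
n = 50  # same sample size as the study

# Synthetic standardized scores with an SS -> IU -> anxiety structure (illustrative)
SS = rng.normal(0.0, 1.0, n)
IU = 0.6 * SS + rng.normal(0.0, 0.8, n)
anxiety = 0.5 * IU + 0.2 * SS + rng.normal(0.0, 0.8, n)

def slope(x, y):
    """OLS slope of y on x (single predictor)."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

# Path a: SS -> IU
a = slope(SS, IU)
# Path b: IU -> anxiety, controlling for SS (multiple regression coefficient)
X = np.column_stack([np.ones(n), IU, SS])
b = np.linalg.lstsq(X, anxiety, rcond=None)[0][1]

indirect = a * b   # product-of-coefficients estimate of the indirect effect
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect = {indirect:.2f}")
```

    In practice the significance of the indirect effect is assessed by bootstrapping the a*b product, which is the usual test behind mediation results like the one reported above.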

  13. Sensitivity of Polar Stratospheric Ozone Loss to Uncertainties in Chemical Reaction Kinetics

    NASA Technical Reports Server (NTRS)

    Kawa, S. Randolph; Stolarksi, Richard S.; Douglass, Anne R.; Newman, Paul A.

    2008-01-01

    Several recent observational and laboratory studies of processes involved in polar stratospheric ozone loss have prompted a reexamination of aspects of our understanding for this key indicator of global change. To a large extent, our confidence in understanding and projecting changes in polar and global ozone is based on our ability to simulate these processes in numerical models of chemistry and transport. The fidelity of the models is assessed in comparison with a wide range of observations. These models depend on laboratory-measured kinetic reaction rates and photolysis cross sections to simulate molecular interactions. A typical stratospheric chemistry mechanism has on the order of 50-100 species undergoing over a hundred intermolecular reactions and several tens of photolysis reactions. The rates of all of these reactions are subject to uncertainty, some substantial. Given the complexity of the models, however, it is difficult to quantify uncertainties in many aspects of the system. In this study we use a simple box-model scenario for Antarctic ozone to estimate the uncertainty in loss attributable to known reaction kinetic uncertainties. Following the method of earlier work, rates and uncertainties from the latest laboratory evaluations are applied in random combinations. We determine the key reactions and rates contributing the largest potential errors and compare the results to observations to evaluate which combinations are consistent with atmospheric data. Implications for our theoretical and practical understanding of polar ozone loss will be assessed.
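    The "random combinations" approach can be sketched as follows; the reactions, rate values, uncertainty factors, and loss proxy are all hypothetical stand-ins for the actual box-model chemistry, not the laboratory evaluation itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical rate constants (cm^3 s^-1) and multiplicative uncertainty
# factors f; a real evaluation covers on the order of 100 reactions.
k_nominal = {"ClO+ClO+M": 2.0e-32, "ClO+BrO": 2.9e-12, "Cl+O3": 2.3e-11}
f = {"ClO+ClO+M": 1.4, "ClO+BrO": 1.25, "Cl+O3": 1.15}

def sample_rates(rng):
    """One random combination: scale each rate by f**u with u ~ N(0, 1),
    so rates stay positive and ~68% of draws lie within [k/f, k*f]."""
    return {r: k * f[r] ** rng.standard_normal() for r, k in k_nominal.items()}

def ozone_loss_proxy(k):
    """Stand-in for the box-model ozone loss (not the real chemistry):
    assume loss is dominated by the ClO-dimer and ClO+BrO cycles."""
    return 0.7 * k["ClO+ClO+M"] / 2.0e-32 + 0.3 * k["ClO+BrO"] / 2.9e-12

losses = [ozone_loss_proxy(sample_rates(rng)) for _ in range(5000)]
print(f"loss relative to nominal: {np.mean(losses):.2f} +/- {np.std(losses):.2f}")
```

    Ranking the spread contributed by each reaction (e.g., by correlating sampled rates with the loss output) identifies the key reactions in the sense described above.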

  14. Floor Chemical Basics.

    ERIC Educational Resources Information Center

    Shaw, Richard

    1998-01-01

    Discusses the issues to consider when selecting floor-care chemicals, including the floor-finish systems for hard-surface floors and the care of carpeted floors. Provides thoughts on cleaning chemical usage and environmental awareness. (GR)

  15. Floors: Selection and Maintenance.

    ERIC Educational Resources Information Center

    Berkeley, Bernard

    Flooring for institutional, commercial, and industrial use is described with regard to its selection, care, and maintenance. The following flooring and subflooring material categories are discussed--(1) resilient floor coverings, (2) carpeting, (3) masonry floors, (4) wood floors, and (5) "formed-in-place floors". The properties, problems,…

  16. UNCERTAINTY AND SENSITIVITY ANALYSES FOR INTEGRATED HUMAN HEALTH AND ECOLOGICAL RISK ASSESSMENT OF HAZARDOUS WASTE DISPOSAL

    EPA Science Inventory

    While there is a high potential for exposure of humans and ecosystems to chemicals released from hazardous waste sites, the degree to which this potential is realized is often uncertain. Conceptually divided among parameter, model, and modeler uncertainties imparted during simula...

  17. Analysis of model sensitivity and predictive uncertainty of capture zones in the Espanola Basin regional aquifer, Northern New Mexico

    SciTech Connect

    Vesselinov, V. V.; Keating, E. H.; Zyvoloski, G. A.

    2002-01-01

    Predictions and their uncertainty are key aspects of any modeling effort. The prediction uncertainty can be significant when the predictions depend on uncertain system parameters. We analyze prediction uncertainties through constrained nonlinear second-order optimization of an inverse model. The optimized objective function is the weighted squared-difference between observed and simulated system quantities (flux and time-dependent head data). The constraints are defined by the maximization/minimization of the prediction within a given objective-function range. The method is applied in capture-zone analyses of groundwater-supply systems using a three-dimensional numerical model of the Espanola Basin aquifer. We use the finite-element simulator FEHM coupled with parameter-estimation/predictive-analysis code PEST. The model is run in parallel on a multi-processor supercomputer. We estimate sensitivity and uncertainty of model predictions such as capture-zone identification and travel times. While the methodology is extremely powerful, it is numerically intensive.

  18. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis.

    SciTech Connect

    Eldred, Michael Scott; Vigil, Dena M.; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Lefantzi, Sophia; Hough, Patricia Diane; Eddy, John P.

    2011-12-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a theoretical manual for selected algorithms implemented within the DAKOTA software. It is not intended as a comprehensive theoretical treatment, since a number of existing texts cover general optimization theory, statistical analysis, and other introductory topics. Rather, this manual is intended to summarize a set of DAKOTA-related research publications in the areas of surrogate-based optimization, uncertainty quantification, and optimization under uncertainty that provide the foundation for many of DAKOTA's iterative analysis capabilities.

  19. Scalloped Floor

    NASA Technical Reports Server (NTRS)

    2006-01-01

    13 July 2006 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows erosional remnants of layered rock and large windblown ripples on the floor of a crater in the Tyrrhena Terra region of Mars. The layered rocks are most likely sedimentary.

    Location near: 15.5°S, 270.5°W Image width: 3 km (1.9 mi) Illumination from: upper left Season: Southern Autumn

  20. Tiled Floor

    NASA Technical Reports Server (NTRS)

    2006-01-01

    30 April 2006 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a variety of materials found on the floor of an impact crater northwest of Hellas Planitia. The discontinuous, dark-toned ridges, typically running diagonally across the scene, are windblown ripples which overlie light-toned rock that is heavily fractured and cratered.

    Location near: 25.0°S, 322.9°W Image width: 3 km (1.9 mi) Illumination from: upper left Season: Southern Summer

  1. Considerations for sensitivity analysis, uncertainty quantification, and data assimilation for grid-to-rod fretting

    SciTech Connect

    Michael Pernice

    2012-10-01

    Grid-to-rod fretting is the leading cause of fuel failures in pressurized water reactors, and is one of the challenge problems being addressed by the Consortium for Advanced Simulation of Light Water Reactors to guide its efforts to develop a virtual reactor environment. Prior and current efforts in modeling and simulation of grid-to-rod fretting are discussed. Sources of uncertainty in grid-to-rod fretting are also described.

  2. Sensitivity of Polar Stratospheric Ozone Loss to Uncertainties in Chemical Reaction Kinetics

    NASA Technical Reports Server (NTRS)

    Kawa, S. Randolph; Stolarski, Richard S.; Douglass, Anne R.; Newman, Paul A.

    2008-01-01

    Several recent observational and laboratory studies of processes involved in polar stratospheric ozone loss have prompted a reexamination of aspects of our understanding for this key indicator of global change. To a large extent, our confidence in understanding and projecting changes in polar and global ozone is based on our ability to simulate these processes in numerical models of chemistry and transport. These models depend on laboratory-measured kinetic reaction rates and photolysis cross sections to simulate molecular interactions. In this study we use a simple box-model scenario for Antarctic ozone to estimate the uncertainty in loss attributable to known reaction kinetic uncertainties. Following the method of earlier work, rates and uncertainties from the latest laboratory evaluations are applied in random combinations. We determine the key reactions and rates contributing the largest potential errors and compare the results to observations to evaluate which combinations are consistent with atmospheric data. Implications for our theoretical and practical understanding of polar ozone loss will be assessed.

  3. Analysis of Sensitivity and Uncertainty in an Individual-Based Model of a Threatened Wildlife Species

    EPA Science Inventory

    We present a multi-faceted sensitivity analysis of a spatially explicit, individual-based model (IBM) (HexSim) of a threatened species, the Northern Spotted Owl (Strix occidentalis caurina) on a national forest in Washington, USA. Few sensitivity analyses have been conducted on ...

  4. Assessing uncertainty in ecological systems using global sensitivity analyses: a case example of simulated wolf reintroduction effects on elk

    USGS Publications Warehouse

    Fieberg, J.; Jenkins, Kurt J.

    2005-01-01

    Often landmark conservation decisions are made despite an incomplete knowledge of system behavior and inexact predictions of how complex ecosystems will respond to management actions. For example, predicting the feasibility and likely effects of restoring top-level carnivores such as the gray wolf (Canis lupus) to North American wilderness areas is hampered by incomplete knowledge of the predator-prey system processes and properties. In such cases, global sensitivity measures, such as Sobol' indices, allow one to quantify the effect of these uncertainties on model predictions. Sobol' indices are calculated by decomposing the variance in model predictions (due to parameter uncertainty) into main effects of model parameters and their higher order interactions. Model parameters with large sensitivity indices can then be identified for further study in order to improve predictive capabilities. Here, we illustrate the use of Sobol' sensitivity indices to examine the effect of parameter uncertainty on the predicted decline of elk (Cervus elaphus) population sizes following a hypothetical reintroduction of wolves to Olympic National Park, Washington, USA. The strength of density dependence acting on survival of adult elk and magnitude of predation were the most influential factors controlling elk population size following a simulated wolf reintroduction. In particular, the form of density dependence in natural survival rates and the per-capita predation rate together accounted for over 90% of variation in simulated elk population trends. Additional research on wolf predation rates on elk and natural compensations in prey populations is needed to reliably predict the outcome of predator-prey system behavior following wolf reintroductions.
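    The first-order Sobol' decomposition can be estimated with a standard pick-freeze scheme; the two-parameter linear model below is a transparent stand-in for the elk-population model (its analytic indices are 0.2 and 0.8), not the HexSim-style IBM itself.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    """Toy stand-in for the population model: the output depends
    linearly on two uncertain parameters (analytic S1=0.2, S2=0.8)."""
    return x[:, 0] + 2.0 * x[:, 1]

def first_order_sobol(model, d, n, rng):
    """Saltelli-style pick-freeze estimator of first-order Sobol' indices."""
    A, B = rng.random((n, d)), rng.random((n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # vary only the i-th input
        S[i] = np.mean(fB * (model(ABi) - fA)) / var
    return S

S = first_order_sobol(model, d=2, n=200000, rng=rng)
print(S.round(2))   # ~[0.2, 0.8]: the second parameter dominates the variance
```

    For an additive model like this one the first-order indices sum to one; interactions in a real IBM would leave a gap between this sum and unity.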

  5. Sensitivity of land surface modeling to parameters: An uncertainty quantification method applied to the Community Land Model

    NASA Astrophysics Data System (ADS)

    Ricciuto, D. M.; Mei, R.; Mao, J.; Hoffman, F. M.; Kumar, J.

    2015-12-01

    Uncertainties in land parameters could have important impacts on simulated water and energy fluxes and land surface states, which will consequently affect atmospheric and biogeochemical processes. Therefore, quantification of such parameter uncertainties using a land surface model is the first step towards better understanding of predictive uncertainty in Earth system models. In this study, we applied a random-sampling, high-dimensional model representation (RS-HDMR) method to analyze the sensitivity of simulated photosynthesis, surface energy fluxes and surface hydrological components to selected land parameters in version 4.5 of the Community Land Model (CLM4.5). Because of the large computational expense of conducting ensembles of global gridded model simulations, we used the results of a previous cluster analysis to select one thousand representative land grid cells for simulation. Plant functional type (PFT)-specific uniform prior ranges for land parameters were determined using expert opinion and literature survey, and samples were generated with a quasi-Monte Carlo approach (Sobol sequence). Preliminary analysis of 1024 simulations suggested that four PFT-dependent parameters (including slope of the conductance-photosynthesis relationship, specific leaf area at canopy top, leaf C:N ratio and fraction of leaf N in RuBisCO) are the dominant sensitive parameters for photosynthesis, surface energy and water fluxes across most PFTs, but with varying importance rankings. On the other hand, for surface and sub-surface runoff, PFT-independent parameters, such as the depth-dependent decay factors for runoff, play more important roles than the previous four PFT-dependent parameters. Further analysis conditioning the results on different seasons and years is being conducted to provide guidance on how climate variability and change might affect such sensitivity.
This is the first step toward coupled simulations including biogeochemical processes, atmospheric processes
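    The quasi-Monte Carlo sampling step can be sketched with SciPy's Sobol sequence generator; the parameter names and prior bounds below are illustrative placeholders, not the CLM4.5 values.

```python
import numpy as np
from scipy.stats import qmc

# Hypothetical prior ranges for four PFT-dependent parameters
# (names and bounds are illustrative, not the CLM4.5 priors).
names = ["conductance_slope", "sla_top", "leaf_cn", "frac_n_rubisco"]
lower = np.array([4.0, 0.01, 20.0, 0.05])
upper = np.array([9.0, 0.04, 60.0, 0.20])

# Scrambled Sobol sequence: 2**10 = 1024 low-discrepancy samples,
# matching the ensemble size described in the abstract.
sampler = qmc.Sobol(d=4, scramble=True, seed=42)
unit = sampler.random_base2(m=10)            # points in [0, 1)^4
params = qmc.scale(unit, lower, upper)       # map to the prior ranges

print(params.shape)                          # (1024, 4)
```

    Powers of two (`random_base2`) preserve the balance properties of the Sobol sequence, which is why 1024 rather than a round 1000 runs is the natural ensemble size.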

  6. Random vibration sensitivity studies of modeling uncertainties in the NIF structures

    SciTech Connect

    Swensen, E.A.; Farrar, C.R.; Barron, A.A.; Cornwell, P.

    1996-12-31

    The National Ignition Facility is a laser fusion project that will provide an above-ground experimental capability for nuclear weapons effects simulation. This facility will achieve fusion ignition utilizing solid-state lasers as the energy driver. The facility will cover an estimated 33,400 m{sup 2} at an average height of 5--6 stories. Within this complex, a number of beam transport structures will be housed that will deliver the laser beams to the target area within a 50 {micro}m rms radius of the target center. The beam transport structures are approximately 23 m long and reach heights of approximately 2--3 stories. Low-level ambient random vibrations are one of the primary concerns currently controlling the design of these structures. Low-level ambient vibrations, 10{sup {minus}10} g{sup 2}/Hz over a frequency range of 1 to 200 Hz, are assumed to be present during all facility operations. Each structure described in this paper will be required to achieve and maintain 0.6 {micro}rad rms laser beam pointing stability for a minimum of 2 hours under these vibration levels. To date, finite element (FE) analysis has been performed on a number of the beam transport structures. Certain assumptions have to be made regarding structural uncertainties in the FE models. These uncertainties consist of damping values for concrete and steel, compliance within bolted and welded joints, and assumptions regarding the phase coherence of ground motion components. In this paper, the influence of these structural uncertainties on the predicted pointing stability of the beam line transport structures as determined by random vibration analysis will be discussed.

  7. Optimization algorithm for overlapping-field plans of scanned ion beam therapy with reduced sensitivity to range and setup uncertainties

    NASA Astrophysics Data System (ADS)

    Inaniwa, Taku; Kanematsu, Nobuyuki; Furukawa, Takuji; Noda, Koji

    2011-03-01

    A 'patch-field' strategy is often used for tumors with large volumes exceeding the available field size in passive irradiations with ion beams. Range and setup errors can cause hot and cold spots at the field junction within the target. Such errors will also displace the field to miss the target periphery. With scanned ion beams with fluence modulation, the two junctional fields can be overlapped rather than patched, which may potentially reduce the sensitivity to these uncertainties. In this study, we have developed such a robust optimization algorithm. This algorithm is composed of the following two steps: (1) expanding the target volume with margins against the uncertainties, and (2) solving the inverse problem where the terms suppressing the dose gradient of individual fields are added into the objective function. The validity of this algorithm is demonstrated through simulation studies for two extreme cases of two fields with unidirectional and opposing geometries and for a prostate-cancer case. With the proposed algorithm, we can obtain a more robust plan with minimized influence of range and setup uncertainties than the conventional plan. Compared to conventional optimization, the calculation time for the robust optimization increased by a factor of approximately 3.
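    The second step of the algorithm can be sketched as a two-term objective; the penalty weight and the toy dose matrices below are illustrative, not the paper's exact formulation.

```python
import numpy as np

def objective(w1, w2, d1, d2, target, lam):
    """Sketch of the two-term objective: dose fidelity on the expanded
    target plus a term suppressing each individual field's longitudinal
    dose gradient (lam, d1, d2 are illustrative assumptions)."""
    dose1, dose2 = d1 @ w1, d2 @ w2
    fidelity = np.sum((dose1 + dose2 - target) ** 2)
    gradient_penalty = np.sum(np.diff(dose1) ** 2) + np.sum(np.diff(dose2) ** 2)
    return fidelity + lam * gradient_penalty

# Toy 4-voxel target covered by two overlapping fields (identity dose
# matrices keep the example transparent).
d1 = d2 = np.eye(4)
target = np.ones(4)
w_sharp = (np.array([1.0, 1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0, 1.0]))
w_smooth = (np.array([0.75, 0.5, 0.25, 0.0]), np.array([0.0, 0.25, 0.5, 0.75]))

for w1, w2 in (w_sharp, w_smooth):
    print(objective(w1, w2, d1, d2, target, lam=1.0))
```

    With the penalty active, the smoothly overlapped pair scores better than the abutting "patch" pair even though its coverage is imperfect, which is exactly the trade-off that buys robustness to range and setup shifts at the junction.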

  8. New SCALE Sensitivity/Uncertainty Capabilities Applied to Bias Estimation and to Design of MIRTE Reference Experiments

    SciTech Connect

    Rearden, Bradley T; Duhamel, Isabelle; Letang, Eric

    2009-01-01

    New TSUNAMI tools of SCALE 6, TSURFER and TSAR, are demonstrated to examine the bias effects of small-worth test materials, relative to reference experiments. TSURFER is a data adjustment tool for assessing bias and bias uncertainty, and TSAR computes the sensitivity of the change in reactivity between two systems to the cross-section data common to their calculation. With TSURFER, it is possible to examine biases and bias uncertainties in fine detail. For replacement experiments, the application of TSAR to TSUNAMI-3D sensitivity data for pairs of experiments allows the isolation of sources of bias that could otherwise be obscured by materials with more worth in an individual experiment. The application of TSUNAMI techniques in the design of nine reference experiments for the MIRTE program will allow application of these advanced techniques to data acquired in the experimental series. The validation of all materials in a complex criticality safety application likely requires consolidating information from many different critical experiments. For certain materials, such as structural materials or fission products, only a limited number of critical experiments are available, and the fuel and moderator compositions of the experiments may differ significantly from those of the application. In these cases, it is desirable to extract the computational bias of a specific material from an integral k{sub eff} measurement and use that information to quantify the bias due to the use of the same material in the application system. Traditional parametric and nonparametric methods are likely to prove poorly suited for such a consolidation of specific data components from a diverse set of experiments. 
An alternative choice for consolidating specific data from numerous sources is a data adjustment tool, like the ORNL tool TSURFER (Tool for Sensitivity/Uncertainty analysis of Response Functionals using Experimental Results) from SCALE 6.1 However, even with TSURFER, it may be difficult to

  9. The sensitivity of oxidant formation rates to uncertainties in temperature, water vapor, and cloud cover

    SciTech Connect

    Walcek, C.J.; Yuan, H.H.

    1994-12-31

    Photochemical reaction mechanisms have been used for several decades to understand the formation of acids, oxidants, and other pollutants in the atmosphere. With complex chemical reaction mechanisms, it is useful to perform sensitivity studies to identify the most important or uncertain components within the system of reactions. In this study, we quantify the sensitivity of a chemical reaction mechanism to changes in three meteorological factors: temperature, relative humidity, and sunlight intensity. We perform these sensitivity studies over a wide range of nitrogen oxides (NO{sub x} = NO + NO{sub 2}) and nonmethane hydrocarbon (NMHC) concentrations, since these two chemicals are the dominant controllable pollutants that influence the chemical reactivity of the atmosphere.

  10. Using Uncertainty and Sensitivity Analyses in Socioecological Agent-Based Models to Improve Their Analytical Performance and Policy Relevance

    PubMed Central

    Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.

    2014-01-01

    Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764

  11. Combining Apples and Oranges: Lessons from Weighting, Inversion, Sensitivity Analysis, and Uncertainty

    NASA Astrophysics Data System (ADS)

    Hill, Mary

    2016-04-01

    Combining different data types can seem like combining apples and oranges. Yet combining different data types in inverse modeling and uncertainty quantification is important in all types of environmental systems. There are two main methods for combining different data types: (1) single-objective optimization (SOO) with weighting, and (2) multi-objective optimization (MOO), in which coefficients for data groups are defined and changed during model development. SOO and MOO are related in that different coefficient values in MOO are equivalent to considering alternative weightings. MOO methods often take many model runs and tend to be much more computationally expensive than SOO, but for SOO the weighting needs to be defined. When alternative models are more important to consider than alternative weightings, SOO can be advantageous (Lu et al. 2012). This presentation considers how to determine the weighting when using SOO. A saltwater intrusion example is used to examine two methods of weighting three data types: weighting based on contributions to the objective function, as suggested by Anderson et al. (2015), and error-based weighting, as suggested by Hill and Tiedeman (2007). The consequences of weighting on measures of uncertainty, the importance and interdependence of parameters, and the importance of observations are presented. This work is important to many types of environmental modeling, including climate models, because integrating many kinds of data is often important. The advent of rainfall-runoff models with fewer numerical daemons, such as TOPKAPI and SUMMA, makes the convenient model analysis methods used in this work more useful for many hydrologic problems.
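    Error-based weighting in an SOO objective can be sketched as follows; the observation values and standard deviations are hypothetical, chosen only to show three data types with incompatible units sharing one objective function.

```python
import numpy as np

def weighted_sse(groups):
    """Error-based weighting (in the spirit of Hill and Tiedeman):
    weight each residual by 1/sigma, where sigma reflects the expected
    observation error. Units cancel in (obs - sim)/sigma, so heads,
    flows, and concentrations can share a single objective function."""
    total = 0.0
    for obs, sim, sigma in groups:
        total += np.sum(((obs - sim) / sigma) ** 2)
    return total

# Hypothetical observation groups with very different magnitudes/units.
heads = (np.array([105.2, 98.7]), np.array([104.8, 99.5]), 0.5)   # m
flows = (np.array([1200.0]), np.array([1100.0]), 150.0)           # m^3/d
concs = (np.array([0.8, 1.1]), np.array([0.9, 1.0]), 0.1)         # kg/m^3

print(round(weighted_sse([heads, flows, concs]), 2))              # 5.64
```

    Contribution-based weighting would instead tune the sigmas until each group contributes a chosen share of the total, which is the alternative compared in the presentation.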

  12. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. 
Parameters are estimated using nonlinear regression: a

  13. Extension of sensitivity and uncertainty analysis for long term dose assessment of high level nuclear waste disposal sites to uncertainties in the human behaviour.

    PubMed

    Albrecht, Achim; Miquel, Stéphan

    2010-01-01

    Biosphere dose conversion factors are computed for the French high-level geological waste disposal concept to illustrate the combined probabilistic and deterministic approach. Both (135)Cs and (79)Se are used as examples. Probabilistic analyses of the system considering all parameters, as well as physical and societal parameters independently, allow quantification of their mutual impact on overall uncertainty. As physical parameter uncertainties decrease, for example with the availability of further experimental and field data, the societal uncertainties, which are less easily constrained, particularly for the long term, become increasingly significant. One also has to distinguish uncertainties impacting the low-dose portion of a distribution from those impacting the high-dose range, the latter logically having a greater impact in an assessment situation. The use of cumulative probability curves allows us to quantify probability variations as a function of the dose estimate, with the probability variation (the slope of the curve) indicative of the uncertainties of different radionuclides. In the case of (135)Cs, with better-constrained physical parameters, the uncertainty in human behaviour is more significant, even in the high dose range, where it increases the probability of higher doses. For both radionuclides, uncertainties impact more strongly in the intermediate than in the high dose range. In an assessment context, the focus will be on probabilities of higher dose values. The probabilistic approach can furthermore be used to construct critical groups based on a predefined probability level and to ensure that critical groups cover the expected range of uncertainty.
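    The cumulative probability curve and its high-dose tail can be sketched from Monte Carlo output; the lognormal dose distribution below is a hypothetical stand-in for the assessment's actual results.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical lognormal dose output (median 1e-8, arbitrary units);
# stands in for the probabilistic biosphere-model results above.
doses = rng.lognormal(mean=np.log(1e-8), sigma=1.0, size=10000)

def exceedance(doses, d):
    """Probability that the dose estimate exceeds d: the complement of
    the cumulative probability curve discussed in the abstract."""
    return np.mean(doses > d)

# The local change of these values with d (the slope of the curve)
# indicates how strongly uncertainty acts in each dose range.
for d in (1e-9, 1e-8, 1e-7):
    print(f"P(dose > {d:.0e}) = {exceedance(doses, d):.2f}")
```

    In an assessment context one reads off the high-dose end of this table, e.g. the dose exceeded with a predefined probability when constructing critical groups.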

  14. A practical method to assess model sensitivity and parameter uncertainty in C cycle models

    NASA Astrophysics Data System (ADS)

    Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy

    2015-04-01

    The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells, to several hundred years for the exchange between the atmosphere and the deep ocean, and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists in finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the three following conditions hold: 1) there exists a solution, 2) the solution is unique, and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. The inverse problem is often ill-posed; a regularization method is then required to replace the original problem with a well-posed problem, and a solution strategy amounts to 1) constructing a solution x, 2) assessing the validity of the solution, and 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merits of various inverse modelling strategies (MCMC, EnKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed on the fact that parameters and initial stocks directly related to fast processes were best estimated with narrow confidence intervals, whereas those related to slow processes were poorly estimated with very large uncertainties. 
While other studies have tried to overcome this difficulty by adding complementary
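    The ill-posedness and its standard cure can be illustrated on a two-parameter toy problem (a deliberately simplified stand-in, not the DALEC model); Tikhonov regularization is one of the classical choices alluded to above.

```python
import numpy as np

# Toy ill-conditioned inverse problem h(x) = H x = y: the two rows of H
# are nearly identical, so the solution does not depend continuously on
# the data (condition 3 above fails in practice).
H = np.array([[1.0, 1.0], [1.0, 1.0001]])
x_true = np.array([2.0, 3.0])
noise = np.array([1e-3, -1e-3])          # small, fixed data error
y = H @ x_true + noise

# Naive inversion amplifies the noise enormously ...
x_naive = np.linalg.solve(H, y)

# ... while Tikhonov regularization replaces the problem with the
# well-posed minimization ||Hx - y||^2 + alpha^2 ||x||^2.
alpha = 1e-2
x_reg = np.linalg.solve(H.T @ H + alpha**2 * np.eye(2), H.T @ y)

print(np.linalg.norm(x_naive - x_true))  # ~28: unusable
print(np.linalg.norm(x_reg - x_true))    # ~0.7: stable but slightly biased
```

    The bias-for-stability trade made by the regularized solution mirrors the DALEC finding: directions of parameter space weakly constrained by the data (slow processes) end up dominated by the prior, hence the wide confidence intervals.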

  15. Quantifying the economic competitiveness of cellulosic biofuel pathways under uncertainty and regional sensitivity

    NASA Astrophysics Data System (ADS)

    Brown, Tristan R.

    The revised Renewable Fuel Standard requires the annual blending of 16 billion gallons of cellulosic biofuel by 2022 from zero gallons in 2009. The necessary capacity investments have been underwhelming to date, however, and little is known about the likely composition of the future cellulosic biofuel industry as a result. This dissertation develops a framework for identifying and analyzing the industry's likely future composition while also providing a possible explanation for why investment in cellulosic biofuels capacity has been low to date. The results of this dissertation indicate that few cellulosic biofuel pathways will be economically competitive with petroleum on an unsubsidized basis. Of five cellulosic biofuel pathways considered under 20-year price forecasts with volatility, only two achieve positive mean 20-year net present value (NPV) probabilities. Furthermore, recent exploitation of U.S. shale gas reserves and the subsequent fall in U.S. natural gas prices have negatively impacted the economic competitiveness of all but two of the cellulosic biofuel pathways considered; only two of the five pathways achieve substantially higher 20-year NPVs under a post-shale gas economic scenario relative to a pre-shale gas scenario. The economic competitiveness of cellulosic biofuel pathways with petroleum is reduced further when considered under price uncertainty in combination with realistic financial assumptions. This dissertation calculates pathway-specific costs of capital for five cellulosic biofuel pathway scenarios. The analysis finds that the large majority of the scenarios incur costs of capital that are substantially higher than those commonly assumed in the literature. Employment of these costs of capital in a comparative techno-economic analysis (TEA) greatly reduces the mean 20-year NPVs for each pathway while increasing their 10-year probabilities of default to above 80% for all five scenarios. Finally, this dissertation quantifies the economic competitiveness of six
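    The NPV-under-price-uncertainty calculation can be sketched as a Monte Carlo over annual operating margins; the capital cost, margin distribution, and cost of capital below are illustrative assumptions, not the dissertation's pathway data.

```python
import numpy as np

rng = np.random.default_rng(5)

def npv(cash_flows, rate):
    """Discount a stream of annual cash flows at the given cost of
    capital; year 0 holds the upfront capital investment."""
    years = np.arange(len(cash_flows))
    return np.sum(cash_flows / (1.0 + rate) ** years)

# Hypothetical pathway: $200M capital cost, then 20 years of operating
# margins driven by volatile fuel prices (all values illustrative).
n_trials, horizon = 10000, 20
capital = -200.0                                       # $M
margins = rng.normal(25.0, 15.0, (n_trials, horizon))  # $M/yr

# A pathway-specific cost of capital (12% here) rather than a
# "textbook" 10% shifts the whole NPV distribution downward.
npvs = np.array([npv(np.concatenate(([capital], m)), rate=0.12)
                 for m in margins])

print(f"P(NPV > 0) = {np.mean(npvs > 0):.2f}")
```

    The reported metric, the probability of a positive mean 20-year NPV, is exactly this tail probability evaluated per pathway under its forecast price process.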

  16. Understanding hydrological flow paths in conceptual catchment models using uncertainty and sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Mockler, Eva M.; O'Loughlin, Fiachra E.; Bruen, Michael

    2016-05-01

    Increasing pressures on water quality due to the intensification of agriculture have raised demands for environmental modeling to accurately simulate the movement of diffuse (nonpoint) nutrients in catchments. As hydrological flows drive the movement and attenuation of nutrients, individual hydrological processes in models should be adequately represented for water quality simulations to be meaningful. In particular, the relative contribution of groundwater and surface runoff to rivers is of interest, as increasing nitrate concentrations are linked to higher groundwater discharges. These requirements for hydrological modeling of the groundwater contribution to rivers initiated this assessment of internal flow path partitioning in conceptual hydrological models. In this study, a variance-based sensitivity analysis method was used to investigate parameter sensitivities and flow partitioning of three conceptual hydrological models simulating 31 Irish catchments. We compared two established conceptual hydrological models (NAM and SMARG) and a new model (SMART), developed specifically for water quality modeling. In addition to the criteria that assess streamflow simulations, a ratio of average groundwater contribution to total streamflow was calculated for all simulations over the 16-year study period. As observed time series of groundwater contributions to streamflow are not available at the catchment scale, the groundwater ratios were evaluated against average annual indices of base flow and deep groundwater flow for each catchment. The sensitivity of internal flow path partitioning was a specific focus, to assist in evaluating model performance. Results highlight that model structure has a strong impact on simulated groundwater flow paths. Sensitivity to the internal pathways in the models is not reflected in the performance criteria results. This demonstrates that simulated groundwater contributions should be constrained by independent data to ensure results
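    The variance-based sensitivity measure used in studies like this one (a first-order Sobol index) can be estimated with a brute-force double-loop Monte Carlo scheme. This is an illustrative sketch for independent inputs uniform on [0, 1], not the authors' code:

```python
import random
import statistics

def sobol_first_order(f, i, dim, n_outer=500, n_inner=500, seed=42):
    """Double-loop Monte Carlo estimate of the first-order Sobol index
    S_i = Var(E[Y | X_i]) / Var(Y): the fraction of output variance
    explained by input i alone. Inputs assumed uniform on [0, 1]."""
    rng = random.Random(seed)
    cond_means, all_y = [], []
    for _ in range(n_outer):
        xi = rng.random()                      # fix input i at a sampled value
        ys = []
        for _ in range(n_inner):
            x = [rng.random() for _ in range(dim)]
            x[i] = xi
            ys.append(f(x))
        cond_means.append(statistics.fmean(ys))  # E[Y | X_i = xi]
        all_y.extend(ys)
    return statistics.pvariance(cond_means) / statistics.pvariance(all_y)
```

    For the additive test model y = x0 + 0.2*x1, the index for x0 should come out near 1/1.04 ≈ 0.96 and near 0.04/1.04 ≈ 0.04 for x1. Production analyses typically use more efficient estimators (e.g. Saltelli sampling); the double loop is shown only because it maps directly onto the definition.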

  17. Uncertainties in the temperature sensitivity of decomposition in tropical and subtropical ecosystems: Implications for models

    NASA Astrophysics Data System (ADS)

    Holland, Elisabeth A.; Neff, Jason C.; Townsend, Alan R.; McKeown, Becky

    2000-12-01

    Tropical ecosystems play a central role in the global carbon cycle. Large changes in tropical temperature over geologic time and the significant responses of tropical ecosystems to shorter-term variations such as El Niño/La Niña argue for a robust understanding of the temperature sensitivity of tropical decomposition. To examine the responsiveness of heterotrophic respiration to temperature, we measured rates of heterotrophic respiration from a wide range of tropical soils in a series of laboratory incubations. Under conditions of optimal soil water and nonlimiting substrate availability, heterotrophic respiration rose exponentially with rising temperature. The mean Q10 measured across all temperature ranges in these short-term incubations was 2.37, but there was significant variation in Q10s across sites. The source of this variation could not be explained by soil carbon or nitrogen content, soil texture, site climate, or lignin to nitrogen ratio. At the beginning of the incubation, heterotrophic respiration increased exponentially with temperature for all sites, despite the fact that the fluxes differed by an order of magnitude. When substrate availability became limiting later in the incubation, the temperature response changed, and heterotrophic response declined above 35°C. The documented changes in temperature sensitivity with substrate availability argue for using temperature relationships developed under optimal conditions of substrate availability for models which include temperature regulation of heterotrophic respiration. To evaluate the significance of this natural variation in temperature control over decomposition, we used the Century ecosystem model gridded for the areas between the tropics of Cancer and Capricorn. These simulations used the mean and upper and lower confidence limits of the normalized exponential temperature response of our experimental studies. We found that systems with the lowest temperature sensitivity accumulated a total of 70
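    The exponential Q10 model underlying these measurements relates the respiration rate at temperature T to a reference rate. A minimal sketch using the mean Q10 = 2.37 reported above (the reference temperature and rate here are illustrative assumptions; the abstract reports only the Q10 itself):

```python
def q10_rate(r_ref, t_c, t_ref=20.0, q10=2.37):
    """Respiration rate at temperature t_c (deg C) under the exponential
    Q10 model: the rate is multiplied by q10 for every 10 C of warming
    above the reference temperature t_ref, where the rate is r_ref."""
    return r_ref * q10 ** ((t_c - t_ref) / 10.0)
```

    For example, with a reference rate of 1.0 at 20 °C, the modeled rate at 30 °C is 2.37 times higher, and at 10 °C it falls to 1/2.37 of the reference.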

  18. Sensitivity of Last Glacial Maximum climate to uncertainties in tropical and subtropical ocean temperatures

    USGS Publications Warehouse

    Hostetler, S.; Pisias, N.; Mix, A.

    2006-01-01

    The faunal and floral gradients that underlie the CLIMAP (1981) sea-surface temperature (SST) reconstructions for the Last Glacial Maximum (LGM) reflect ocean temperature gradients and frontal positions. The transfer functions used to reconstruct SSTs from biologic gradients are biased, however, because at the warmest sites they display inherently low sensitivity in translating fauna to SST and they underestimate SST within the euphotic zones where the pycnocline is strong. Here we assemble available data and apply a statistical approach to adjust for hypothetical biases in the faunal-based SST estimates of LGM temperature. The largest bias adjustments are distributed in the tropics (to address low sensitivity) and subtropics (to address underestimation in the euphotic zones). The resulting SSTs are generally in better agreement than CLIMAP with recent geochemical estimates of glacial-interglacial temperature changes. We conducted a series of model experiments using the GENESIS general atmospheric circulation model to assess the sensitivity of the climate system to our bias-adjusted SSTs. Globally, the new SST field results in a modeled LGM surface-air cooling relative to present of 6.4 °C (1.9 °C cooler than that of CLIMAP). Relative to the simulation with CLIMAP SSTs, modeled precipitation over the oceans is reduced by 0.4 mm d-1 (an anomaly of -0.4 versus 0.0 mm d-1 for CLIMAP) and increased over land (an anomaly of -0.2 versus -0.5 mm d-1 for CLIMAP). Regionally strong responses are induced by changes in SST gradients. Data-model comparisons indicate improvement in agreement relative to CLIMAP, but differences among terrestrial data inferences and simulated moisture and temperature remain. Our SSTs result in positive mass balance over the northern hemisphere ice sheets (primarily through reduced summer ablation), supporting the hypothesis that tropical and subtropical ocean temperatures may have played a role in triggering glacial changes at higher latitudes.

  19. Uncertainty and Sensitivity of Contaminant Travel Times from the Upgradient Nevada Test Site to the Yucca Mountain Area

    SciTech Connect

    J. Zhu; K. Pohlmann; J. Chapman; C. Russell; R.W.H. Carroll; D. Shafer

    2009-09-10

    Yucca Mountain (YM), Nevada, has been proposed by the U.S. Department of Energy as the nation’s first permanent geologic repository for spent nuclear fuel and high-level radioactive waste. In this study, the potential for groundwater advective pathways from underground nuclear testing areas on the Nevada Test Site (NTS) to intercept the subsurface of the proposed land withdrawal area for the repository is investigated. The timeframe for advective travel and its uncertainty for possible radionuclide movement along these flow pathways is estimated as a result of effective-porosity value uncertainty for the hydrogeologic units (HGUs) along the flow paths. Furthermore, sensitivity analysis is conducted to determine the most influential HGUs on the advective radionuclide travel times from the NTS to the YM area. Groundwater pathways are obtained using the particle tracking package MODPATH and flow results from the Death Valley regional groundwater flow system (DVRFS) model developed by the U.S. Geological Survey (USGS). Effective-porosity values for HGUs along these pathways are one of several parameters that determine possible radionuclide travel times between the NTS and proposed YM withdrawal areas. Values and uncertainties of HGU porosities are quantified through evaluation of existing site effective-porosity data and expert professional judgment and are incorporated in the model through Monte Carlo simulations to estimate mean travel times and uncertainties. The simulations are based on two steady-state flow scenarios, the pre-pumping (the initial stress period of the DVRFS model), and the 1998 pumping (assuming steady-state conditions resulting from pumping in the last stress period of the DVRFS model) scenarios for the purpose of long-term prediction and monitoring. The pumping scenario accounts for groundwater withdrawal activities in the Amargosa Desert and other areas downgradient of YM. Considering each detonation in a clustered region around Pahute Mesa (in
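    The Monte Carlo travel-time estimation described above can be sketched as follows. Advective travel time per path segment is t = L * n_e / q (length times effective porosity over Darcy flux), with porosity sampled from an uncertainty distribution. The lognormal distribution and segment parameterization are illustrative assumptions standing in for the expert-judgment distributions the study describes:

```python
import random
import statistics

def travel_time_distribution(segments, n=5000, seed=7):
    """Monte Carlo sketch of advective travel time along a flow path.

    Each segment is (length_m, darcy_flux_m_per_yr, ln_porosity_mu,
    ln_porosity_sigma); effective porosity n_e is sampled lognormally
    (a hypothetical choice). Segment travel time is t = L * n_e / q.
    Returns (mean, std dev) of total travel time in years."""
    rng = random.Random(seed)
    times = []
    for _ in range(n):
        t = 0.0
        for length, q, mu, sd in segments:
            n_e = rng.lognormvariate(mu, sd)
            t += length * n_e / q
        times.append(t)
    return statistics.fmean(times), statistics.pstdev(times)
```

    With zero porosity uncertainty the result collapses to the deterministic travel time; widening the porosity distributions widens the travel-time spread, which is the effect the study quantifies per hydrogeologic unit.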

  20. Development, sensitivity analysis, and uncertainty quantification of high-fidelity arctic sea ice models.

    SciTech Connect

    Peterson, Kara J.; Bochev, Pavel Blagoveston; Paskaleva, Biliana S.

    2010-09-01

    Arctic sea ice is an important component of the global climate system and due to feedback effects the Arctic ice cover is changing rapidly. Predictive mathematical models are of paramount importance for accurate estimates of the future ice trajectory. However, the sea ice components of Global Climate Models (GCMs) vary significantly in their prediction of the future state of Arctic sea ice and have generally underestimated the rate of decline in minimum sea ice extent seen over the past thirty years. One of the contributing factors to this variability is the sensitivity of the sea ice to model physical parameters. A new sea ice model that has the potential to improve sea ice predictions incorporates an anisotropic elastic-decohesive rheology and dynamics solved using the material-point method (MPM), which combines Lagrangian particles for advection with a background grid for gradient computations. We evaluate the variability of the Los Alamos National Laboratory CICE code and the MPM sea ice code for a single year simulation of the Arctic basin using consistent ocean and atmospheric forcing. Sensitivities of ice volume, ice area, ice extent, root mean square (RMS) ice speed, central Arctic ice thickness, and central Arctic ice speed with respect to ten different dynamic and thermodynamic parameters are evaluated both individually and in combination using the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA). We find similar responses for the two codes and some interesting seasonal variability in the strength of the parameters' influence on the solution.

  1. Numerical study of premixed HCCI engine combustion and its sensitivity to computational mesh and model uncertainties

    NASA Astrophysics Data System (ADS)

    Kong, Song-Charng; Reitz, Rolf D.

    2003-06-01

    This study used a numerical model to investigate the combustion process in a premixed iso-octane homogeneous charge compression ignition (HCCI) engine. The engine was a supercharged Cummins C engine operated under HCCI conditions. The CHEMKIN code was implemented into an updated KIVA-3V code so that the combustion could be modelled using detailed chemistry in the context of engine CFD simulations. The model was able to accurately simulate the ignition timing and combustion phasing for various engine conditions. The unburned hydrocarbon emissions were also well predicted, while the carbon monoxide emissions were underpredicted. Model results showed that the majority of unburned hydrocarbon is located in the piston-ring crevice region and the carbon monoxide resides in the vicinity of the cylinder walls. A sensitivity study of the computational grid resolution indicated that the combustion predictions were relatively insensitive to the grid density. However, the piston-ring crevice region needed to be simulated with high resolution to obtain accurate emissions predictions. The model results also indicated that HCCI combustion and emissions are very sensitive to the initial mixture temperature. The computations also show that the carbon monoxide emissions prediction can be significantly improved by modifying a key oxidation reaction rate constant.

  2. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    SciTech Connect

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.; Jakeman, John Davis; Swiler, Laura Painton; Stephens, John Adam; Vigil, Dena M.; Wildey, Timothy Michael; Bohnhoff, William J.; Eddy, John P.; Hu, Kenneth T.; Dalbey, Keith R.; Bauman, Lara E; Hough, Patricia Diane

    2014-05-01

    The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  3. Genetic algorithm applied to a Soil-Vegetation-Atmosphere system: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Schneider, Sébastien; Jacques, Diederik; Mallants, Dirk

    2010-05-01

    Numerical models are invaluable for predicting water fluxes in the vadose zone and more specifically in Soil-Vegetation-Atmosphere (SVA) systems. For such simulations, robust models and representative soil hydraulic parameters are required. Calibration of unsaturated hydraulic properties is known to be a difficult optimization problem due to the high non-linearity of the water flow equations. Therefore, robust methods are needed to prevent the optimization from converging to non-optimal parameters. Evolutionary algorithms, and specifically genetic algorithms (GAs), are very well suited for such complex parameter optimization problems. Additionally, GAs offer the opportunity to assess the confidence in the hydraulic parameter estimations, because of the large number of model realizations. The SVA system in this study concerns a pine stand on a heterogeneous sandy soil (podzol) in the Campine region in the north of Belgium. Throughfall, other meteorological data, and water contents at different soil depths were recorded during one year at a daily time step in two lysimeters. The water table level, which varies between 95 and 170 cm, was recorded at 0.5-hour intervals. The leaf area index was also measured at selected times during the year in order to evaluate the energy reaching the soil and to derive the potential evaporation. Based on the profile description, five soil layers have been distinguished in the podzol. Two models have been used for simulating water fluxes: (i) a mechanistic model, the HYDRUS-1D model, which solves the Richards' equation, and (ii) a compartmental model, which treats the soil profile as a bucket into which water flows until its maximum capacity is reached. A global sensitivity analysis (Morris' one-at-a-time sensitivity analysis) was run prior to the calibration, in order to check the sensitivity in the chosen parameter search space. 
For
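    Morris' one-at-a-time screening, as cited above, perturbs each parameter individually from a set of random base points and averages the resulting "elementary effects". This is a simplified illustrative sketch of the method (not HYDRUS or the authors' code; real implementations use trajectory designs over a discrete grid):

```python
import random

def morris_mu_star(f, bounds, r=20, delta=0.1, seed=3):
    """Simplified Morris screening: for r random base points, perturb each
    input one at a time by delta (in normalized [0, 1] units) and record
    the elementary effect (f(x + perturbation) - f(x)) / delta.
    Returns the mean absolute elementary effect mu* per parameter."""
    rng = random.Random(seed)
    k = len(bounds)
    mu_star = [0.0] * k
    for _ in range(r):
        # base point sampled so the +delta step stays inside the bounds
        u = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        x = [lo + ui * (hi - lo) for ui, (lo, hi) in zip(u, bounds)]
        y0 = f(x)
        for i, (lo, hi) in enumerate(bounds):
            xp = list(x)
            xp[i] += delta * (hi - lo)
            mu_star[i] += abs((f(xp) - y0) / delta)
    return [m / r for m in mu_star]
```

    Parameters with small mu* can be fixed before calibration, shrinking the search space the genetic algorithm must explore.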

  4. A two dimensional modeling study of the sensitivity of ozone to radiative flux uncertainties

    SciTech Connect

    Grant, K.E.; Wuebbles, D.J.

    1988-08-01

    Radiative processes strongly affect equilibrium trace gas concentrations both directly, through photolysis reactions, and indirectly through temperature and transport processes. We have used the LLNL 2-D chemical-radiative-transport model to investigate the net sensitivity of equilibrium ozone concentrations to several changes in radiative forcing. Doubling CO2 from 300 ppmv to 600 ppmv resulted in a temperature decrease of 5 K to 8 K in the middle stratosphere along with an 8% to 16% increase in ozone in the same region. Replacing our usual shortwave scattering algorithms with a simplified Rayleigh algorithm led to a 1% to 2% increase in ozone in the lower stratosphere. Finally, modifying our normal CO2 cooling rates by corrections derived from line-by-line calculations resulted in several regions of heating and cooling. We observed temperature changes on the order of 1 K to 1.5 K with corresponding changes of 0.5% to 1.5% in O3. Our results for doubled CO2 compare favorably with those by other authors. Results for our two perturbation scenarios stress the need for accurately modeling radiative processes while confirming the general validity of current models. 15 refs., 5 figs.

  5. Regional-scale yield simulations using crop and climate models: assessing uncertainties, sensitivity to temperature and adaptation options

    NASA Astrophysics Data System (ADS)

    Challinor, A. J.

    2010-12-01

    Recent progress in assessing the impacts of climate variability and change on crops using multiple regional-scale simulations of crop and climate (i.e. ensembles) is presented. Simulations for India and China used perturbed responses to elevated carbon dioxide constrained using observations from FACE studies and controlled environments. Simulations with crop parameter sets representing existing and potential future adapted varieties were also carried out. The results for India are compared to sensitivity tests on two other crop models. For China, a parallel approach used socio-economic data to account for autonomous farmer adaptation. For the USA, cardinal temperatures were analysed under a range of local warming scenarios for 2711 varieties of spring wheat. The results are as follows: 1. Quantifying and reducing uncertainty. The relative contribution of uncertainty in crop and climate simulation to the total uncertainty in projected yield changes is examined. The observational constraints from FACE and controlled environment studies are shown to be the likely critical factor in maintaining relatively low crop parameter uncertainty. Without these constraints, crop simulation uncertainty in a doubled-CO2 environment would likely be greater than uncertainty in simulating climate. However, consensus across crop models in India varied across different biophysical processes. 2. The response of yield to changes in local mean temperature was examined and compared to that found in the literature. No consistent response to temperature change was found across studies. 3. Implications for adaptation. The simulations of spring wheat in China show the relative importance of tolerance to water and heat stress in avoiding future crop failures. The greatest potential for reducing the number of harvests less than one standard deviation below the baseline mean yield value comes from alleviating water stress; the greatest potential for reducing harvests less than two

  6. Uncertainty in maternal exposures to ambient PM2.5 and benzene during pregnancy: Sensitivity to exposure estimation decisions.

    PubMed

    Tanner, Jean Paul; Salemi, Jason L; Stuart, Amy L; Yu, Haofei; Jordan, Melissa M; DuClos, Chris; Cavicchia, Philip; Correia, Jane A; Watkins, Sharon M; Kirby, Russell S

    2016-05-01

    We investigate uncertainty in estimates of pregnant women's exposure to ambient PM2.5 and benzene derived from central-site monitoring data. Through a study of live births in Florida during 2000-2009, we discuss the selection of spatial and temporal scales of analysis, limiting distances, and aggregation method. We estimate exposure concentrations and classify exposure for a range of alternatives, and compare impacts. Estimated exposure concentrations were most sensitive to the temporal scale of analysis for PM2.5, with similar sensitivity to spatial scale for benzene. Using 1-12 versus 3-8 weeks of gestational age as the exposure window resulted in reclassification of exposure by at least one quartile for up to 37% of mothers for PM2.5 and 27% for benzene. The largest mean absolute differences in concentration resulting from any decision were 0.78 µg/m³ and 0.44 ppbC, respectively. No bias toward systematically higher or lower estimates was found between choices for any decision. PMID:27246278
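    The exposure-window choice compared above (1-12 versus 3-8 weeks of gestational age) amounts to averaging a daily concentration series over different day ranges. A minimal sketch of that windowing step, with hypothetical names and a simplified day-indexing convention (weeks 1-based and inclusive, not the authors' exact definition):

```python
def window_mean(daily_conc, conception_day, start_week, end_week):
    """Mean ambient concentration over a gestational-age exposure window.

    daily_conc: list of daily concentrations indexed by calendar day.
    conception_day: index of the first day of gestation.
    start_week/end_week: 1-based, inclusive gestational weeks."""
    start = conception_day + (start_week - 1) * 7
    end = conception_day + end_week * 7          # exclusive slice bound
    window = daily_conc[start:end]
    return sum(window) / len(window)
```

    Comparing `window_mean(series, d, 1, 12)` with `window_mean(series, d, 3, 8)` for each mother, then re-ranking into quartiles, reproduces the kind of reclassification analysis the study reports.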

  7. A one- and two-dimensional cross-section sensitivity and uncertainty path of the AARE (Advanced Analysis for Reactor Engineering) modular code system

    SciTech Connect

    Davidson, J.W.; Dudziak, D.J.; Higgs, C.E.; Stepanek, J.

    1988-01-01

    AARE, a code package to perform Advanced Analysis for Reactor Engineering, is a linked modular system for fission reactor core and shielding analysis, as well as fusion blanket analysis. Its cross-section sensitivity and uncertainty path presently includes the cross-section processing and reformatting code TRAMIX, the cross-section homogenization and library reformatting code MIXIT, the 1-dimensional transport code ONEDANT, the 2-dimensional transport code TRISM, and the 1- and 2-dimensional cross-section sensitivity and uncertainty code SENSIBL. In the present work, a short description of the whole AARE system is given, followed by a detailed description of the cross-section sensitivity and uncertainty path. 23 refs., 2 figs.

  8. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    SciTech Connect

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  9. Uncertainty in Estimates of the Apparent Temperature Sensitivity of Peatland Dissolved Organic Carbon Fluxes under Changing Hydrologic Conditions

    NASA Astrophysics Data System (ADS)

    Clark, J. M.; Ballard, C. E.; Ireson, A. M.; Buytaert, W.; Wheater, H. S.; Rose, R.

    2010-12-01

    Peatlands cover ca. 3% of the land surface yet account for ca. 30% of the global soil carbon sink. As climate conditions are known to control carbon accumulation in peatlands, future projections of increased temperatures and decreased summer precipitation could alter peatland carbon fluxes by influencing the water table dynamics in these saturated soils. Dissolved organic carbon (DOC) is a small but significant peatland carbon flux that can alter the balance between net carbon sink or source. In spite of this importance, DOC fluxes are often overlooked in both measured and modelled estimates of peatland carbon budgets. Integration of DOC fluxes into peatland carbon models is hindered because limited data are available to quantify production rates and their sensitivity to changes in temperature and water table, particularly in comparison to carbon dioxide (CO2) fluxes. The few data that exist are largely based on laboratory incubation experiments, and it is unclear whether these laboratory-derived values are comparable with the apparent temperature sensitivities observed in the field. Here, we present new analysis of long-term monitoring data from a British peatland site (Moor House), where we estimate the apparent temperature sensitivity of net DOC production using observed DOC concentrations and soil temperatures and estimates of soil water content from a hydrological model. Our estimates take into account uncertainties from both the observational data and the hydrological model. Our aim is to determine whether our laboratory-derived Q10 values of net DOC production for this site, 1.84 under saturated and 3.53 under unsaturated conditions, are comparable with values derived from the field monitoring data. If correct, these Q10 values suggest that DOC fluxes could increase under warmer and drier conditions.

  10. Sensitivity and uncertainty investigations for Hiroshima dose estimates and the applicability of the Little Boy mockup measurements

    SciTech Connect

    Bartine, D.E.; Cacuci, D.G.

    1983-09-13

    This paper describes sources of uncertainty in the data used for calculating dose estimates for the Hiroshima explosion and details a methodology for systematically obtaining best estimates and reduced uncertainties for the radiation doses received. (ACR)

  11. Robust Adaptation? Assessing the sensitivity of safety margins in flood defences to uncertainty in future simulations - a case study from Ireland.

    NASA Astrophysics Data System (ADS)

    Murphy, Conor; Bastola, Satish; Sweeney, John

    2013-04-01

    Climate change impact and adaptation assessments have traditionally adopted a 'top-down' scenario-based approach, where information from different Global Climate Models (GCMs) and emission scenarios is employed to develop impacts-led adaptation strategies. Given the trade-off between computational cost and the need to include a wide range of GCMs for a fuller characterization of uncertainties, scenarios are better used for sensitivity testing and adaptation options appraisal. One common approach to adaptation that has been described as robust is the use of safety margins. In this work, the sensitivity of the safety margins adopted by the agency responsible for flood risk management in Ireland to the uncertainty in future projections is examined. The sensitivity of fluvial flood risk to climate change is assessed for four Irish catchments using a large number of GCMs (17) forced with three emissions scenarios (SRES A1B, A2, B1) as input to four hydrological models. Both uncertainty within and between hydrological models is assessed using the GLUE framework. Regionalisation is achieved using a change factor method to infer changes in the parameters of a weather generator from monthly GCM output, while flood frequency analysis is conducted using the method of probability weighted moments to fit the Generalised Extreme Value distribution to ~20,000 annual maxima series. The sensitivity of design margins to the uncertainty space considered is visualised using risk response surfaces. The hydrological sensitivity is measured as the percentage change in flood peak for specified recurrence intervals. Results indicate that there is considerable residual risk associated with allowances of +20% when uncertainties are accounted for, and that the risk of exceedance of design allowances is greatest for more extreme, low-frequency events, with considerable implications for critical infrastructure, e.g., culverts, bridges, flood defences whose designs are normally
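    The flood frequency step described above maps Generalised Extreme Value (GEV) parameters to a return level for a chosen recurrence interval, which can then be checked against a design allowance. A hedged sketch of that calculation (standard GEV quantile formula; the +20% allowance check mirrors the safety margin discussed above, but the function names and fitting step are assumptions — the study fits parameters by probability weighted moments, which is not shown here):

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """Return level z_T for recurrence interval T years from GEV
    parameters (location mu, scale sigma, shape xi), i.e. the quantile
    with annual non-exceedance probability 1 - 1/T. Gumbel limit at xi=0."""
    p = 1.0 - 1.0 / T
    if abs(xi) < 1e-9:
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

def exceeds_design_allowance(z_future, z_present, allowance=0.20):
    """True if a projected flood peak exceeds the present-day peak
    scaled up by the safety-margin allowance (default +20%)."""
    return z_future > z_present * (1.0 + allowance)
```

    Applying `gev_return_level` across the ensemble of fitted parameter sets, and counting how often `exceeds_design_allowance` is true, gives the kind of residual-risk estimate the results above describe.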

  12. A Single Bout of Aerobic Exercise Reduces Anxiety Sensitivity But Not Intolerance of Uncertainty or Distress Tolerance: A Randomized Controlled Trial.

    PubMed

    LeBouthillier, Daniel M; Asmundson, Gordon J G

    2015-01-01

    Several mechanisms have been posited for the anxiolytic effects of exercise, including reductions in anxiety sensitivity through interoceptive exposure. Studies on aerobic exercise lend support to this hypothesis; however, research investigating aerobic exercise in comparison to placebo, the dose-response relationship between aerobic exercise and anxiety sensitivity, the efficacy of aerobic exercise across the spectrum of anxiety sensitivity, and the effect of aerobic exercise on other related constructs (e.g. intolerance of uncertainty, distress tolerance) is lacking. We explored reductions in anxiety sensitivity and related constructs following a single session of exercise in a community sample using a randomized controlled trial design. Forty-one participants completed 30 min of aerobic exercise or a placebo stretching control. Anxiety sensitivity, intolerance of uncertainty and distress tolerance were measured at baseline, post-intervention and 3-day and 7-day follow-ups. Individuals in the aerobic exercise group, but not the control group, experienced significant reductions with moderate effect sizes in all dimensions of anxiety sensitivity. Intolerance of uncertainty and distress tolerance remained unchanged in both groups. Our trial supports the efficacy of aerobic exercise in uniquely reducing anxiety sensitivity in individuals with varying levels of the trait and highlights the importance of empirically validating the use of aerobic exercise to address specific mental health vulnerabilities. Aerobic exercise may have potential as a temporary substitute for psychotherapy aimed at reducing anxiety-related psychopathology. PMID:25874370

  14. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes.

    PubMed

    Naujokaitis-Lewis, Ilona; Curtis, Janelle M R

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along

  16. Advances in global sensitivity analyses of demographic-based species distribution models to address uncertainties in dynamic landscapes

    PubMed Central

    Curtis, Janelle M.R.

    2016-01-01

    Developing a rigorous understanding of multiple global threats to species persistence requires the use of integrated modeling methods that capture processes which influence species distributions. Species distribution models (SDMs) coupled with population dynamics models can incorporate relationships between changing environments and demographics and are increasingly used to quantify relative extinction risks associated with climate and land-use changes. Despite their appeal, uncertainties associated with complex models can undermine their usefulness for advancing predictive ecology and informing conservation management decisions. We developed a computationally-efficient and freely available tool (GRIP 2.0) that implements and automates a global sensitivity analysis of coupled SDM-population dynamics models for comparing the relative influence of demographic parameters and habitat attributes on predicted extinction risk. Advances over previous global sensitivity analyses include the ability to vary habitat suitability across gradients, as well as habitat amount and configuration of spatially-explicit suitability maps of real and simulated landscapes. Using GRIP 2.0, we carried out a multi-model global sensitivity analysis of a coupled SDM-population dynamics model of whitebark pine (Pinus albicaulis) in Mount Rainier National Park as a case study and quantified the relative influence of input parameters and their interactions on model predictions. Our results differed from the one-at-a-time analyses used in the original study, and we found that the most influential parameters included the total amount of suitable habitat within the landscape, survival rates, and effects of a prevalent disease, white pine blister rust. Strong interactions between habitat amount and survival rates of older trees suggest the importance of habitat in mediating the negative influences of white pine blister rust. Our results underscore the importance of considering habitat attributes along

  17. Cleaning up Floor Care.

    ERIC Educational Resources Information Center

    Carr, Richard; McLean, Doug

    1995-01-01

    Discusses how educational-facility maintenance departments can cut costs in floor cleaning through careful evaluation of floor equipment and products. Tips for choosing carpet detergents are highlighted. (GR)

  18. Mixed-Up Floors.

    ERIC Educational Resources Information Center

    Shaw, Richard

    2001-01-01

    Examines the maintenance management problems inherent in cleaning multiple flooring materials revealing the need for school officials to keep it simple when choosing flooring types. Also highlighted is a carpet recycling program used by Wright State University (Ohio). (GR)

  19. Use of Sensitivity and Uncertainty Analysis in the Design of Reactor Physics and Criticality Benchmark Experiments for Advanced Nuclear Fuel

    SciTech Connect

    Rearden, B.T.; Anderson, W.J.; Harms, G.A.

    2005-08-15

    Framatome ANP, Sandia National Laboratories (SNL), Oak Ridge National Laboratory (ORNL), and the University of Florida are cooperating on the U.S. Department of Energy Nuclear Energy Research Initiative (NERI) project 2001-0124 to design, assemble, execute, analyze, and document a series of critical experiments to validate reactor physics and criticality safety codes for the analysis of commercial power reactor fuels consisting of UO₂ with ²³⁵U enrichments ≥ 5 wt%. The experiments will be conducted at the SNL Pulsed Reactor Facility. Framatome ANP and SNL produced two series of conceptual experiment designs based on typical parameters, such as fuel-to-moderator ratios, that meet the programmatic requirements of this project within the given constraints on available materials and facilities. ORNL used the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) to assess, from a detailed physics-based perspective, the similarity of the experiment designs to the commercial systems they are intended to validate. Based on the results of the TSUNAMI analysis, one series of experiments was found to be preferable to the other and will provide significant new data for the validation of reactor physics and criticality safety codes.

  20. Uncertainty and sensitivity analyses for gas and brine migration at the Waste Isolation Pilot Plant, May 1992

    SciTech Connect

    Helton, J.C.; Bean, J.E.; Butcher, B.M.; Garner, J.W.; Vaughn, P.; Schreiber, J.D.; Swift, P.N.

    1993-08-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis, stepwise regression analysis and examination of scatterplots are used in conjunction with the BRAGFLO model to examine two-phase (i.e., gas and brine) flow at the Waste Isolation Pilot Plant (WIPP), which is being developed by the US Department of Energy as a disposal facility for transuranic waste. The analyses consider either a single waste panel or the entire repository in conjunction with the following cases: (1) fully consolidated shaft, (2) system of shaft seals with panel seals, and (3) single shaft seal without panel seals. The purpose of this analysis is to develop insights on factors that are potentially important in showing compliance with applicable regulations of the US Environmental Protection Agency (i.e., 40 CFR 191, Subpart B; 40 CFR 268). The primary topics investigated are (1) gas production due to corrosion of steel, (2) gas production due to microbial degradation of cellulosics, (3) gas migration into anhydrite marker beds in the Salado Formation, (4) gas migration through a system of shaft seals to overlying strata, and (5) gas migration through a single shaft seal to overlying strata. Important variables identified in the analyses include initial brine saturation of the waste, stoichiometric terms for corrosion of steel and microbial degradation of cellulosics, gas barrier pressure in the anhydrite marker beds, shaft seal permeability, and panel seal permeability.
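
    The sampling-based workflow described above (a Latin hypercube design over the uncertain inputs, followed by a correlation or regression screen of the outputs) can be sketched in a few lines. The toy response function below is a hypothetical stand-in for a BRAGFLO-style simulator, not the actual WIPP code; the parameter names and coefficients are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(n_samples, n_params, rng):
    """Stratified sampling: exactly one draw per equal-probability stratum
    in each parameter dimension, with strata paired randomly across columns."""
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        rng.shuffle(u[:, j])  # shuffle stratum order independently per column
    return u  # uniform [0, 1) design matrix, stratified column-wise

# Hypothetical stand-in for the simulator: gas migration as a function of
# (corrosion rate, biodegradation rate, seal permeability), plus noise.
def toy_model(x, rng):
    return 3.0 * x[:, 0] + 1.5 * x[:, 1] + 0.1 * x[:, 2] + 0.05 * rng.standard_normal(len(x))

X = latin_hypercube(200, 3, rng)
y = toy_model(X, rng)

# A simple correlation screen ranks input influence; the report additionally
# used partial correlations and stepwise regression on such samples.
corr = [float(np.corrcoef(X[:, j], y)[0, 1]) for j in range(3)]
```

    Here `corr[0]` dominates, flagging the first input as most influential; applied to the real model, the same screen singles out variables like those listed at the end of the abstract.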

  1. A PROBABILISTIC ARSENIC EXPOSURE ASSESSMENT FOR CHILDREN WHO CONTACT CHROMATED COPPER ARSENATE (CCA)-TREATED PLAYSETS AND DECKS: PART 2 SENSITIVITY AND UNCERTAINTY ANALYSIS

    EPA Science Inventory

    A probabilistic model (SHEDS-Wood) was developed to examine children's exposure and dose to chromated copper arsenate (CCA)-treated wood, as described in Part 1 of this two part paper. This Part 2 paper discusses sensitivity and uncertainty analyses conducted to assess the key m...

  2. FIRST FLOOR FRONT ROOM. SECOND FLOOR HAS BEEN REMOVED NOTE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FIRST FLOOR FRONT ROOM. SECOND FLOOR HAS BEEN REMOVED -- NOTE PRESENCE OF SECOND FLOOR WINDOWS (THE LATTER FLOOR WAS REMOVED MANY YEARS AGO). See also PA-1436 B-12 - Kid-Physick House, 325 Walnut Street, Philadelphia, Philadelphia County, PA

  3. Using global sensitivity analysis to evaluate the uncertainties of future shoreline changes under the Bruun rule assumption

    NASA Astrophysics Data System (ADS)

    Le Cozannet, Gonéri; Oliveros, Carlos; Castelle, Bruno; Garcin, Manuel; Idier, Déborah; Pedreros, Rodrigo; Rohmer, Jeremy

    2016-04-01

    Future sandy shoreline changes are often assessed by summing the contributions of longshore and cross-shore effects. In such approaches, a contribution of sea-level rise can be incorporated by adding a supplementary term based on the Bruun rule. Here, our objective is to identify where and when the use of the Bruun rule can be (in)validated, in the case of wave-exposed beaches with gentle slopes. We first provide shoreline change scenarios that account for all uncertain hydrosedimentary processes affecting the idealized low- and high-energy coasts described by Stive (2004) [Stive, M. J. F., 2004. How important is global warming for coastal erosion? An editorial comment. Climatic Change, vol. 64, no. 1-2, doi:10.1023/B:CLIM.0000024785.91858. ISSN 0165-0009]. Then, we generate shoreline change scenarios based on probabilistic sea-level rise projections from the IPCC. For scenarios RCP 6.0 and 8.5 and in the absence of coastal defenses, the model predicts an observable shift toward generalized beach erosion by the middle of the 21st century. On the contrary, the model predictions are unlikely to differ from the current situation under scenario RCP 2.6. To get insight into the relative importance of each source of uncertainty, we quantify each contribution to the variance of the model outcome using a global sensitivity analysis. This analysis shows that by the end of the 21st century, a large part of shoreline change uncertainties are due to the climate change scenario if all anthropogenic greenhouse-gas emission scenarios are considered equiprobable. To conclude, the analysis shows that under the assumptions above, (in)validating the Bruun rule should be straightforward during the second half of the 21st century and for the RCP 8.5 scenario. Conversely, for RCP 2.6, the noise in shoreline change evolution should continue dominating the signal due to the Bruun effect. This last conclusion can be interpreted as an important potential benefit of climate change mitigation.
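
    For reference, the Bruun rule that the study stress-tests is a simple proportionality between sea-level rise and shoreline retreat: the rise divided by the mean slope of the active beach profile. The sketch below uses illustrative numbers, not values from the paper:

```python
def bruun_retreat(sea_level_rise_m, profile_width_m, berm_height_m, closure_depth_m):
    """Bruun rule: retreat R = S * W / (B + h), i.e. sea-level rise S divided
    by the average slope (B + h) / W of the active profile of width W."""
    return sea_level_rise_m * profile_width_m / (berm_height_m + closure_depth_m)

# Illustrative inputs: 0.5 m of rise, 500 m active profile,
# 2 m berm height, 8 m closure depth.
retreat = bruun_retreat(0.5, 500.0, 2.0, 8.0)  # -> 25.0 m of retreat
```

    A mild rise over a gently sloping profile thus translates into tens of metres of retreat, which is why the Bruun term can eventually dominate the other shoreline-change contributions.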

  4. SU-E-T-146: Effects of Uncertainties of Radiation Sensitivity of Biological Modelling for Treatment Planning

    SciTech Connect

    Oita, M; Uto, Y; Hori, H; Tominaga, M; Sasaki, M

    2014-06-01

    Purpose: The aim of this study was to evaluate the distribution of uncertainty of cell survival under radiation, and to assess the usefulness of a stochastic biological model assuming Gaussian-distributed parameters. Methods: For single-cell experiments, exponentially growing cells were harvested from standard cell culture dishes by trypsinization and suspended in test tubes containing 1 ml of MEM (2×10⁶ cells/ml). The hypoxic cultures were treated with 95% N₂-5% CO₂ gas for 30 minutes. In vitro radiosensitization was also measured in EMT6/KU single cells by adding a radiosensitizer under hypoxic conditions. X-ray irradiation was carried out using an X-ray unit (Hitachi, model MBR-1505R3) with a 0.5 mm Al/1.0 mm Cu filter at 150 kV and 4 Gy/min. In the in vitro assay, cells on the dish were irradiated with doses from 1 Gy to 24 Gy. After irradiation, colony formation assays were performed. Variations of biological parameters were investigated for standard cell culture (n=16), hypoxic cell culture (n=45) and hypoxic cell culture with radiosensitizers (n=21), respectively. The data were obtained on separate schedules to account for the variation of radiation sensitivity over the cell cycle. Results: For standard cell culture, hypoxic cell culture and hypoxic cell culture with radiosensitizers, the median and standard deviation of the alpha/beta ratio were 37.1±73.4 Gy, 9.8±23.7 Gy and 20.7±21.9 Gy, respectively. The average and standard deviation of D₅₀ were 2.5±2.5 Gy, 6.1±2.2 Gy and 3.6±1.3 Gy, respectively. Conclusion: In this study, we applied these parameter uncertainties to the biological model. The variations in alpha values, beta values and D₅₀, as well as in cell-culture conditions, may strongly affect the predicted probability of cell death. Further research is in progress toward precise prediction of cell death as well as tumor control probability for treatment planning.
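
    The alpha/beta ratios and D₅₀ values above are statistics of the linear-quadratic (LQ) survival model. A minimal sketch of propagating Gaussian parameter uncertainty through the LQ model follows; the parameter values are illustrative assumptions, not the EMT6/KU fits reported in the abstract:

```python
import numpy as np

def lq_survival(dose_gy, alpha, beta):
    """Linear-quadratic cell-survival model: S(D) = exp(-(alpha*D + beta*D^2))."""
    return np.exp(-(alpha * dose_gy + beta * dose_gy**2))

# Propagate Gaussian uncertainty in alpha and beta through to survival at 2 Gy
# by Monte Carlo sampling (illustrative means and spreads).
rng = np.random.default_rng(1)
alpha = rng.normal(0.15, 0.03, 10_000)  # Gy^-1
beta = rng.normal(0.05, 0.01, 10_000)   # Gy^-2
s2 = lq_survival(2.0, alpha, beta)
mean_s, sd_s = float(s2.mean()), float(s2.std())
```

    The spread `sd_s` shows how parameter uncertainty alone blurs the predicted surviving fraction, which is the kind of distribution the authors propose feeding into treatment-planning models.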

  5. Quasi-Monte Carlo based global uncertainty and sensitivity analysis in modeling free product migration and recovery from petroleum-contaminated aquifers.

    PubMed

    He, Li; Huang, Gordon; Lu, Hongwei; Wang, Shuo; Xu, Yi

    2012-06-15

    This paper presents a global uncertainty and sensitivity analysis (GUSA) framework based on global sensitivity analysis (GSA) and generalized likelihood uncertainty estimation (GLUE) methods. Quasi-Monte Carlo (QMC) sampling is employed by GUSA to obtain realizations of uncertain parameters, which are then input to the simulation model for analysis. Compared to GLUE, GUSA can not only evaluate global sensitivity and uncertainty of modeling parameter sets, but also quantify the uncertainty in modeling prediction sets. Another advantage of GUSA lies in the alleviation of computational effort, since globally-insensitive parameters can be identified and removed from the uncertain-parameter set. GUSA is applied to a practical petroleum-contaminated site in Canada to investigate free product migration and recovery processes under aquifer remediation operations. Results from global sensitivity analysis show that (1) initial free product thickness has the most significant impact on total recovery volume but the least impact on residual free product thickness and recovery rate; (2) total recovery volume and recovery rate are sensitive to residual LNAPL phase saturations and soil porosity. Results from uncertainty predictions reveal that the residual thickness would remain high and almost unchanged after about half a year of the skimmer-well scheme; the rather high residual thickness (0.73-1.56 m after 20 years) indicates that natural attenuation would not be suitable for the remediation. The largest total recovery volume would be from water pumping, followed by vacuum pumping, and then the skimmer scheme. The recovery rates of the three schemes would rapidly decrease after 2 years (less than 0.05 m³/day), thus short-term remediation is not suggested.
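
    Quasi-Monte Carlo methods like the one GUSA employs replace pseudo-random draws with low-discrepancy sequences that fill parameter space more evenly than random sampling. A self-contained sketch using a Halton sequence, a common QMC construction (the abstract does not specify which sequence the authors used):

```python
def van_der_corput(n, base):
    """n-th term of the van der Corput low-discrepancy sequence in `base`:
    reflect the base-`base` digits of n about the radix point."""
    q, denom = 0.0, 1.0
    while n > 0:
        denom *= base
        n, rem = divmod(n, base)
        q += rem / denom
    return q

def halton(n_points, bases=(2, 3)):
    """Halton quasi-random points: one van der Corput sequence per dimension,
    using coprime bases to decorrelate the dimensions."""
    return [[van_der_corput(i + 1, b) for b in bases] for i in range(n_points)]

pts = halton(16)  # 16 evenly spread points in the unit square
```

    Each point would then be mapped through the inverse CDF of a parameter's distribution to obtain the realizations fed to the simulation model.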

  6. Development code for sensitivity and uncertainty analysis of input on the MCNPX for neutronic calculation in PWR core

    SciTech Connect

    Hartini, Entin; Andiwijayakusuma, Dinan

    2014-09-30

    This research develops a code for sensitivity and uncertainty analysis based on a statistical approach to assessing uncertain input parameters. In the burn-up calculation of fuel, uncertainty analysis was performed for the input parameters fuel density, coolant density and fuel temperature. The calculation is performed during irradiation using the Monte Carlo N-Particle Transport code (MCNPX). The uncertainty method is based on probability density functions. The code is implemented as a Python script that couples with MCNPX for criticality and burn-up calculations. The simulation models the geometry of a PWR core with MCNPX at a power of 54 MW with UO2 pellet fuel. The calculation uses the continuous-energy cross-section library ENDF/B-VI. MCNPX requires nuclear data in ACE format; interfaces were developed to obtain ACE-format nuclear data from ENDF through NJOY processing for temperature changes over a certain range.

  7. Sensitivity of a radiative transfer model to the uncertainty in the aerosol optical depth used as input

    NASA Astrophysics Data System (ADS)

    Román, Roberto; Bilbao, Julia; de Miguel, Argimiro; Pérez-Burgos, Ana

    2014-05-01

    Radiative transfer models can be used to obtain solar radiative quantities at the Earth's surface, such as the erythemal ultraviolet (UVER) irradiance, which is the spectral irradiance weighted with the erythemal (sunburn) action spectrum, and the total shortwave irradiance (SW; 305-2800 nm). Aerosol and atmospheric properties are necessary as model inputs in order to calculate the UVER and SW irradiances under cloudless conditions; however, the uncertainty in these inputs propagates into the simulations. The objective of this work is to quantify the uncertainty in UVER and SW simulations generated by the aerosol optical depth (AOD) uncertainty. Data from different satellite retrievals were downloaded for nine Spanish sites in the Iberian Peninsula: total ozone column from different databases, spectral surface albedo and water vapour column from the MODIS instrument, AOD at 443 nm and Ångström exponent (between 443 nm and 670 nm) from the MISR instrument onboard the Terra satellite, and single scattering albedo from the OMI instrument onboard the Aura satellite. The MISR AOD at 443 nm was compared with AERONET measurements at six Spanish sites, yielding an uncertainty in the MISR AOD of 0.074. In this work the radiative transfer model UVSPEC/libRadtran (version 1.7) was used to obtain the SW and UVER irradiance under cloudless conditions for each month and for different solar zenith angles (SZA) in the nine locations. The inputs used for these simulations were monthly climatology tables obtained from the available data at each location. Once the UVER and SW simulations were obtained, they were repeated twice, replacing the monthly AOD values by the same AOD plus or minus its uncertainty. The maximum difference between the irradiance run with AOD and the irradiance run with AOD plus/minus its uncertainty was calculated for each month, SZA, and location. This difference was considered to be the uncertainty in the model caused by the AOD.
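
    The perturbation scheme described (re-running the model with AOD shifted by plus/minus its uncertainty and taking the largest deviation) can be sketched with a toy Beer-Lambert-style irradiance model. The model form and numbers below are illustrative assumptions, not UVSPEC/libRadtran:

```python
import math

def aod_uncertainty_envelope(model, aod, sigma):
    """Largest absolute change in model output when AOD is shifted by
    +/- one uncertainty, taken as the AOD-induced simulation uncertainty."""
    base = model(aod)
    return max(abs(model(aod + sigma) - base), abs(model(aod - sigma) - base))

# Toy direct-beam attenuation at a fixed solar zenith angle (illustrative).
def irradiance(tau):
    return 1000.0 * math.exp(-tau / 0.8)  # W m^-2

unc = aod_uncertainty_envelope(irradiance, 0.2, 0.074)  # 0.074 = MISR AOD uncertainty from the text
```

    Because the attenuation is nonlinear, the plus and minus shifts produce different deviations, which is exactly why the method takes the maximum of the two.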

  8. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 4: Uncertainty and sensitivity analyses for 40 CFR 191, Subpart B

    SciTech Connect

    Not Available

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to the EPA's Environmental Protection Standards for Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Additional information about the 1992 PA is provided in other volumes. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions, the choice of parameters selected for sampling, and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect compliance with 40 CFR 191B are: drilling intensity, intrusion borehole permeability, halite and anhydrite permeabilities, radionuclide solubilities and distribution coefficients, fracture spacing in the Culebra Dolomite Member of the Rustler Formation, porosity of the Culebra, and spatial variability of Culebra transmissivity. Performance with respect to 40 CFR 191B is insensitive to uncertainty in other parameters; however, additional data are needed to confirm that reality lies within the assigned distributions.

  9. Pre-waste-emplacement ground-water travel time sensitivity and uncertainty analyses for Yucca Mountain, Nevada; Yucca Mountain Site Characterization Project

    SciTech Connect

    Kaplan, P.G.

    1993-01-01

    Yucca Mountain, Nevada is a potential site for a high-level radioactive-waste repository. Uncertainty and sensitivity analyses were performed to estimate critical factors in the performance of the site with respect to a criterion in terms of pre-waste-emplacement ground-water travel time. The degree of failure in the analytical model to meet the criterion is sensitive to the estimate of fracture porosity in the upper welded unit of the problem domain. Fracture porosity is derived from a number of more fundamental measurements including fracture frequency, fracture orientation, and the moisture-retention characteristic inferred for the fracture domain.

  10. Floors: Care and Maintenance.

    ERIC Educational Resources Information Center

    Post Office Dept., Washington, DC.

    Guidelines, methods and policies regarding the care and maintenance of post office building floors are overviewed in this handbook. Procedures outlined are concerned with maintaining a required level of appearance without wasting manpower. Flooring types and characteristics and the particular cleaning requirements of each type are given along with…

  11. School Flooring Factors

    ERIC Educational Resources Information Center

    McGrath, John

    2012-01-01

    With all of the hype that green building is receiving throughout the school facility-management industry, it's easy to overlook some elements that may not be right in front of a building manager's nose. It is helpful to examine the role floor covering plays in a green building project. Flooring is one of the most significant and important systems…

  12. Maximizing Hard Floor Maintenance.

    ERIC Educational Resources Information Center

    Steger, Michael

    2000-01-01

    Explains the maintenance options available for hardwood flooring that can help ensure long life cycles and provide inviting spaces. Developing a maintenance system, knowing the type of traffic that the floor must endure, using entrance matting, and adhering to manufacturers guidelines are discussed. Daily, monthly or quarterly, and long-term…

  13. Preliminary performance assessment for the Waste Isolation Pilot Plant, December 1992. Volume 5, Uncertainty and sensitivity analyses of gas and brine migration for undisturbed performance

    SciTech Connect

    Not Available

    1993-08-01

    Before disposing of transuranic radioactive waste in the Waste Isolation Pilot Plant (WIPP), the United States Department of Energy (DOE) must evaluate compliance with applicable long-term regulations of the United States Environmental Protection Agency (EPA). Sandia National Laboratories is conducting iterative performance assessments (PAs) of the WIPP for the DOE to provide interim guidance while preparing for a final compliance evaluation. This volume of the 1992 PA contains results of uncertainty and sensitivity analyses with respect to migration of gas and brine from the undisturbed repository. Additional information about the 1992 PA is provided in other volumes. Volume 1 contains an overview of WIPP PA and results of a preliminary comparison with 40 CFR 191, Subpart B. Volume 2 describes the technical basis for the performance assessment, including descriptions of the linked computational models used in the Monte Carlo analyses. Volume 3 contains the reference data base and values for input parameters used in consequence and probability modeling. Volume 4 contains uncertainty and sensitivity analyses with respect to the EPA's Environmental Standards for the Management and Disposal of Spent Nuclear Fuel, High-Level and Transuranic Radioactive Wastes (40 CFR 191, Subpart B). Finally, guidance derived from the entire 1992 PA is presented in Volume 6. Results of the 1992 uncertainty and sensitivity analyses indicate that, conditional on the modeling assumptions and the assigned parameter-value distributions, the most important parameters for which uncertainty has the potential to affect gas and brine migration from the undisturbed repository are: initial liquid saturation in the waste, anhydrite permeability, biodegradation-reaction stoichiometry, gas-generation rates for both corrosion and biodegradation under inundated conditions, and the permeability of the long-term shaft seal.

  14. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in the MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high-tier exposure modelling tool allowing propagation of uncertainty in the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions describing the data input, and its effect on model results was examined by applying sensitivity analysis techniques (the screening Morris method, regression analysis, and the variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated the feasibility of successfully employing MERLIN-Expo in integrated, high-tier exposure assessment. PMID:27432731
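
    Of the sensitivity techniques named above, the Morris screening method is the simplest to sketch: it averages absolute "elementary effects" from one-at-a-time perturbations at random base points. The response function below is a hypothetical stand-in, not MERLIN-Expo:

```python
import numpy as np

def morris_mu_star(model, n_params, n_trajectories, delta, rng):
    """Morris screening: mean absolute elementary effect (mu*) per parameter,
    from one-at-a-time steps of size `delta` at random base points."""
    effects = [[] for _ in range(n_params)]
    for _ in range(n_trajectories):
        x = rng.random(n_params) * (1.0 - delta)  # keep x + delta inside [0, 1)
        y0 = model(x)
        for j in range(n_params):
            xp = x.copy()
            xp[j] += delta
            effects[j].append((model(xp) - y0) / delta)
    return [float(np.mean(np.abs(e))) for e in effects]

# Hypothetical exposure response: blood concentration driven strongly by
# half-life (x0), moderately by body weight (x1), weakly by intake (x2).
response = lambda x: 4.0 * x[0] + 1.0 * x[1] + 0.1 * x[2]
mu_star = morris_mu_star(response, 3, 50, 0.25, np.random.default_rng(2))
```

    Ranking parameters by `mu_star` gives the influence ordering; variance-based methods like EFAST then refine that screen for the parameters that survive it.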

  16. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    USGS Publications Warehouse

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization, and potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). 
The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty
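
The inverse-modelling step that FME contributes can be illustrated on a toy problem. The sketch below calibrates the recession constant of a hypothetical linear reservoir by brute-force SSE minimisation; it merely stands in for a gradient-based optimiser driving a full SWAT run, and none of the names or numbers come from the paper:

```python
# Toy analogue of an inverse-modelling (calibration) loop: recover the
# recession constant k of a linear reservoir from "observed" streamflow.

def simulate(k, q0=10.0, steps=8):
    """Linear reservoir: q[t+1] = (1 - k) * q[t]."""
    q = [q0]
    for _ in range(steps - 1):
        q.append((1.0 - k) * q[-1])
    return q

observed = simulate(0.3)               # synthetic truth, k = 0.3

def sse(k):
    """Sum of squared errors between simulated and observed flows."""
    return sum((s - o) ** 2 for s, o in zip(simulate(k), observed))

# Brute-force search over the feasible range, standing in for a proper
# optimiser; the minimum recovers the true parameter.
best_k = min((round(i * 0.01, 2) for i in range(1, 100)), key=sse)
```

The same pattern scales up: replace `simulate` with a call that writes input files, runs the external model, and reads its output.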

  17. FIRST FLOOR REAR ROOM. SECOND FLOOR HAS BEEN REMOVED NOTE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FIRST FLOOR REAR ROOM. SECOND FLOOR HAS BEEN REMOVED-- NOTE PRESENCE OF SECOND FLOOR WINDOWS AT LEFT. See also PA-1436 B-6 - Kid-Physick House, 325 Walnut Street, Philadelphia, Philadelphia County, PA

  18. FIRST FLOOR REAR ROOM. SECOND FLOOR HAS BEEN REMOVED NOTE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FIRST FLOOR REAR ROOM. SECOND FLOOR HAS BEEN REMOVED-- NOTE PRESENCE OF SECOND FLOOR WINDOWS AT LEFT. See also PA-1436 B-13 - Kid-Physick House, 325 Walnut Street, Philadelphia, Philadelphia County, PA

  19. An index of parameter reproducibility accounting for estimation uncertainty: theory and case study on β-cell responsivity and insulin sensitivity.

    PubMed

    Dalla Man, Chiara; Pillonetto, Gianluigi; Riz, Michela; Cobelli, Claudio

    2015-06-01

    Parameter reproducibility is necessary to perform longitudinal studies where parameters are assessed to monitor disease progression or effect of therapy but is also useful in powering the study, i.e., to define how many subjects should be studied to observe a given effect. The assessment of parameter reproducibility is usually accomplished by methods that do not take into account the fact that these parameters are estimated with uncertainty. This is particularly relevant in physiological and clinical studies where usually reproducibility cannot be assessed by multiple testing and is usually assessed from a single replication of the test. Working in a suitable stochastic framework, here we propose a new index (S) to measure reproducibility that takes into account parameter uncertainty and is particularly suited to handle the normal testing conditions of physiological and clinical investigations. Simulation results prove that S, by properly taking into account parameter uncertainty, is more accurate and robust than the methods available in the literature. The new metric is applied to assess reproducibility of insulin sensitivity and β-cell responsivity of a mixed-meal tolerance test from data obtained in the same subjects retested 1 wk apart. Results show that the indices of insulin sensitivity and β-cell responsivity to glucose are well reproducible. We conclude that the oral minimal models provide useful indices that can be used safely in prospective studies or to assess the efficacy of a given therapy.
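
Why estimation uncertainty matters for reproducibility can be shown with a generic test-retest sketch. This is NOT the paper's S index (whose definition is not reproduced here); it only illustrates the underlying point, with hypothetical numbers:

```python
# Two test-retest parameter estimates should be judged concordant only
# relative to their standard errors, not by their raw difference alone.
import math

def z_discordance(est1, se1, est2, se2):
    """Test-retest difference in units of its combined standard error."""
    return abs(est1 - est2) / math.sqrt(se1 ** 2 + se2 ** 2)

# Same raw difference (2.0 units), but only the precisely estimated pair
# is genuinely discordant; the noisy pair is statistically compatible.
precise = z_discordance(10.0, 0.5, 12.0, 0.5)   # large z: discordant
noisy = z_discordance(10.0, 3.0, 12.0, 3.0)     # small z: compatible
```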

  20. Global sensitivity analysis of Leaf-Canopy radiative transfer Model for analysis and quantification of uncertainties in remote sensed data product generation

    NASA Astrophysics Data System (ADS)

    Furfaro, R.; Morris, R. D.; Kottas, A.; Taddy, M.; Ganapol, B. D.

    2007-12-01

    Analyzing, quantifying and reporting the uncertainty in remote sensed data products is critical for our understanding of Earth's coupled system. It is the only way in which the uncertainty of further analyses using these data products as inputs can be quantified. Analyzing the source of the data product uncertainties can identify where the models must be improved, or where better input information must be obtained. Here we focus on developing a probabilistic framework for analysis of uncertainties occurring when satellite data (e.g., MODIS) are employed to retrieve biophysical properties of vegetation. Indeed, the process of remotely estimating vegetation properties involves inverting a Radiative Transfer Model (RTM), as in the case of the MOD15 algorithm where seven atmospherically corrected reflectance factors are ingested and compared to a set of computed, RTM-based, reflectances (look-up table) to infer the Leaf Area Index (LAI). Since inversion is generally ill-conditioned, and since a-priori information is important in constraining the inverse model, sensitivity analysis plays a key role in defining which parameters have the greatest impact to the computed observation. We develop a framework to perform global sensitivity analysis, i.e., to determine how the output changes as all inputs vary continuously. We used a coupled Leaf-Canopy radiative transfer Model (LCM) to approximate the functional relationship between the observed reflectance and vegetation biophysical parameters. LCM was designed to study the feasibility of detecting leaf/canopy biochemistry using remote sensed observations and has the unique capability to include leaf biochemistry (e.g., chlorophyll, water, lignin, protein) as input parameters. 
The influence of LCM input parameters (including canopy morphological and biochemical parameters) on the hemispherical reflectance is captured by computing the "main effects", which give information about the influence of each input, and the "sensitivity

  1. Sensitivity-constrained nonlinear programming: a general approach for planning and design under parameter uncertainty and an application to treatment plant design. Final report

    SciTech Connect

    Uber, J.G.; Kao, J.J.; Brill, E.D.; Pfeffer, J.T.

    1988-01-01

    One important problem with using mathematical models is that parameter values, and thus the model results, are often uncertain. A general approach, Sensitivity Constrained Nonlinear Programming (SCNLP), was developed for extending nonlinear optimization models to include functions that depend on the system sensitivity to changes in parameter values. Such sensitivity-based functions include first-order measures of variance, reliability, and robustness. Thus SCNLP can be used to generate solutions or designs that are good with respect to modeled objectives, and that also reflect concerns about uncertainty in parameter values. A solution procedure and an implementation based on an existing nonlinear-programming code are presented. SCNLP was applied to a complex activated sludge waste-water treatment plant design problem.
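
The first-order variance measure that SCNLP can constrain follows the delta method: output variance is approximated from parameter sensitivities and parameter variances. A minimal sketch, with a hypothetical performance function rather than the report's treatment-plant model:

```python
# Delta-method (first-order) variance estimate of a model output:
# Var[f] ~= sum_i (df/dp_i)^2 * sigma_i^2, with sensitivities obtained
# by forward finite differences. All parameter values are illustrative.

def effluent(p):
    # Hypothetical performance function of two uncertain parameters.
    return 100.0 / (1.0 + p[0] * p[1])

def first_order_variance(f, p, sigma, h=1e-6):
    """First-order output variance from finite-difference sensitivities."""
    var = 0.0
    for i in range(len(p)):
        pp = list(p)
        pp[i] += h
        grad_i = (f(pp) - f(p)) / h        # df/dp_i
        var += grad_i ** 2 * sigma[i] ** 2
    return var

var_f = first_order_variance(effluent, p=[0.5, 2.0], sigma=[0.05, 0.2])
```

An optimizer can then treat `var_f` (or the corresponding standard deviation) as an additional constraint alongside the usual design constraints.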

  2. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Undisturbed conditions

    SciTech Connect

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.; GARNER,J.W.; MACKINNON,ROBERT J.; MILLER,JOEL D.; SCHREIBER,JAMES D.; VAUGHN,PALMER

    2000-05-19

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment for the Waste Isolation Pilot Plant are presented for two-phase flow in the vicinity of the repository under undisturbed conditions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformation are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure is potentially the most important due to its influence on spallings and direct brine releases, with the uncertainty in its value being dominated by the extent to which the microbial degradation of cellulose takes place, the rate at which the corrosion of steel takes place, and the amount of brine that drains from the surrounding disturbed rock zone into the repository.
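
The Latin hypercube sampling underlying these analyses can be sketched as follows; this is an illustrative implementation on the unit cube, not the WIPP assessment code:

```python
import random

def latin_hypercube(n, k, seed=0):
    """n samples of k variables on [0, 1): one draw from each of n
    equal-probability strata per dimension, with the strata randomly
    paired across dimensions."""
    rng = random.Random(seed)
    samples = [[0.0] * k for _ in range(n)]
    for j in range(k):
        strata = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(strata)                 # break correlation between columns
        for i in range(n):
            samples[i][j] = strata[i]
    return samples

X = latin_hypercube(n=10, k=3)
# Each column contains exactly one value in each interval [i/10, (i+1)/10).
```

In practice each unit-interval column is then mapped through the inverse CDF of the corresponding input's distribution before the model runs.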

  3. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis:version 4.0 reference manual

    SciTech Connect

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane; Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Guinta, Anthony A.; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  4. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's reference manual.

    SciTech Connect

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a reference manual for the commands specification for the DAKOTA software, providing input overviews, option descriptions, and example specifications.

  5. Sensitivity analysis of the GEMS soil organic carbon model to land cover land use classification uncertainties under different climate scenarios in Senegal

    USGS Publications Warehouse

    Dieye, A.M.; Roy, D.P.; Hanan, N.P.; Liu, S.; Hansen, M.; Toure, A.

    2011-01-01

    Spatially explicit land cover land use (LCLU) change information is needed to drive biogeochemical models that simulate soil organic carbon (SOC) dynamics. Such information is increasingly being mapped using remotely sensed satellite data with classification schemes and uncertainties constrained by the sensing system, classification algorithms and land cover schemes. In this study, automated LCLU classification of multi-temporal Landsat satellite data was used to assess the sensitivity of SOC modeled by the Global Ensemble Biogeochemical Modeling System (GEMS). The GEMS was run for an area of 1560 km2 in Senegal under three climate change scenarios with LCLU maps generated using different Landsat classification approaches. This research provides a method to estimate the variability of SOC, specifically the SOC uncertainty due to satellite classification errors, which we show is dependent not only on the LCLU classification errors but also on where the LCLU classes occur relative to the other GEMS model inputs. © 2011 Author(s).

  6. Uncertainty and sensitivity analysis within the post closure Performance and Safety Assessment of the French deep geological radwaste disposal: methodology, tool and examples of results

    NASA Astrophysics Data System (ADS)

    Pepin, G.

    2009-04-01

    quantify the dispersion of results (time, maximum...); this part deals with uncertainty analysis, and (ii) to identify relevant models and input data whose uncertainty drives the uncertainty of the results; this part deals with sensitivity analysis. First, this paper describes Andra's methodology and the numerical tool used. Then it presents results of a Monte-Carlo probabilistic multi-parametric study on HLW (vitrified waste) disposal, in order to study propagation of uncertainties of input data (Callovo-Oxfordian, EDZ (Excavated Damaged Zone), and Engineering components) on various radionuclide pathways within the disposal. The methodology consists of (i) setting up probabilistic distribution functions (pdf), according to the level of knowledge, (ii) sampling all pdf with Latin Hypercube Sampling methods, (iii) ensuring physical coherence in sets of input data, using correlations and constraints, (iv) using an integrated computing tool (Alliances platform) to perform calculations. Results focus on: - uncertainty analysis: the multi-parametric study shows (i) that transfer through undisturbed argillites remains the main pathway, (ii) a large dispersion (several orders of magnitude) of molar rate at the top of the clay layer for the two pathways (undisturbed argillites, and repository structures), which includes the reference point of the altered scenario, such as the seal failure one, and which is close to the worst case one. - sensitivity analysis: for the undisturbed argillites pathway, calculations highlight that uncertainty on some input data, such as adsorption of Iodine, solubility limit of Selenium, and diffusion and vertical permeability of undisturbed argillites, drives the dispersion of the results. For the repository structures pathway, uncertainty on hydraulic properties, such as the permeabilities of the EDZ, is relevant. This study is important to identify the parameters whose knowledge has to be increased in order to reduce the dispersion (uncertainty) of each performance assessment indicator. 
Lessons learnt lead

  7. Sensitivity of model assessments of high-speed civil transport effects on stratospheric ozone resulting from uncertainties in the NO x production from lightning

    NASA Astrophysics Data System (ADS)

    Smyshlyaev, Sergei P.; Geller, Marvin A.; Yudin, Valery A.

    1999-11-01

    Lightning NOx production is one of the most important and most uncertain sources of reactive nitrogen in the atmosphere. To examine the role of NOx lightning production uncertainties in supersonic aircraft assessment studies, we have done a number of numerical calculations with the State University of New York at Stony Brook-Russian State Hydrometeorological Institute of Saint-Petersburg two-dimensional model. The amount of nitrogen oxides produced by lightning discharges was varied within its quoted uncertainty from 2 to 12 Tg N/yr. Different latitudinal, altitudinal, and seasonal distributions of lightning NOx production were considered. Results of these model calculations show that the assessment of supersonic aircraft impacts on the ozone layer is very sensitive to the strength of NOx production from lightning. The high-speed civil transport produced NOx leads to positive column ozone changes for lightning NOx production less than 4 Tg N/yr, and to total ozone decrease for lightning NOx production more than 5 Tg N/yr for the same NOx emission scenario. For large lightning production the ozone response is mostly decreasing with increasing emission index, while for low lightning production the ozone response is mostly increasing with increasing emission index. Uncertainties in the global lightning NOx production strength may lead to uncertainties in column ozone up to 4%. The uncertainties due to neglecting the seasonal variations of the lightning NOx production and its simplified latitude distribution are about 2 times less (1.5-2%). The type of altitude distribution for the lightning NOx production does not significantly impact the column ozone, but is very important for the assessment studies of aircraft perturbations of atmospheric ozone. Increased global lightning NOx production causes increased total ozone, but for assessment of the column ozone response to supersonic aircraft emissions, the increase of lightning NOx production leads to column ozone

  8. The Inter-Annual Variability Analysis of Carbon Exchange in Low Arctic Fen Uncovers The Climate Sensitivity And The Uncertainties Around Net Ecosystem Exchange Partitioning

    NASA Astrophysics Data System (ADS)

    Blanco, E. L.; Lund, M.; Williams, M. D.; Christensen, T. R.; Tamstorf, M. P.

    2015-12-01

    An improvement in our process-based understanding of CO2 exchanges in the Arctic, and their climate sensitivity, is critical for examining the role of tundra ecosystems in changing climates. Arctic organic carbon storage has seen increased attention in recent years due to large potential for carbon releases following thaw. Our knowledge about the exact scale and sensitivity for a phase-change of these C stocks is, however, limited. Minor variations in Gross Primary Production (GPP) and Ecosystem Respiration (Reco) driven by changes in the climate can lead to either C sink or C source states, which likely will impact the overall C cycle of the ecosystem. Eddy covariance data are usually used to partition Net Ecosystem Exchange (NEE) into GPP and Reco by means of flux separation algorithms. However, different partitioning approaches lead to different estimates, as well as undefined uncertainties. The main objectives of this study are to use model-data fusion approaches to (1) determine the inter-annual variability in C source/sink strength for an Arctic fen, and attribute such variations to GPP vs Reco, (2) investigate the climate sensitivity of these processes and (3) explore the uncertainties in NEE partitioning. The intention is to elaborate on the information gathered in an existing catchment area under an extensive cross-disciplinary ecological monitoring program in low Arctic West Greenland, established under the auspices of the Greenland Ecosystem Monitoring (GEM) program. The use of such a thorough long-term (7 years) dataset applied to the exploration of inter-annual variability of carbon exchange, related driving factors and NEE partition uncertainties provides a novel input into our understanding about land-atmosphere CO2 exchange.

  9. PROCEEDINGS OF THE INTERNATIONAL WORKSHOP ON UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION FOR MULTIMEDIA ENVIRONMENTAL MODELING. EPA/600/R-04/117, NUREG/CP-0187, ERDC SR-04-2.

    EPA Science Inventory

    An International Workshop on Uncertainty, Sensitivity, and Parameter Estimation for Multimedia Environmental Modeling was held August 19-21, 2003, at the U.S. Nuclear Regulatory Commission Headquarters in Rockville, Maryland, USA. The workshop was organized and convened by the Fe...

  10. Direct Aerosol Forcing Uncertainty

    DOE Data Explorer

    Mccomiskey, Allison

    2008-01-15

    Understanding sources of uncertainty in aerosol direct radiative forcing (DRF), the difference in a given radiative flux component with and without aerosol, is essential to quantifying changes in Earth's radiation budget. We examine the uncertainty in DRF due to measurement uncertainty in the quantities on which it depends: aerosol optical depth, single scattering albedo, asymmetry parameter, solar geometry, and surface albedo. Direct radiative forcing at the top of the atmosphere and at the surface as well as sensitivities, the changes in DRF in response to unit changes in individual aerosol or surface properties, are calculated at three locations representing distinct aerosol types and radiative environments. The uncertainty in DRF associated with a given property is computed as the product of the sensitivity and typical measurement uncertainty in the respective aerosol or surface property. Sensitivity and uncertainty values permit estimation of total uncertainty in calculated DRF and identification of properties that most limit accuracy in estimating forcing. Total uncertainties in modeled local diurnally averaged forcing range from 0.2 to 1.3 W m-2 (42 to 20%) depending on location (from tropical to polar sites), solar zenith angle, surface reflectance, aerosol type, and aerosol optical depth. The largest contributor to total uncertainty in DRF is usually single scattering albedo; however, decreasing measurement uncertainties for any property would increase accuracy in DRF. Comparison of two radiative transfer models suggests the contribution of modeling error is small compared to the total uncertainty although comparable to uncertainty arising from some individual properties.
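
The bookkeeping described above (per-property uncertainty as sensitivity times measurement uncertainty, with contributions combined in quadrature) can be sketched with placeholder numbers; the sensitivities and uncertainties below are illustrative, not the paper's values:

```python
# Per-property DRF uncertainty = sensitivity x measurement uncertainty;
# contributions combine in quadrature to give the total. All numbers
# here are placeholders for illustration only.
import math

# property: (sensitivity, in W m^-2 per unit of the property,
#            typical measurement uncertainty of that property)
budget = {
    "aerosol_optical_depth": (20.0, 0.01),
    "single_scattering_albedo": (25.0, 0.03),
    "asymmetry_parameter": (10.0, 0.02),
    "surface_albedo": (5.0, 0.02),
}

contributions = {k: s * u for k, (s, u) in budget.items()}   # W m^-2 each
total = math.sqrt(sum(c ** 2 for c in contributions.values()))
dominant = max(contributions, key=contributions.get)
```

With these placeholder inputs the dominant term is the single scattering albedo, mirroring the qualitative conclusion of the abstract.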

  11. Forest floor vegetation response to nitrogen deposition in Europe.

    PubMed

    Dirnböck, Thomas; Grandin, Ulf; Bernhardt-Römermann, Markus; Beudert, Burkhardt; Canullo, Roberto; Forsius, Martin; Grabner, Maria-Theresia; Holmberg, Maria; Kleemola, Sirpa; Lundin, Lars; Mirtl, Michael; Neumann, Markus; Pompei, Enrico; Salemaa, Maija; Starlinger, Franz; Staszewski, Tomasz; Uziębło, Aldona Katarzyna

    2014-02-01

    Chronic nitrogen (N) deposition is a threat to biodiversity that results from the eutrophication of ecosystems. We studied long-term monitoring data from 28 forest sites with a total of 1,335 permanent forest floor vegetation plots from northern Fennoscandia to southern Italy to analyse temporal trends in vascular plant species cover and diversity. We found that the cover of plant species which prefer nutrient-poor soils (oligotrophic species) decreased the more the measured N deposition exceeded the empirical critical load (CL) for eutrophication effects (P = 0.002). Although species preferring nutrient-rich sites (eutrophic species) did not experience a significant increase in cover (P = 0.440), in comparison to oligotrophic species they had a marginally higher proportion among newly occurring species (P = 0.091). The observed gradual replacement of oligotrophic species by eutrophic species as a response to N deposition seems to be a general pattern, as it was consistent on the European scale. Contrary to species cover changes, neither the decrease in species richness nor that of homogeneity correlated with nitrogen CL exceedance (ExCLemp N). We assume that the lack of diversity changes resulted from the restricted time period of our observations. Although existing habitat-specific empirical CL still hold some uncertainty, we exemplify that they are useful indicators for the sensitivity of forest floor vegetation to N deposition.

  12. Volcano deformation source parameters estimated from InSAR and FEM-based nonlinear inverse methods: Sensitivities to uncertainties in seismic tomography

    NASA Astrophysics Data System (ADS)

    Masterlark, T.; Donovan, T. C.; Feigl, K. L.; Haney, M. M.; Thurber, C. H.

    2013-12-01

    Forward models of volcano deformation, due to a pressurized magma chamber embedded in an elastic domain, can predict observed surface deformation. Inverse models of surface deformation allow us to estimate characteristic parameters that describe the deformation source, such as the position and strength of a pressurized magma chamber embedded in an elastic domain. However, the specific distribution of material properties controls how the pressurization translates to surface deformation in a forward model, or alternatively, how observed surface deformation translates to source parameters in an inverse model. Seismic tomography models can describe the specific distributions of material properties that are necessary for accurate forward and inverse models of volcano deformation. The aim of this project is to investigate how uncertainties in seismic tomography models propagate into variations in the estimates of volcano deformation source parameters inverted from geodetic data. To do so, we combine FEM-based nonlinear inverse analyses of InSAR data for Okmok volcano, Alaska, as an example to estimate sensitivities of source parameters to uncertainties in seismic tomography. More specifically, we use Monte Carlo methods to construct an assembly of FEMs that simulate a pressurized magma chamber in the domain of Okmok. Each FEM simulates a realization of source parameters (three-component magma chamber position), a material property distribution that samples the seismic tomography model with a normal velocity perturbation of +/-10%, and a corresponding linear pressure estimate calculated using the Pinned Mesh Perturbation method. We then analyze the a posteriori results to quantify sensitivities of source parameter estimates to the seismic tomography uncertainties. Preliminary results suggest that uncertainties in the seismic tomography do not significantly influence the estimated source parameters at a 95% confidence level. 
The presence of heterogeneous material properties

  13. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    PubMed

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed in the frame of the FP7 EU project 4FUN in order to provide an integrated assessment tool for state-of-the-art exposure assessment for environment, biota and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media, and of physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in human body. MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches. The availability of such tools for uncertainty and sensitivity analysis aimed to facilitate the incorporation of such issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), as well as wildlife biota (primary producers in rivers, invertebrates, fish) and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support an accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase the confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the

  15. Cooling Floor AC Systems

    NASA Astrophysics Data System (ADS)

    Jun, Lu; Hao, Ding; Hong, Zhang; Ce, Gao Dian

    The HVAC equipment currently used in residential buildings in the hot-summer/cold-winter climate region still consumes energy at a high level, so a high-efficiency HVAC system is urgently needed to achieve the government's preset energy-saving goal. With its advantages of hygiene, high comfort and a uniform temperature field, the hot-water floor radiant heating system has been widely accepted. This paper puts forward a new approach to air conditioning that combines a fresh-air supply unit with such a floor radiant system for dehumidification and cooling in summer and heating in winter. By analyzing its advantages and limitations, we found that this so-called Cooling/Heating Floor AC System can improve the IAQ of residential buildings while maintaining high efficiency. We also recommend a methodology for HVAC system design that ensures reduced energy costs for users.

  16. Using Real-time Event Tracking Sensitivity Analysis to Overcome Sensor Measurement Uncertainties of Geo-Information Management in Drilling Disasters

    NASA Astrophysics Data System (ADS)

    Tavakoli, S.; Poslad, S.; Fruhwirth, R.; Winter, M.

    2012-04-01

    This paper introduces an application of a novel EventTracker platform for instantaneous Sensitivity Analysis (SA) of large-scale real-time geo-information. Earth disaster management systems demand high-quality information to aid a quick and timely response to their evolving environments. The idea behind the proposed EventTracker platform is the assumption that modern information management systems are able to capture data in real time and have the technological flexibility to adjust their services to work with specific sources of data/information. However, to assure this adaptation in real time, the online data should be collected, interpreted, and translated into corrective actions in a concise and timely manner. This can hardly be handled by existing sensitivity analysis methods, because they rely on historical data and lazy processing algorithms. In event-driven systems, the effect of system inputs on system state is of value, as events could cause this state to change. This 'event triggering' situation underpins the logic of the proposed approach. The event-tracking sensitivity analysis method describes the system variables and states as a collection of events. The higher the occurrence of an input variable during the triggering of an event, the greater its potential impact will be on the final analysis of the system state. Experiments were designed to compare the proposed event-tracking sensitivity analysis with existing entropy-based sensitivity analysis methods. The results have shown a 10% improvement in computational efficiency with no compromise in accuracy. They have also shown that the computational time required to perform the sensitivity analysis is 0.5% of that required by the entropy-based method. The proposed method has been applied to real-world data in the context of preventing emerging crises at drilling rigs.
One of the major purposes of such rigs is to drill boreholes to explore oil or gas reservoirs with the final scope of recovering the content
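    The scoring idea described above (input events that co-occur with a triggered system event weigh more) can be sketched as a toy routine. The function name, the time-window rule, and the example data are hypothetical illustrations, not the authors' implementation:

```python
from collections import Counter

def event_tracking_sensitivity(input_events, state_events, window=1.0):
    """Toy event-tracking SA: score each input variable by how often one of
    its events falls inside a time window preceding a system-state event."""
    scores = Counter()
    for t_state in state_events:
        for name, t_in in input_events:
            if 0.0 <= t_state - t_in <= window:
                scores[name] += 1
    total = sum(scores.values()) or 1
    return {k: v / total for k, v in scores.items()}

# (variable, timestamp) input events and system-state event timestamps
inputs = [("pressure", 0.2), ("torque", 0.9), ("pressure", 1.8), ("flow", 2.4)]
states = [1.0, 2.5]
result = event_tracking_sensitivity(inputs, states)
print(result)  # "pressure" triggers twice, so it scores highest
```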

  17. A comparison of three adsorption equations and sensitivity study of parameter uncertainty effects on adsorption refrigeration thermal performance estimation

    NASA Astrophysics Data System (ADS)

    Zhao, Yongling; Hu, Eric; Blazewicz, Antoni

    2012-02-01

    This paper presents isosteric-based adsorption equilibrium tests of three activated carbon samples with methanol as the adsorbate. The experimental data were fitted to the Langmuir, Freundlich and Dubinin-Astakhov (D-A) equations, respectively, and the fitted adsorption equations were compared in terms of agreement with the experimental data. Moreover, the impact of equation form on the calculated coefficient of performance (COP) and refrigeration capacity of an adsorption refrigeration system was analyzed. In addition, the sensitivity of each parameter in each adsorption equation to the estimated COP and refrigeration capacity of the cycle was investigated. It was found that the D-A equation is the best form for representing the adsorptive properties of a carbon-methanol working pair. The D-A equation is recommended for estimating the thermal performance of an adsorption refrigeration system because simulation results obtained using it are less sensitive to errors in the experimentally determined equation parameters.
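    Fitting the D-A relation, uptake x = x0·exp[-(A/E)^n] with adsorption potential A = RT·ln(Ps/P), to equilibrium data can be sketched as below. The data are synthetic and the parameter values are illustrative, not those of the carbon-methanol pairs in the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def dubinin_astakhov(A, x0, E, n):
    """Equilibrium uptake x (kg/kg) vs adsorption potential A (J/mol)."""
    return x0 * np.exp(-(A / E) ** n)

# Synthetic "measured" data generated from assumed parameters (illustrative)
A_data = np.linspace(500.0, 8000.0, 20)                 # J/mol
x_true = dubinin_astakhov(A_data, 0.3, 3000.0, 1.5)
rng = np.random.default_rng(0)
x_data = x_true + rng.normal(0.0, 0.002, A_data.size)   # measurement noise

popt, pcov = curve_fit(dubinin_astakhov, A_data, x_data, p0=(0.2, 2000.0, 1.0))
perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
print(popt, perr)
```

    The covariance matrix from the fit is what makes the parameter-uncertainty sensitivity study possible: perturbing each fitted parameter by its standard error and re-running the cycle model shows how strongly the performance estimate depends on fitting error.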

  18. Sensitivity analysis of seismic hazard for Western Liguria (North Western Italy): A first attempt towards the understanding and quantification of hazard uncertainty

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Spallarossa, Daniele; Bazzurro, Paolo; Eva, Claudio

    2007-05-01

    The use of logic trees in probabilistic seismic hazard analyses often involves a large number of branches that reflect the uncertainty in the selection of different models and in the selection of the parameter values of each model. The sensitivity analysis, as proposed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817], is an efficient tool that allows the construction of logic trees focusing attention on the parameters that have greater impact on the hazard. In this paper the sensitivity analysis is performed in order to identify the parameters that have the largest influence on the Western Liguria (North Western Italy) seismic hazard. The analysis is conducted for six strategic sites following the multi-parameter approach developed by Rabinowitz and Steinberg [Rabinowitz, N., Steinberg, D.M., 1991. Seismic hazard sensitivity analysis: a multi-parameter approach. Bull. Seismol. Soc. Am. 81, 796-817] and accounts for both mean hazard values and hazard values corresponding to different percentiles (e.g., 16%-ile and 84%-ile). The results are assessed in terms of the expected PGA with a 10% probability of exceedance in 50 years for rock conditions and account for both the contribution from specific source zones using the Cornell approach [Cornell, C.A., 1968. Engineering seismic risk analysis. Bull. Seismol. Soc. Am. 58, 1583-1606] and the spatially smoothed seismicity [Frankel, A., 1995. Mapping seismic hazard in the Central and Eastern United States. Seismol. Res. Lett. 66, 8-21]. The influence of different procedures for calculating seismic hazard, seismic catalogues (epicentral parameters), source zone models, frequency-magnitude parameters, maximum earthquake magnitude values and attenuation relationships is considered. As a result, the sensitivity analysis allows us to identify the parameters with higher influence on the hazard. Only these

  19. 5. Interior, second floor. Pressed metal ceiling, and wooden floors ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Interior, second floor. Pressed metal ceiling, and wooden floors visible. Overhead light source toward rear of building indicates location of skylight. - 25-27 East Hanover Street (Commercial Building), 25-27 East Hanover Street, Trenton, Mercer County, NJ

  20. Two and Three Bedroom Units: First Floor Plan, Second Floor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Two and Three Bedroom Units: First Floor Plan, Second Floor Plan, South Elevation (As Built), North Elevation (As Built), East Elevation (As Built), East Elevation (Existing), North Elevation (Existing) - Aluminum City Terrace, East Hill Drive, New Kensington, Westmoreland County, PA

  1. STIRLING'S QUARTERS SMALL BARN: FIRST FLOOR PLAN; SECOND FLOOR PLAN; ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    STIRLING'S QUARTERS SMALL BARN: FIRST FLOOR PLAN; SECOND FLOOR PLAN; SOUTH ELEVATION; EAST ELEVATION; NORTH ELEVATION; WEST ELEVATION. - Stirling's Quarters, 555 Yellow Springs Road, Tredyffrin Township, Valley Forge, Chester County, PA

  2. 16. STATIC TEST TOWER REMOVABLE FLOOR LEVEL VIEW OF FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. STATIC TEST TOWER REMOVABLE FLOOR LEVEL VIEW OF FLOOR THAT FOLDS BACK TO ALLOW ROCKET PLACEMENT. - Marshall Space Flight Center, Saturn Propulsion & Structural Test Facility, East Test Area, Huntsville, Madison County, AL

  3. Sensitivity and uncertainty in the measurement of H*(10) in neutron fields using an REM500 and a multi-element TEPC.

    PubMed

    Waker, Anthony; Taylor, Graeme

    2014-10-01

    The REM500 is a commercial instrument based on a tissue-equivalent proportional counter (TEPC) that has been successfully deployed as a hand-held neutron monitor, although its sensitivity is regarded by some workers as low for nuclear power plant radiation protection work. Improvements in sensitivity can be obtained using a multi-element proportional counter design in which a large number of small detecting cavities replace the single large-volume cavity of conventional TEPCs. In this work, the authors quantify the improvement in uncertainty that can be obtained by comparing the ambient dose equivalent measured with a REM500, which utilises a 5.72 cm (2¼ inch) diameter Rossi counter, with that of a multi-element TEPC designed to have the sensitivity of a 12.7 cm (5 inch) spherical TEPC. The results obtained also provide some insight into the influence of other design features of TEPCs, such as geometry and gas filling, on the measurement of ambient dose equivalent.

  4. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 developers manual.

    SciTech Connect

    Griffin, Joshua D. (Sandia National Laboratories, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA); Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J.; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  5. Uncertainty quantification in MD simulations of concentration driven ionic flow through a silica nanopore. I. Sensitivity to physical parameters of the pore.

    PubMed

    Rizzi, F; Jones, R E; Debusschere, B J; Knio, O M

    2013-05-21

    In this article, uncertainty quantification is applied to molecular dynamics (MD) simulations of concentration driven ionic flow through a silica nanopore. We consider a silica pore model connecting two reservoirs containing a solution of sodium (Na(+)) and chloride (Cl(-)) ions in water. An ad hoc concentration control algorithm is developed to simulate a concentration driven counter flow of ions through the pore, with the ionic flux being the main observable extracted from the MD system. We explore the sensitivity of the system to two physical parameters of the pore, namely, the pore diameter and the gating charge. First we conduct a quantitative analysis of the impact of the pore diameter on the ionic flux, and interpret the results in terms of the interplay between size effects and ion mobility. Second, we analyze the effect of gating charge by treating the charge density over the pore surface as an uncertain parameter in a forward propagation study. Polynomial chaos expansions and Bayesian inference are exploited to isolate the effect of intrinsic noise and quantify the impact of parametric uncertainty on the MD predictions. We highlight the challenges arising from the heterogeneous nature of the system, given the several components involved, and from the substantial effect of the intrinsic thermal noise.
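    Forward propagation with polynomial chaos, as applied above to the gating-charge uncertainty, can be illustrated in one dimension by non-intrusive projection onto probabilists' Hermite polynomials. The quadratic "model" below is a hypothetical stand-in for the expensive MD observable, and the Gaussian input is an assumed uncertainty:

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial

def model(q):
    # Hypothetical smooth surrogate for ionic flux vs gating charge
    return 1.0 + 0.5 * q + 0.1 * q ** 2

mu, sigma = 0.0, 1.0                   # assumed N(mu, sigma^2) input
order = 4
nodes, weights = hermegauss(order + 1)          # weight exp(-x^2/2)
weights = weights / np.sqrt(2 * np.pi)          # normalize to N(0,1) pdf

# Spectral coefficients c_k = E[f(q) He_k(xi)] / k!  via Gauss quadrature
coeffs = []
for k in range(order + 1):
    basis = np.zeros(k + 1)
    basis[k] = 1.0
    ck = np.sum(weights * model(mu + sigma * nodes) * hermeval(nodes, basis))
    coeffs.append(ck / factorial(k))

mean = coeffs[0]                                        # PCE mean
var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(mean, var)   # -> 1.1, 0.27 for this quadratic model
```

    Because the surrogate is cheap, moments and sensitivities follow directly from the spectral coefficients instead of repeated MD runs, which is the point of the approach.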

  6. Nonlinear sensitivity and uncertainty analysis in support of the blowdown heat transfer program. [Test 177 at Thermal-Hydraulic Test Facility

    SciTech Connect

    Ronen, Y.; Bjerke, M.A.; Cacuci, D.G.; Barhen, J.

    1980-11-01

    A nonlinear uncertainty analysis methodology based on the use of first- and second-order sensitivity coefficients is presented. As a practical demonstration, an uncertainty analysis of several responses of interest is performed for Test 177, which is part of a series of tests conducted at the Thermal-Hydraulic Test Facility (THTF) of the ORNL Engineering Technology Division Pressurized Water Reactor-Blowdown Heat Transfer (PWR-BDHT) program. These space- and time-dependent responses are: mass flow rate, temperature, pressure, density, enthalpy, and water quality, in several volumetric regions of the experimental facility. The analysis shows that, over parts of the transient, the responses behave as linear functions of the input parameters; in these cases, their standard deviations are of the same order of magnitude as those of the input parameters. Otherwise, the responses exhibit nonlinearities and their standard deviations are considerably larger. The analysis also shows that the degree of nonlinearity of the responses is highly dependent on their volumetric locations.

  7. DAKOTA, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: version 4.0 user's manual.

    SciTech Connect

    Griffin, Joshua D. (Sandia National Labs, Livermore, CA); Eldred, Michael Scott; Martinez-Canales, Monica L.; Watson, Jean-Paul; Kolda, Tamara Gibson; Giunta, Anthony Andrew; Adams, Brian M.; Swiler, Laura Painton; Williams, Pamela J. (Sandia National Labs, Livermore, CA); Hough, Patricia Diane (Sandia National Labs, Livermore, CA); Gay, David M.; Dunlavy, Daniel M.; Eddy, John P.; Hart, William Eugene; Brown, Shannon L.

    2006-10-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  8. Uncertainty quantification in MD simulations of concentration driven ionic flow through a silica nanopore. I. Sensitivity to physical parameters of the pore

    NASA Astrophysics Data System (ADS)

    Rizzi, F.; Jones, R. E.; Debusschere, B. J.; Knio, O. M.

    2013-05-01

    In this article, uncertainty quantification is applied to molecular dynamics (MD) simulations of concentration driven ionic flow through a silica nanopore. We consider a silica pore model connecting two reservoirs containing a solution of sodium (Na+) and chloride (Cl-) ions in water. An ad hoc concentration control algorithm is developed to simulate a concentration driven counter flow of ions through the pore, with the ionic flux being the main observable extracted from the MD system. We explore the sensitivity of the system to two physical parameters of the pore, namely, the pore diameter and the gating charge. First we conduct a quantitative analysis of the impact of the pore diameter on the ionic flux, and interpret the results in terms of the interplay between size effects and ion mobility. Second, we analyze the effect of gating charge by treating the charge density over the pore surface as an uncertain parameter in a forward propagation study. Polynomial chaos expansions and Bayesian inference are exploited to isolate the effect of intrinsic noise and quantify the impact of parametric uncertainty on the MD predictions. We highlight the challenges arising from the heterogeneous nature of the system, given the several components involved, and from the substantial effect of the intrinsic thermal noise.

  9. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, developers manual.

    SciTech Connect

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  10. DAKOTA : a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis. Version 5.0, user's manual.

    SciTech Connect

    Eldred, Michael Scott; Dalbey, Keith R.; Bohnhoff, William J.; Adams, Brian M.; Swiler, Laura Painton; Hough, Patricia Diane; Gay, David M.; Eddy, John P.; Haskell, Karen H.

    2010-05-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the DAKOTA software and provides capability overviews and procedures for software execution, as well as a variety of example studies.

  11. Uncertainty analysis

    SciTech Connect

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software.
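    For the linear case Ax = b noted above, the adjoint shortcut can be shown in a few lines: for a scalar response R = c^T x, a single solve of the adjoint system A^T lambda = c yields all sensitivities dR/db_i = lambda_i at once, instead of one forward solve per perturbed parameter. A minimal numerical sketch with invented matrices:

```python
import numpy as np

A = np.array([[4.0, 1.0], [2.0, 3.0]])
b = np.array([1.0, 2.0])
c = np.array([1.0, 1.0])          # response R = c.T @ x

x = np.linalg.solve(A, b)         # one primal solve
R = c @ x

lam = np.linalg.solve(A.T, c)     # one adjoint solve
dR_db = lam                       # dR/db_i = lambda_i, all at once

# finite-difference check, one perturbed solve per parameter
eps = 1e-6
fd = np.array([(c @ np.linalg.solve(A, b + eps * e) - R) / eps
               for e in np.eye(2)])
print(dR_db, fd)                  # the two agree
```

    The cost contrast is the point the abstract makes: the adjoint route needs one extra linear solve regardless of the number of input parameters, whereas sampling or finite differences scale with the parameter count.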

  12. Uncertainty and sensitivity analysis for two-phase flow in the vicinity of the repository in the 1996 performance assessment for the Waste Isolation Pilot Plant: Disturbed conditions

    SciTech Connect

    HELTON,JON CRAIG; BEAN,J.E.; ECONOMY,K.; GARNER,J.W.; MACKINNON,ROBERT J.; MILLER,JOEL D.; SCHREIBER,J.D.; VAUGHN,PALMER

    2000-05-22

    Uncertainty and sensitivity analysis results obtained in the 1996 performance assessment (PA) for the Waste Isolation Pilot Plant (WIPP) are presented for two-phase flow in the vicinity of the repository under disturbed conditions resulting from drilling intrusions. Techniques based on Latin hypercube sampling, examination of scatterplots, stepwise regression analysis, partial correlation analysis and rank transformations are used to investigate brine inflow, gas generation, repository pressure, brine saturation, and brine and gas outflow. Of the variables under study, repository pressure and brine flow from the repository to the Culebra Dolomite are potentially the most important in PA for the WIPP. Subsequent to a drilling intrusion, repository pressure was dominated by borehole permeability and generally below the level (i.e., 8 MPa) that could potentially produce spallings and direct brine releases. Brine flow from the repository to the Culebra Dolomite tended to be small or nonexistent, with its occurrence and size also dominated by borehole permeability.
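    The sampling-and-regression workflow described (Latin hypercube sampling, rank transformation, regression on the ranks) can be sketched as below. The response function and parameter ranges are invented stand-ins, not WIPP data; the first input plays the role of a dominant "borehole permeability" variable:

```python
import numpy as np
from scipy.stats import qmc, rankdata

rng = np.random.default_rng(1)
sampler = qmc.LatinHypercube(d=3, seed=1)
u = sampler.random(n=200)                              # LHS in [0,1)^3

# Hypothetical response: dominated by x0 (permeability analogue),
# weakly affected by x1, with x2 inactive.
x = qmc.scale(u, [1e-14, 0.0, 0.1], [1e-11, 1.0, 2.0])
y = np.log(x[:, 0]) + 0.2 * x[:, 1] + rng.normal(0, 0.05, 200)

# Rank-transform inputs and output, then regress: standardized rank
# regression coefficients (SRRCs) rank the inputs by importance.
xr = np.column_stack([rankdata(x[:, j]) for j in range(3)])
yr = rankdata(y)
Xs = (xr - xr.mean(0)) / xr.std(0)
ys = (yr - yr.mean()) / yr.std()
srrc, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
print(srrc)   # first coefficient dominates
```

    The rank transformation is what lets the regression pick up the monotone but strongly nonlinear (logarithmic) dependence on the permeability-like input.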

  13. CAST FLOOR WITH VIEW OF TORPEDO LADLE (BENEATH CAST FLOOR) ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CAST FLOOR WITH VIEW OF TORPEDO LADLE (BENEATH CAST FLOOR) AND KEEPERS OF THE CAST HOUSE FLOOR, S.L. KIMBROUGH AND DAVID HOLMES. - U.S. Steel, Fairfield Works, Blast Furnace No. 8, North of Valley Road, West of Ensley-Pleasant Grove Road, Fairfield, Jefferson County, AL

  14. Assessing the natural attenuation of organic contaminants in aquifers using plume-scale electron and carbon balances: model development with analysis of uncertainty and parameter sensitivity.

    PubMed

    Thornton, S F; Lerner, D N; Banwart, S A

    2001-12-15

    A quantitative methodology is described for the field-scale performance assessment of natural attenuation using plume-scale electron and carbon balances. This provides a practical framework for the calculation of global mass balances for contaminant plumes, using mass inputs from the plume source, background groundwater and plume residuals in a simplified box model. Biodegradation processes and reactions included in the analysis are identified from electron acceptors, electron donors and degradation products present in these inputs. Parameter values used in the model are obtained from data acquired during typical site investigation and groundwater monitoring studies for natural attenuation schemes. The approach is evaluated for a UK Permo-Triassic Sandstone aquifer contaminated with a plume of phenolic compounds. Uncertainty in the model predictions and sensitivity to parameter values was assessed by probabilistic modelling using Monte Carlo methods. Sensitivity analyses were compared for different input parameter probability distributions and a base case using fixed parameter values, using an identical conceptual model and data set. Results show that consumption of oxidants by biodegradation is approximately balanced by the production of CH4 and total dissolved inorganic carbon (TDIC) which is conserved in the plume. Under this condition, either the plume electron or carbon balance can be used to determine contaminant mass loss, which is equivalent to only 4% of the estimated source term. This corresponds to a first order, plume-averaged, half-life of > 800 years. The electron balance is particularly sensitive to uncertainty in the source term and dispersive inputs. Reliable historical information on contaminant spillages and detailed site investigation are necessary to accurately characterise the source term. The dispersive influx is sensitive to variability in the plume mixing zone width. 
Consumption of aqueous oxidants greatly exceeds that of mineral oxidants

  15. RFID Data Cleaning for Shop Floor Applications

    NASA Astrophysics Data System (ADS)

    Ziekow, Holger; Ivantysynova, Lenka; Günter, Oliver

    In several case studies we found that shop-floor applications in manufacturing pose special challenges for cleaning RFID data. The underlying problem in many scenarios is the uncertainty about the exact location of observed RFID tags. Simple filters provided in common middleware solutions do not cope well with these challenges. We have therefore developed an approach based on maximum-likelihood estimation to infer a tag's location within the reader range. This enables improved RFID data cleaning in a number of application scenarios. We illustrate the benefits of our approach with exemplary application scenarios that we found in manufacturing. In simulations and experiments with real-world data we show that our approach outperforms existing solutions. Our approach can extend RFID middleware or reader firmware to improve the use of RFID in a range of shop-floor applications.
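    A maximum-likelihood location estimate of the kind described can be sketched with a toy detection model. The logistic read-probability curve, the antenna layout, and the grid search below are assumptions for illustration, not the authors' algorithm:

```python
import numpy as np

def p_detect(d, d50=2.0, k=2.0):
    """Assumed read probability vs tag-antenna distance (logistic falloff)."""
    return 1.0 / (1.0 + np.exp(k * (d - d50)))

antennas = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0], [4.0, 4.0]])
reads = np.array([1, 1, 0, 0])        # observed read/no-read per antenna

# Grid search for the tag position maximizing the Bernoulli log-likelihood
xs = np.linspace(0, 4, 81)
best, best_ll = None, -np.inf
for x in xs:
    for y in xs:
        d = np.linalg.norm(antennas - [x, y], axis=1)
        p = np.clip(p_detect(d), 1e-9, 1 - 1e-9)
        ll = np.sum(reads * np.log(p) + (1 - reads) * np.log(1 - p))
        if ll > best_ll:
            best, best_ll = (x, y), ll
print(best)   # lands between the two antennas that saw the tag
```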

  16. Use of high-order sensitivity analysis and reduced-form modeling to quantify uncertainty in particulate matter simulations in the presence of uncertain emissions rates: A case study in Houston

    NASA Astrophysics Data System (ADS)

    Zhang, Wenxian; Trail, Marcus A.; Hu, Yongtao; Nenes, Athanasios; Russell, Armistead G.

    2015-12-01

    Regional air quality models are widely used to evaluate control strategy effectiveness. As such, it is important to understand the accuracy of model simulations to establish confidence in model performance and to guide further model development. Particulate matter with aerodynamic diameter less than 2.5 μm (PM2.5) is regulated as one of the criteria pollutants by the National Ambient Air Quality Standards (NAAQS), and PM2.5 concentrations have a complex dependence on the emissions of a number of precursors, including SO2, NOx, NH3, VOCs, and primary particulate matter (PM). This study quantifies how the emission-associated uncertainties affect modeled PM2.5 concentrations and sensitivities using a reduced-form approach. This approach is computationally efficient compared to the traditional Monte Carlo simulation. The reduced-form model represents the concentration-emission response and is constructed using first- and second-order sensitivities obtained from a single CMAQ/HDDM-PM simulation. A case study is conducted in the Houston-Galveston-Brazoria (HGB) area. The uncertainty of modeled, daily average PM2.5 concentrations due to uncertain emissions is estimated to fall between 42% and 52% for different simulated concentration levels, and the uncertainty is evenly distributed in the modeling domain. Emission-associated uncertainty can account for much of the difference between simulation and ground measurements, as 60% of observed PM2.5 concentrations fall within the range of one standard deviation of corresponding simulated PM2.5 concentrations. Uncertainties in meteorological fields as well as the model representation of secondary organic aerosol formation are the other two key contributors to the uncertainty of modeled PM2.5. This study also investigates the uncertainties of the simulated first-order sensitivities, and finds that the larger the first-order sensitivity, the lower its uncertainty associated with emissions. Sensitivity of PM2.5 to primary PM has
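    The reduced-form idea, a second-order Taylor expansion of concentration in emission perturbations evaluated by cheap Monte Carlo sampling, can be sketched as below. All numbers (base concentration, sensitivities, 30% emission uncertainty) are illustrative placeholders, not CMAQ/HDDM output:

```python
import numpy as np

C0 = 12.0                          # base PM2.5 concentration, ug/m3
S1 = np.array([3.0, 1.5, 0.8])     # first-order sensitivities (3 precursors)
S2 = np.diag([0.4, 0.1, 0.05])     # second-order sensitivities (diagonal here)

# Reduced-form response: C(de) = C0 + S1.de + 0.5 de.S2.de, where de are
# fractional emission perturbations.
rng = np.random.default_rng(0)
deps = rng.normal(0.0, 0.3, size=(20000, 3))   # assumed ~30% uncertainty
samples = C0 + deps @ S1 + 0.5 * np.einsum('ni,ij,nj->n', deps, S2, deps)
print(samples.mean(), samples.std() / samples.mean())
```

    Each sample costs only a few multiplications, which is why the reduced form replaces a Monte Carlo ensemble of full chemical-transport runs.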

  17. Chronic pelvic floor dysfunction.

    PubMed

    Hartmann, Dee; Sarton, Julie

    2014-10-01

    The successful treatment of women with vestibulodynia and its associated chronic pelvic floor dysfunctions requires interventions that address a broad field of possible pain contributors. Pelvic floor muscle hypertonicity was implicated in the mid-1990s as a trigger of major chronic vulvar pain. Painful bladder syndrome, irritable bowel syndrome, fibromyalgia, and temporomandibular jaw disorder are known common comorbidities that can cause a host of associated muscular, visceral, bony, and fascial dysfunctions. It appears that normalizing all of those disorders plays a pivotal role in reducing complaints of chronic vulvar pain and sexual dysfunction. Though the studies have yet to prove a specific protocol, physical therapists trained in pelvic dysfunction are reporting success with restoring tissue normalcy and reducing vulvar and sexual pain. A review of pelvic anatomy and common findings are presented along with suggested physical therapy management.

  18. Polygons on Crater Floor

    NASA Technical Reports Server (NTRS)

    2003-01-01

    MGS MOC Release No. MOC2-357, 11 May 2003

    This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) picture shows a pattern of polygons on the floor of a northern plains impact crater. These landforms are common on crater floors at high latitudes on Mars. Similar polygons occur in the arctic and antarctic regions of Earth, where they indicate the presence and freeze-thaw cycling of ground ice. Whether the polygons on Mars also indicate water ice in the ground is uncertain. The image is located in a crater at 64.8°N, 292.7°W. Sunlight illuminates the scene from the lower left.

  19. Floor of Hellas Basin

    NASA Technical Reports Server (NTRS)

    2002-01-01

    [figure removed for brevity, see original site]

    With a diameter of roughly 2000 km and a depth of over 7 km, the Hellas Basin is the largest impact feature on Mars. Because of its great depth, there is significantly more atmosphere to peer through in order to see its floor, reducing the quality of the images taken from orbit. This THEMIS image straddles a scarp between the Hellas floor and an accumulation of material at least a half kilometer thick that covers much of the floor. The southern half of the image contains some of this material. Strange ovoid landforms are present here that give the appearance of flow. It is possible that water ice or even liquid water was present in the deposits and somehow responsible for the observed landscape. The floor of Hellas remains a poorly understood portion of the planet that should benefit from the analysis of new THEMIS data.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  20. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  1. Modular Flooring System

    NASA Technical Reports Server (NTRS)

    Thate, Robert

    2012-01-01

    The modular flooring system (MFS) was developed to provide a portable, modular, durable carpeting solution for NASA's Robotics Alliance Project's (RAP) outreach efforts. It was also designed to improve upon and replace a modular flooring system that was too heavy for safe use and transportation. The MFS was developed for use as the flooring for various robotics competitions that RAP utilizes to meet its mission goals. One of these competitions, the FIRST Robotics Competition (FRC), currently uses two massive rolls of broadloom carpet for the foundation of the arena in which the robots are contained during the competition. The area of the arena is approximately 30 by 72 ft (approximately 9 by 22 m). This carpet is very cumbersome and requires large-capacity vehicles, handling equipment, and personnel to transport and deploy. The broadloom carpet sustains severe abuse from the robots during a regular three-day competition, and as a result, the carpet is not used again for competition. Similarly, broadloom carpets used for trade shows at convention centers around the world are typically discarded after only one use. This innovation provides a green solution to this wasteful practice. Each of the flooring modules in the previous system weighed 44 lb (~20 kg). The improvements in the overall design of the system reduce the weight of each module by approximately 22 lb (~10 kg), or 50%, and utilize an improved "module-to-module" connection method that is superior to the previous system. The MFS comprises 4-by-4-ft (~1.2-by-1.2-m) carpet module assemblies that utilize commercially available carpet tiles bonded to a lightweight substrate. The substrate surface opposite the carpeted surface has a module-to-module connecting interface that allows the modules to be connected, one to the other, as the floor is assembled. This connection is hidden underneath the modules, creating a smooth, co-planar flooring surface. The modules are stacked and strapped

  2. Floor of Baldet Crater

    NASA Technical Reports Server (NTRS)

    2002-01-01

    (Released 13 June 2002) The Science This THEMIS visible image shows a remarkable array of dunes on the floor of a large impact crater named Baldet located near 22.8°N. Many of the dunes in this region are isolated features, with large, sand-free 'interdune' surfaces between the individual dunes. These isolated dunes typically occur in regions where there is a limited supply of sand. Any sand that is present moves rapidly across the interdune surfaces, which in many cases are hardened surfaces over which the sand can easily bounce, or 'saltate.' When this loose sand lands on a dune it cannot travel as quickly and is trapped within the dune. In some areas within this sand mass the dunes have grown together to form crescent dunes and dune ridges. The dunes in this image are likely active today, slowly migrating across the crater floor. THEMIS will re-image this and other dunes throughout the Mars Odyssey mission to search for any evidence of dune motion over time. Based on the asymmetrical shape of the dunes, the wind direction over much of the dune field appears to be from the right (west) or upper right (northwest). However, the topography of the crater floor apparently produces complex wind patterns within the dune field, as can be seen by the different orientations of the dunes. For example the dunes in the lower portion of the image appear to be somewhat symmetrical and aligned east-west, suggesting that the wind in this region blows from both the north (top) and south (bottom). The Story A fuzzy 'carpet' of sand dunes covers the floor of a large impact crater, which you can see almost in full in the context image to the right. While the dunes give this area a plush, tufted look, there actually isn't a lot of sand in this area. How can you tell? Large, sand-free spaces exist in between the dunes, and those usually occur when sand particles are sparse. You can see these 'interdune spaces' better if you click on the image for the more detailed view. 
The sand that

  3. Comparison of approaches for measuring the mass accommodation coefficient for the condensation of water and sensitivities to uncertainties in thermophysical properties.

    PubMed

    Miles, Rachael E H; Reid, Jonathan P; Riipinen, Ilona

    2012-11-01

    We compare and contrast measurements of the mass accommodation coefficient of water on a water surface made using ensemble and single particle techniques under conditions of supersaturation and subsaturation, respectively. In particular, we consider measurements made using an expansion chamber, a continuous flow streamwise thermal gradient cloud condensation nuclei chamber, the Leipzig Aerosol Cloud Interaction Simulator, aerosol optical tweezers, and electrodynamic balances. Although this assessment is not intended to be comprehensive, these five techniques are complementary in their approach and give values that span the range from near 0.1 to 1.0 for the mass accommodation coefficient. We use the same semianalytical treatment to assess the sensitivities of the measurements made by the various techniques to thermophysical quantities (diffusion constants, thermal conductivities, saturation pressure of water, latent heat, and solution density) and experimental parameters (saturation value and temperature). This represents the first effort to assess and compare measurements made by different techniques to attempt to reduce the uncertainty in the value of the mass accommodation coefficient. Broadly, we show that the measurements are consistent within the uncertainties inherent to the thermophysical and experimental parameters and that the value of the mass accommodation coefficient should be considered to be larger than 0.5. Accurate control and measurement of the saturation ratio is shown to be critical for a successful investigation of the surface transport kinetics during condensation/evaporation. This invariably requires accurate knowledge of the partial pressure of water, the system temperature, the droplet curvature and the saturation pressure of water. Further, the importance of including and quantifying the transport of heat in interpreting droplet measurements is highlighted; the particular issues associated with interpreting measurements of condensation

  4. An ensemble study of HyMeX IOP6 and IOP7a: sensitivity to physical and initial and boundary condition uncertainties

    NASA Astrophysics Data System (ADS)

    Hally, A.; Richard, E.; Ducrocq, V.

    2014-05-01

    The first Special Observation Period of the HyMeX campaign took place in the Mediterranean between September and November 2012 with the aim of better understanding the mechanisms which lead to heavy precipitation events (HPEs) in the region during the autumn months. Two such events, referred to as Intensive Observation Period 6 (IOP6) and Intensive Observation Period 7a (IOP7a), occurred respectively on 24 and 26 September over south-eastern France. IOP6 was characterised by moderate to weak low-level flow which led to heavy and concentrated convective rainfall over the plains near the coast, while IOP7a had strong low-level flow and consisted of a convective line over the mountainous regions further north and a band of stratiform rainfall further east. Firstly, an ensemble was constructed for each IOP using analyses from the AROME, AROME-WMED, ARPEGE and ECMWF operational models as initial (IC) and boundary (BC) conditions for the research model Meso-NH at a resolution of 2.5 km. A high level of model skill was seen for IOP7a, with a lower level of agreement with the observations for IOP6. Using the most accurate member of this ensemble as a CTRL simulation, three further ensembles were constructed in order to study uncertainties related to cloud physics and surface turbulence parameterisations. Perturbations were introduced by perturbing the time tendencies of the warm and cold microphysical and turbulence processes. An ensemble where all three sources of uncertainty were perturbed gave the greatest degree of dispersion in the surface rainfall for both IOPs. Comparing the level of dispersion to that of the ICBC ensemble demonstrated that when model skill is low (high) and low-level flow is weak to moderate (strong), the level of dispersion of the ICBC and physical perturbation ensembles is (is not) comparable. The level of sensitivity to these perturbations is thus concluded to be case dependent.

  5. An ensemble study of HyMeX IOP6 and IOP7a: sensitivity to physical and initial and boundary condition uncertainties

    NASA Astrophysics Data System (ADS)

    Hally, A.; Richard, E.; Ducrocq, V.

    2013-12-01

    The first Special Observation Period of the HyMeX campaign took place in the Mediterranean between September and November 2012 with the aim of better understanding the mechanisms which lead to heavy precipitation events (HPEs) in the region during the autumn months. Two such events, referred to as Intensive Observation Period 6 (IOP6) and Intensive Observation Period 7a (IOP7a), occurred respectively on 24 and 26 September over south-eastern France. IOP6 was characterised by moderate to weak low-level flow which led to heavy and concentrated convective rainfall over the plains near the coast, while IOP7a had strong low-level flow and consisted of a convective line over the mountainous regions further north and a band of stratiform rainfall further east. Firstly, an ensemble was constructed for each IOP using analyses from the AROME, AROME-WMED, ARPEGE and ECMWF operational models as initial (IC) and boundary (BC) conditions for the research model Meso-NH at a resolution of 2.5 km. A high level of model skill was seen for IOP7a, with a lower level of agreement with the observations for IOP6. Using the most accurate member of this ensemble as a CTRL simulation, three further ensembles were constructed in order to study uncertainties related to cloud physics and surface turbulence parameterisations. Perturbations were introduced by perturbing the time tendencies of the warm and cold microphysical and turbulence processes. An ensemble where all three sources of uncertainty were perturbed gave the greatest degree of dispersion in the surface rainfall for both IOPs. Comparing the level of dispersion to that of the ICBC ensemble demonstrated that when model skill is low (high) and low-level flow is weak to moderate (strong), the level of dispersion of the ICBC and physical perturbation ensembles is (is not) comparable. The level of sensitivity to these perturbations is thus concluded to be case dependent.

  6. Comparison of Approaches for Measuring the Mass Accommodation Coefficient for the Condensation of Water and Sensitivities to Uncertainties in Thermophysical Properties

    PubMed Central

    2012-01-01

    We compare and contrast measurements of the mass accommodation coefficient of water on a water surface made using ensemble and single particle techniques under conditions of supersaturation and subsaturation, respectively. In particular, we consider measurements made using an expansion chamber, a continuous flow streamwise thermal gradient cloud condensation nuclei chamber, the Leipzig Aerosol Cloud Interaction Simulator, aerosol optical tweezers, and electrodynamic balances. Although this assessment is not intended to be comprehensive, these five techniques are complementary in their approach and give values that span the range from near 0.1 to 1.0 for the mass accommodation coefficient. We use the same semianalytical treatment to assess the sensitivities of the measurements made by the various techniques to thermophysical quantities (diffusion constants, thermal conductivities, saturation pressure of water, latent heat, and solution density) and experimental parameters (saturation value and temperature). This represents the first effort to assess and compare measurements made by different techniques to attempt to reduce the uncertainty in the value of the mass accommodation coefficient. Broadly, we show that the measurements are consistent within the uncertainties inherent to the thermophysical and experimental parameters and that the value of the mass accommodation coefficient should be considered to be larger than 0.5. Accurate control and measurement of the saturation ratio is shown to be critical for a successful investigation of the surface transport kinetics during condensation/evaporation. This invariably requires accurate knowledge of the partial pressure of water, the system temperature, the droplet curvature and the saturation pressure of water. Further, the importance of including and quantifying the transport of heat in interpreting droplet measurements is highlighted; the particular issues associated with interpreting measurements of condensation

  7. Comparison of approaches for measuring the mass accommodation coefficient for the condensation of water and sensitivities to uncertainties in thermophysical properties.

    PubMed

    Miles, Rachael E H; Reid, Jonathan P; Riipinen, Ilona

    2012-11-01

    We compare and contrast measurements of the mass accommodation coefficient of water on a water surface made using ensemble and single particle techniques under conditions of supersaturation and subsaturation, respectively. In particular, we consider measurements made using an expansion chamber, a continuous flow streamwise thermal gradient cloud condensation nuclei chamber, the Leipzig Aerosol Cloud Interaction Simulator, aerosol optical tweezers, and electrodynamic balances. Although this assessment is not intended to be comprehensive, these five techniques are complementary in their approach and give values that span the range from near 0.1 to 1.0 for the mass accommodation coefficient. We use the same semianalytical treatment to assess the sensitivities of the measurements made by the various techniques to thermophysical quantities (diffusion constants, thermal conductivities, saturation pressure of water, latent heat, and solution density) and experimental parameters (saturation value and temperature). This represents the first effort to assess and compare measurements made by different techniques to attempt to reduce the uncertainty in the value of the mass accommodation coefficient. Broadly, we show that the measurements are consistent within the uncertainties inherent to the thermophysical and experimental parameters and that the value of the mass accommodation coefficient should be considered to be larger than 0.5. Accurate control and measurement of the saturation ratio is shown to be critical for a successful investigation of the surface transport kinetics during condensation/evaporation. This invariably requires accurate knowledge of the partial pressure of water, the system temperature, the droplet curvature and the saturation pressure of water. Further, the importance of including and quantifying the transport of heat in interpreting droplet measurements is highlighted; the particular issues associated with interpreting measurements of condensation

  8. Model-based decision analysis of remedial alternatives using info-gap theory and Agent-Based Analysis of Global Uncertainty and Sensitivity (ABAGUS)

    NASA Astrophysics Data System (ADS)

    Harp, D.; Vesselinov, V. V.

    2011-12-01

    A newly developed methodology for model-based decision analysis is presented. The methodology incorporates a sampling approach, referred to as Agent-Based Analysis of Global Uncertainty and Sensitivity (ABAGUS; Harp & Vesselinov, 2011), that efficiently collects sets of acceptable solutions (i.e. acceptable model parameter sets) for different levels of a model performance metric representing the consistency of model predictions with observations. In this case, the performance metric is based on model residuals (i.e. discrepancies between observations and simulations). ABAGUS collects acceptable solutions from a discretized parameter space and stores them in a KD-tree for efficient retrieval. The parameter space domain (parameter minimum/maximum ranges) and discretization are predefined. On subsequent visits to collected locations, agents are provided with a modified value of the performance metric, and the model solution is not recalculated. The modified values of the performance metric sculpt the response surface (convexities become concavities), repulsing agents from collected regions. This promotes global exploration of the parameter space and discourages reinvestigation of regions of previously collected acceptable solutions. The resulting sets of acceptable solutions are formulated into a decision analysis using concepts from info-gap theory (Ben-Haim, 2006). Using info-gap theory, the decision robustness and opportuneness are quantified, providing measures of the immunity to failure and windfall, respectively, of alternative decisions. The approach is intended for cases where information is extremely limited, resulting in non-probabilistic uncertainties concerning model properties such as boundary and initial conditions, model parameters, conceptual model elements, etc. The information provided by this analysis is weaker than that provided by probabilistic decision analyses (i.e. posterior parameter distributions are not produced); however, this
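
    A minimal sketch of the ABAGUS idea described above — collect acceptable parameter sets on a discretized grid, and on revisits return a modified (penalized) metric that repels agents without re-running the model — might look like the following. The toy linear model, the acceptance threshold, and the dict standing in for the KD-tree are all assumptions for illustration, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model and observations (an illustrative stand-in).
obs = np.array([1.0, 2.0, 3.0])
def model(p):
    return p[0] + p[1] * np.array([0.0, 1.0, 2.0])

def metric(p):
    r = model(p) - obs              # residuals
    return float(np.sum(r ** 2))    # sum-of-squares performance metric

# Discretized parameter space: intercept and slope on a fixed grid.
grid = 0.1
lo, hi = np.array([-2.0, -2.0]), np.array([4.0, 4.0])
threshold = 1.0        # acceptance level on the performance metric
collected = {}         # grid index -> metric (stand-in for the KD-tree)
penalty = 10.0         # repulsion added when a collected cell is revisited

def evaluate(p):
    key = tuple(np.round((p - lo) / grid).astype(int))
    if key in collected:
        # Revisit of a collected location: modified metric, no model re-run.
        return collected[key] + penalty
    phi = metric(p)
    if phi <= threshold:
        collected[key] = phi        # store the acceptable solution
    return phi

# A random-walk "agent" exploring the space; the penalty pushes it out of
# already-collected regions, promoting global exploration.
p = np.array([0.0, 0.0])
for _ in range(5000):
    q = np.clip(p + rng.normal(0, grid, 2), lo, hi)
    if evaluate(q) <= evaluate(p):
        p = q
print(f"{len(collected)} acceptable grid cells collected")
```

The set of collected cells approximates the region of parameter space consistent with the observations at the chosen performance level, which is the raw material for the info-gap robustness analysis.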

  9. Crater Wall and Floor

    NASA Technical Reports Server (NTRS)

    2003-01-01

    [figure removed for brevity, see original site]

    3D Projection onto MOLA data [figure removed for brevity, see original site]

    The flat, smooth floor of the impact crater observed in this THEMIS image of Terra Cimmeria, compared with the rougher surfaces at higher elevations, suggests that sediments have filled the crater. The abundance of several smaller impact craters on the floor of the larger crater indicates, however, that the flat surface has been exposed for an extended period of time. The smooth crater floor and the rougher surfaces at higher elevations can both be seen in the 3-D THEMIS image draped over MOLA topography (2X vertical exaggeration).

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

    Image information: VIS instrument. Latitude -22.9, Longitude 155.7 East (204.3 West). 19 meter/pixel resolution.

  10. Reull Vallis Floor

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows the odd patterns of erosion on the floor of Reull Vallis, a major valley system east of the Hellas Basin in the martian southern hemisphere. Somewhat circular features in this image may have once been meteor craters that were eroded and deformed by erosive processes. This image is located near 42.1°S, 254.5°W. The picture covers an area about 3 km (1.9 mi) wide. Sunlight illuminates the scene from the upper left.

  11. Mesas on Depression Floor

    NASA Technical Reports Server (NTRS)

    2004-01-01

    3 August 2004 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows mesas and buttes on the floor of a depression in the Labyrinthus Noctis region of Mars. This is part of the western Valles Marineris. Each mesa is a remnant of a formerly more extensive sequence of rock. The image is located near 7.0°S, 99.2°W. It covers an area about 3 km (1.9 mi) across; sunlight illuminates the scene from the lower left.

  12. Rippled Valley Floor

    NASA Technical Reports Server (NTRS)

    2005-01-01

    15 September 2005 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a ripple-covered valley floor in the Hyblaeus Fossae region. Winds blowing up and down the length of the valley have helped to concentrate windblown grains to form these large, megaripples.

    Location near: 26.3°N, 225.1°W Image width: 3 km (1.9 mi) Illumination from: lower left Season: Northern Autumn

  13. Crater Floor Yardangs

    NASA Technical Reports Server (NTRS)

    2004-01-01

    1 December 2004 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows a group of semi-parallel ridges--yardangs--etched by wind into layered sedimentary rock on the floor of an unnamed crater in Terra Cimmeria. Many craters on Mars have been the sites of sedimentation. Over time, these sediments have become lithified. This picture is located near 31.3°S, 214.6°W. The image covers an area approximately 3 km (1.9 mi) across. Sunlight illuminates the scene from the left/upper left.

  14. Concentric Crater Floor

    NASA Technical Reports Server (NTRS)

    2004-01-01

    8 July 2004 This Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows the interior of a typical crater in northern Acidalia Planitia. The floor is covered by material that forms an almost concentric pattern. In this case, the semi-concentric rings might be an expression of eroded layered material, although this interpretation is uncertain. The crater is located near 44.0°N, 27.7°W, and covers an area about 3 km (1.9 mi) wide. Sunlight illuminates the scene from the lower left.

  15. Evaluating sub-national building-energy efficiency policy options under uncertainty: efficient sensitivity testing of alternative climate, technological, and socioeconomic futures in a regional integrated-assessment model.

    SciTech Connect

    Scott, Michael J.; Daly, Don S.; Zhou, Yuyu; Rice, Jennie S.; Patel, Pralit L.; McJeon, Haewon C.; Kyle, G. Page; Kim, Son H.; Eom, Jiyong; Clarke, Leon E.

    2014-05-01

    Improving the energy efficiency of the building stock, commercial equipment and household appliances can have a major impact on energy use, carbon emissions, and building services. Subnational regions such as U.S. states wish to increase their energy efficiency, reduce carbon emissions or adapt to climate change. Evaluating subnational policies to reduce energy use and emissions is difficult because of the uncertainties in socioeconomic factors, technology performance and cost, and energy and climate policies. Climate change may undercut such policies. Assessing these uncertainties can be a significant modeling and computation burden. As part of this uncertainty assessment, this paper demonstrates how a decision-focused sensitivity analysis strategy using fractional factorial methods can be applied to reveal the important drivers for detailed uncertainty analysis.
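
    The fractional factorial screening strategy mentioned above can be illustrated with a textbook 2^(4-1) design: eight runs estimate the main effects of four two-level factors, flagging the drivers that merit detailed uncertainty analysis. The response function and the factor interpretations below are hypothetical, not taken from the paper.

```python
import itertools
import numpy as np

# 2^(4-1) fractional factorial: 8 runs screen four two-level factors.
# Generator D = A*B*C gives a resolution IV design (main effects clear of
# two-factor interactions).
base = np.array(list(itertools.product([-1, 1], repeat=3)))  # A, B, C
D = base.prod(axis=1, keepdims=True)
design = np.hstack([base, D])        # 8 runs x 4 factors

def response(run):
    """Hypothetical building-energy outcome (arbitrary units), driven mostly
    by factors A (e.g. climate future) and C (e.g. technology cost)."""
    a, b, c, d = run
    return 100 - 8 * a + 1 * b - 5 * c + 0.5 * d

y = np.array([response(run) for run in design])

# Main-effect estimate for each factor: contrast between its + and - levels.
effects = {name: float(design[:, i] @ y / (len(y) / 2))
           for i, name in enumerate("ABCD")}
print(effects)   # a large |effect| flags a driver worth detailed UQ
```

With no interactions in the toy response, the contrasts recover twice each coefficient (A: -16, B: 2, C: -10, D: 1), so A and C would be carried forward to the detailed uncertainty analysis while B and D could be fixed at nominal values.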

  16. [Pelvic floor muscle training and pelvic floor disorders in women].

    PubMed

    Thubert, T; Bakker, E; Fritel, X

    2015-05-01

    Our goal is to provide an update on the results of pelvic floor rehabilitation in the treatment of urinary incontinence and genital prolapse symptoms. Pelvic floor muscle training allows a reduction of urinary incontinence symptoms. Pelvic floor muscle contractions supervised by a healthcare professional achieve cure in half of cases of stress urinary incontinence. Viewing this contraction through biofeedback improves outcomes, but this effect could also be due to a more intensive and prolonged program with the physiotherapist. The place of electrostimulation remains unclear. The results obtained with vaginal cones are similar to those of pelvic floor muscle training with or without biofeedback or electrostimulation. It is not known whether pelvic floor muscle training has an effect after one year. In cases of stress urinary incontinence, supervised pelvic floor muscle training avoids surgery in half of cases at 1-year follow-up. Pelvic floor muscle training is the first-line treatment of post-partum urinary incontinence. Its preventive effect is uncertain. Pelvic floor muscle training may reduce the symptoms associated with genital prolapse. In conclusion, pelvic floor rehabilitation supervised by a physiotherapist is an effective short-term treatment for reducing the symptoms of urinary incontinence or pelvic organ prolapse.

  17. 9. LOOKING FROM FLOOR 1 UP THROUGH OPENING TO FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    9. LOOKING FROM FLOOR 1 UP THROUGH OPENING TO FLOOR 2; OPENING IN THE FLOOR IS TO ALLOW THE RUNNER STONES TO BE FLIPPED OVER FOR SHARPENING; AT THE FIRST FLOOR ARE THE POSTS SUPPORTING THE BRIDGEBEAMS ON WHICH THE BRIDGE TREES PIVOT; THE CENTER POST RISES ABOVE THE STONES TO RECEIVE THE FOOT BEARING OF THE UPRIGHT SHAFT; ALSO SEEN ARE THE STONE SPINDLES, UNDER SIDES OF THE BED STONES, STONE NUT AND GREAT SPUR WHEEL. - Pantigo Windmill, James Lane, East Hampton, Suffolk County, NY

  18. Some Aspects of uncertainty in computational fluid dynamics results

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.

    1991-01-01

    Uncertainties are inherent in computational fluid dynamics (CFD). These uncertainties need to be systematically addressed and managed. Sources of these uncertainties are discussed. Some recommendations are made for the quantification of CFD uncertainties. A practical method of uncertainty analysis is based on sensitivity analysis. When CFD is used to design fluid dynamic systems, sensitivity-uncertainty analysis is essential.
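
    A generic example of the sensitivity-based uncertainty analysis mentioned above (a sketch, not the paper's method): finite-difference sensitivities of an output with respect to the inputs, combined with assumed input standard deviations, give a first-order output uncertainty. The surrogate drag model, nominal values, and input uncertainties are all hypothetical.

```python
import numpy as np

def drag_coefficient(mach, reynolds, alpha):
    """Hypothetical smooth surrogate for a CFD-computed drag coefficient."""
    return 0.02 + 0.005 * mach ** 2 + 0.3 / np.log10(reynolds) + 0.001 * alpha ** 2

x0 = np.array([0.8, 1.0e6, 2.0])        # nominal Mach, Re, angle of attack (deg)
sigma = np.array([0.01, 5.0e4, 0.1])    # assumed input standard deviations

# Central finite differences approximate the sensitivities df/dx_i.
grad = np.empty(3)
for i in range(3):
    h = 1e-4 * max(abs(x0[i]), 1.0)
    xp, xm = x0.copy(), x0.copy()
    xp[i] += h
    xm[i] -= h
    grad[i] = (drag_coefficient(*xp) - drag_coefficient(*xm)) / (2 * h)

# First-order propagation for independent inputs:
# var(f) ~ sum_i (df/dx_i)^2 * sigma_i^2
sigma_f = float(np.sqrt(np.sum((grad * sigma) ** 2)))
print(f"Cd = {drag_coefficient(*x0):.4f} +/- {sigma_f:.5f}")
```

In a real CFD setting each sensitivity would come from perturbed solver runs or an adjoint, but the propagation step is the same.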

  19. Assessing the risk of bluetongue to UK livestock: uncertainty and sensitivity analyses of a temperature-dependent model for the basic reproduction number.

    PubMed

    Gubbins, Simon; Carpenter, Simon; Baylis, Matthew; Wood, James L N; Mellor, Philip S

    2008-03-01

    Since 1998 bluetongue virus (BTV), which causes bluetongue, a non-contagious, insect-borne infectious disease of ruminants, has expanded northwards in Europe in an unprecedented series of incursions, suggesting that there is a risk to the large and valuable British livestock industry. The basic reproduction number, R(0), provides a powerful tool with which to assess the level of risk posed by a disease. In this paper, we compute R(0) for BTV in a population comprising two host species, cattle and sheep. Estimates for each parameter which influences R(0) were obtained from the published literature, using those applicable to the UK situation wherever possible. Moreover, explicit temperature dependence was included for those parameters for which it had been quantified. Uncertainty and sensitivity analyses based on Latin hypercube sampling and partial rank correlation coefficients identified temperature, the probability of transmission from host to vector and the vector to host ratio as being most important in determining the magnitude of R(0). The importance of temperature reflects the fact that it influences many processes involved in the transmission of BTV and, in particular, the biting rate, the extrinsic incubation period and the vector mortality rate. PMID:17638649
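
    The sampling-and-correlation machinery named above is standard and can be sketched as follows: Latin hypercube samples of the parameters are propagated through an R(0) model, and partial rank correlation coefficients rank the inputs by influence. The R(0) expression below is a textbook Ross-Macdonald-style stand-in, not the paper's two-host BTV formulation, and the parameter ranges are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, names = 1000, ["m", "a", "b", "mu", "eip"]

# Latin hypercube sample: stratify [0,1) into n bins per parameter, one draw
# per bin, with the bin order shuffled independently for each parameter.
u = (rng.permuted(np.tile(np.arange(n), (len(names), 1)), axis=1).T
     + rng.random((n, len(names)))) / n

# Map to hypothetical parameter ranges (illustrative, not the paper's values).
lo = np.array([1.0, 0.1, 0.1, 0.05, 5.0])
hi = np.array([10.0, 0.5, 0.9, 0.2, 20.0])
X = lo + u * (hi - lo)

def r0(m, a, b, mu, eip):
    """Ross-Macdonald-style R0: vector/host ratio m, biting rate a,
    transmission probability b, vector mortality mu, incubation period eip."""
    return np.sqrt(m * a ** 2 * b * np.exp(-mu * eip) / mu)

y = r0(*X.T)

def rank(v):
    return np.argsort(np.argsort(v)).astype(float)

def prcc(X, y, j):
    """Partial rank correlation of parameter j with the output: correlate the
    rank residuals after regressing out all other (ranked) parameters."""
    R = np.column_stack([rank(X[:, k]) for k in range(X.shape[1])])
    ry = rank(y)
    A = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
    res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
    res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
    return float(np.corrcoef(res_x, res_y)[0, 1])

for j, name in enumerate(names):
    print(f"PRCC({name}) = {prcc(X, y, j):+.2f}")
```

Parameters with large positive PRCCs push R(0) up; the mortality rate and incubation period come out strongly negative, mirroring the role the paper attributes to temperature-dependent vector processes.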

  20. Canyon Floor Deposits

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Context image for PIA03598 Canyon Floor Deposits

    The layered and wind-eroded deposits seen in this VIS image occur on the floor of Candor Chasma.

    Image information: VIS instrument. Latitude 5.2S, Longitude 283.4E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  1. Floor of Juventae Chasma

    NASA Technical Reports Server (NTRS)

    2002-01-01

    (Released 30 May 2002) Juventae Chasma is an enormous box canyon (250 km X 100 km) which opens to the north and forms the outflow channel Maja Vallis. Most Martian outflow channels such as Maja, Kasei, and Ares Valles begin at point sources such as box canyons and chaotic terrain and then flow unconfined into a basin region. This image captures a portion of the western floor of Juventae Chasma and shows a wide variety of landforms. Conical hills, mesas, buttes and plateaus of layered material dominate this scene and seem to be 'swimming' in vast sand sheets. The conical hills have a spur and gully topography associated with them while the flat topped buttes and mesas do not. This may be indicative of different materials that compose each of these landforms or it could be that the flat-topped layer has been completely eroded off of the conical hills thereby exposing a different rock type. Both the conical hills and flat-topped buttes and mesas have extensive scree slopes (heaps of eroded rock and debris). Ripples, which are inferred to be dunes, can also be seen amongst the hills. No impact craters can be seen in this image, indicating that the erosion and transport of material down the canyon wall and across the floor is occurring at a relatively rapid rate, so that any craters that form are rapidly buried or eroded.

  2. Fretted Terrain Valley Floor

    NASA Technical Reports Server (NTRS)

    2004-01-01

    30 December 2003 This December 2003 Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows lineated textures on the floor of a valley in the Deuteronilus region of Mars. Deuteronilus, and neighboring Protonilus and Nilosyrtis, have been known since the Mariner 9 mission as regions of 'fretted terrain.' In this context, 'fretted' does not mean 'worried,' it means 'eroded.' The fretted terrains of Mars are regions along the boundary between cratered highlands and northern lowland plains that have been broken down into mesas, buttes, and valleys. On the floors of some of these valleys occurs a distinctive lineated and pitted texture--like the example shown here. The cause of the textures is not known, although for decades some scientists have speculated that ice is involved. While this is possible, it is far from a demonstrated fact. This picture is located near 40.1°N, 335.1°W, and covers an area approximately 3 km (1.9 mi) wide; sunlight illuminates the scene from the lower left.

  3. Candor Chasma Floor

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Context image for PIA03080 Candor Chasma Floor

    This VIS image shows part of the layered and wind sculpted deposit that occurs on the floor of Candor Chasma.

    Image information: VIS instrument. Latitude 6.6S, Longitude 284.4E. 17 meter/pixel resolution.


  4. Spallanzani Cr. Floor

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Context image for PIA03632 Spallanzani Cr. Floor

    This image was taken by one of the Mars Student Imaging Project (MSIP) teams. Their target is the unusual floor deposits in Spallanzani Crater. The wind may have affected the surface of the layered deposit. Small dunes have formed near the southern margin.

    Image information: VIS instrument. Latitude 57.9S, Longitude 86.5E. 17 meter/pixel resolution.


  5. Pelvic floor muscle training exercises

    MedlinePlus

    Kegel exercises ... Pelvic floor muscle training exercises are recommended for: Women with urinary stress incontinence Men with urinary stress incontinence after prostate surgery People who have fecal ...

  6. Measurement uncertainty.

    PubMed

    Bartley, David; Lidén, Göran

    2008-08-01

    The reporting of measurement uncertainty has recently undergone a major harmonization whereby characteristics of a measurement method obtained during establishment and application are combined componentwise. For example, the sometimes-pesky systematic error is included. A bias component of uncertainty can often be easily established as the uncertainty in the bias. However, beyond simply arriving at a value for uncertainty, meaning can sometimes be given to this uncertainty in terms of prediction confidence in uncertainty-based intervals covering what is to be measured. To this end, a link between the concepts of accuracy and uncertainty is established through a simple yet accurate approximation to a random variable known as the non-central Student's t-distribution. "Without a measureless and perpetual uncertainty, the drama of human life would be destroyed." (Winston Churchill)

  7. Modeling, Uncertainty Quantification and Sensitivity Analysis of Subsurface Fluid Migration in the Above Zone Monitoring Interval of a Geologic Carbon Storage

    NASA Astrophysics Data System (ADS)

    Namhata, A.; Dilmore, R. M.; Oladyshkin, S.; Zhang, L.; Nakles, D. V.

    2015-12-01

    Carbon dioxide (CO2) storage into geological formations has significant potential for mitigating anthropogenic CO2 emissions. An increasing emphasis on the commercialization and implementation of this approach to store CO2 has led to the investigation of the physical processes involved and to the development of system-wide mathematical models for the evaluation of potential geologic storage sites and the risk associated with them. The sub-system components under investigation include the storage reservoir, caprock seals, and the above zone monitoring interval, or AZMI, to name a few. Diffusive leakage of CO2 through the caprock seal to overlying formations may occur due to its intrinsic permeability and/or the presence of natural/induced fractures. This results in a potential risk to environmental receptors such as underground sources of drinking water. In some instances, leaking CO2 also has the potential to reach the ground surface and result in atmospheric impacts. In this work, fluid (i.e., CO2 and brine) flow above the caprock, in the region designated as the AZMI, is modeled for a leakage event of a typical geologic storage system with different possible boundary scenarios. An analytical and approximate solution for radial migration of fluids in the AZMI with continuous inflow of fluids from the reservoir through the caprock has been developed. In its present form, the AZMI model predicts the spatial changes in pressure and gas saturation over time in a layer immediately above the caprock. The modeling is performed for a benchmark case and the data-driven approach of arbitrary Polynomial Chaos (aPC) Expansion is used to quantify the uncertainty of the model outputs based on the uncertainty of model input parameters such as porosity, permeability, formation thickness, and residual brine saturation. The recently developed aPC approach performs stochastic model reduction and approximates the models by a polynomial-based response surface. Finally, a global
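    The aPC details are beyond the abstract, but the underlying idea (replace an expensive model with a polynomial response surface fitted to a few model runs, then propagate input uncertainty through the cheap surrogate) can be sketched as follows. The toy "model" and its normalized input range are invented for illustration; this is not the authors' AZMI model:

```python
import numpy as np

# toy stand-in for an expensive flow model: a pressure-like response as a
# smooth function of a normalized, permeability-like input on [0, 1]
def model(k):
    return np.exp(-3.0 * k) + 0.5 * k ** 2

# build a cheap polynomial response surface from only seven model runs
nodes = np.linspace(0.0, 1.0, 7)
poly_coeffs = np.polynomial.polynomial.polyfit(nodes, model(nodes), deg=4)
surrogate = lambda k: np.polynomial.polynomial.polyval(k, poly_coeffs)

# propagate input uncertainty through the surrogate instead of the model:
# 100,000 surrogate evaluations cost roughly what a handful of model runs do
rng = np.random.default_rng(1)
samples = rng.uniform(0.0, 1.0, 100_000)
mean_est = surrogate(samples).mean()
```

    The same Monte Carlo loop over the surrogate yields variances, quantiles, or sensitivity indices at negligible cost, which is the practical payoff of response-surface approaches such as aPC.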

  8. 18. MAIN FLOOR HOLDING TANKS Main floor, looking at ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. MAIN FLOOR - HOLDING TANKS Main floor, looking at holding tanks against the west wall, from which sluice gates are seen protruding. Right foreground-wooden holding tanks. Note narrow wooden flumes through which fish were sluiced into holding and brining tanks. - Hovden Cannery, 886 Cannery Row, Monterey, Monterey County, CA

  9. Floor Plans: Section "AA", Section "BB"; Floor Framing Plans: Section ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Floor Plans: Section "A-A", Section "B-B"; Floor Framing Plans: Section "A-A", Section "B-B" - Fort Washington, Fort Washington Light, Northeast side of Potomac River at Fort Washington Park, Fort Washington, Prince George's County, MD

  10. 18. FOURTH FLOOR BLDG. 28, RAISED CONCRETE SLAB FLOOR WITH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    18. FOURTH FLOOR BLDG. 28, RAISED CONCRETE SLAB FLOOR WITH BLOCKS AND PULLEYS OVERHEAD LOOKING NORTHEAST. - Fafnir Bearing Plant, Bounded on North side by Myrtle Street, on South side by Orange Street, on East side by Booth Street & on West side by Grove Street, New Britain, Hartford County, CT

  11. 13. Bottom floor, tower interior showing concrete floor and cast ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    13. Bottom floor, tower interior showing concrete floor and cast iron bases for oil butts (oil butts removed when lighthouse lamp was converted to electric power.) - Block Island Southeast Light, Spring Street & Mohegan Trail at Mohegan Bluffs, New Shoreham, Washington County, RI

  12. Sea floor magnetic observatory

    NASA Astrophysics Data System (ADS)

    Korepanov, V.; Prystai, A.; Vallianatos, F.; Makris, J.

    2003-04-01

    Electromagnetic precursors of seismic hazards are widely accepted as strong evidence of an approaching earthquake or volcanic eruption. Monitoring of these precursors is of greatest interest in densely populated areas, where strong industrial background noise makes the precursor signals difficult to extract. One promising way to improve the signal-to-noise ratio is to install observation points in shelf zones near likely earthquake locations, which is feasible in most seismically active areas of Europe, e.g. in Greece and Italy. The serious restriction is the cost of underwater instrumentation: realizing such experiments has required either the combined efforts of several countries (e.g., GEOSTAR) or the funds of large companies (e.g., the SIO magnetotelluric instrument). Progress in electronic components, together with the appearance of inexpensive watertight glass spheres, has made it possible to decrease drastically the price of recently developed sea-floor magnetic stations. The autonomous vector magnetometer LEMI-301 for sea-bed application is described in the report. It is built around a three-component flux-gate sensor. A non-magnetic housing and the minimal magnetism of the electronic components allow the instrument to be implemented as a monoblock construction in which the electronics unit is placed close to the sensor. An automatic circuit provides convenient compensation of the initial field offset and readings of the full value (6 digits) of the measured field. Timing by an internal clock provides high-accuracy synchronization of data, and internal flash memory assures long-term autonomous data storage. The system also has a two-axis tilt measurement system. Methodological questions of magnetometer operation on the sea bed were studied in order to avoid two types of errors arising in such experiments: the first is the influence of sea waves, and the second is the magnetometer's orientation at its random positioning on

  13. Stripped Crater Floor

    NASA Technical Reports Server (NTRS)

    2004-01-01

    10 February 2004 This full-resolution Mars Global Surveyor (MGS) Mars Orbiter Camera (MOC) image shows details on the floor of an ancient meteor crater in the northeastern part of Noachis Terra. After the crater formed, layers of material--perhaps sediment--were deposited in the crater. These materials became somewhat solidified, but later were eroded to form the patterns shown here. Many windblown ripples in the scene indicate the presence of coarse-grained sediment that was not completely stripped away by wind. The picture is located near 22.1°S, 307.0°W. Sunlight illuminates this scene from the left/upper left; the image covers an area 3 km (1.9 mi) wide.

  14. Floor-plan radar

    NASA Astrophysics Data System (ADS)

    Falconer, David G.; Ueberschaer, Ronald M.

    2000-07-01

    Urban-warfare specialists, law-enforcement officers, counter-drug agents, and counter-terrorism experts encounter operational situations where they must assault a target building and capture or rescue its occupants. To minimize potential casualties, the assault team needs a picture of the building's interior and a copy of its floor plan. With this need in mind, we constructed a scale model of a single-story house and imaged its interior using synthetic-aperture techniques. The interior and exterior walls nearest the radar set were imaged with good fidelity, but the distal ones appear poorly defined and surrounded by ghosts and artifacts. The latter defects are traceable to beam attenuation, wavefront distortion, multiple scattering, traveling waves, resonance phenomena, and other effects not accounted for in the traditional (noninteracting, isotropic point scatterer) model for radar imaging.

  15. Flow Along Valley Floors

    NASA Technical Reports Server (NTRS)

    2003-01-01

    [figure removed for brevity, see original site]

    Released 9 May 2003

    Lines indicative of flow in a valley floor (east to west) cut across similar lines in a slightly smaller valley (southeast to northwest), indicating both that material flowed along the valley floor (as opposed to across it) and that relative flow ages may be determined from crosscutting relationships.

    Image information: VIS instrument. Latitude 39.6, Longitude 31.1 East (328.9 West). 19 meter/pixel resolution.


  16. What's New in Floor Care.

    ERIC Educational Resources Information Center

    Griffin, William R.

    1999-01-01

    Examines some of the new equipment, chemicals, and procedures in floor care to help educational facility managers develop floor care programs and improve performance. Trends include more mechanization, higher concentrations and environmentally preferable products for cleaning, and the use of written cleaning procedures. (GR)

  17. A Hybrid Waveform Inversion Scheme for the Determination of Locations and Moment Tensors of the Microseismic Events and the uncertainty and Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Droujinine, A.; Shen, P.

    2011-12-01

    In this research, we developed a new hybrid waveform inversion scheme to determine the hypocenters, origin times and moment tensors of microseismic events induced by hydraulic fracturing. To overcome the nonlinearity in the determination of the hypocenter and origin time of a microseismic event, we perform a global search for the hypocenter (x,y,z) and origin time (t0) in a gridded four-dimensional model space, and at each grid point of the four-dimensional model space we perform a linear inversion for the moment tensor components (M11, M22, M33, M12, M13, M23) in a six-dimensional model subspace. By this two-step approach, we find a global optimum estimate in the four- plus six-dimensional total model space. We then perform a nonlinear, gradient-based inversion for an improved hypocenter and origin time, starting from that global estimate. The linear inversion for the moment tensor can also be performed at each iteration of the nonlinear inversion for the hypocenter and origin time. With this grid-linear-nonlinear hybrid approach, we avoid being trapped in local minima of the inverse problem while reducing the computational cost. The Green's functions between the monitored region and the receivers are computed by elastic wave reciprocity. We have also performed a systematic study of the uncertainty, resolution and sensitivity of the method and found that it has superior performance in determining the hypocenter and origin time of a microseismic event over traditional travel-time methods, while also being able to deliver the focal mechanism solution for the event. The method is tested on a dataset from a hydraulic fracturing operation in an oil reservoir.
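    The grid-plus-linear structure described above (an outer search over the nonlinear parameters, with an inner linear least-squares inversion at each grid node) is a standard separable-inversion pattern. Below is a one-dimensional toy sketch: a single nonlinear parameter t0 stands in for hypocenter/origin time, two invented basis waveforms stand in for the reciprocity Green's functions, and the two linear amplitudes stand in for moment tensor components. None of it is the authors' code:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 200)

def design(t0):
    """Two hypothetical basis waveforms whose shape depends nonlinearly on t0."""
    w = np.exp(-((t - t0) / 0.05) ** 2)
    return np.column_stack([w, w * (t - t0) / 0.05])

rng = np.random.default_rng(2)
true_t0 = 0.125
true_m = np.array([1.5, -0.7])
data = design(true_t0) @ true_m + 0.005 * rng.standard_normal(t.size)

# outer grid search over the nonlinear parameter; at each node the linear
# amplitudes are found exactly by least squares, so only t0 needs gridding
best = None
for t0 in np.linspace(0.0, 0.25, 11):
    G = design(t0)
    m, *_ = np.linalg.lstsq(G, data, rcond=None)  # inner linear inversion
    misfit = np.linalg.norm(data - G @ m)
    if best is None or misfit < best[0]:
        best = (misfit, t0, m)

_, t0_hat, m_hat = best
```

    Because each candidate node carries its globally-best linear solution, the grid scan cannot be fooled by a poor local amplitude fit; a gradient refinement of t0 (as in the hybrid scheme above) would then start from t0_hat.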

  18. Conceptual Uncertainty and Parameter Sensitivity in Subsurface Pathway Flow and Transport Modeling for the Idaho National Engineering and Environmental Laboratory's Subsurface Disposal Area

    NASA Astrophysics Data System (ADS)

    Magnuson, S. O.

    2002-05-01

    As part of an ongoing CERCLA evaluation, the migration of contaminants through the hydrologically complex subsurface at the Idaho National Engineering and Environmental Laboratory Subsurface Disposal Area (SDA) was modeled. The 180-meter thick vadose zone beneath the SDA is primarily composed of extrusive basalt flows that are extensively fractured. These flows are interrupted by thin, mostly continuous sedimentary interbeds that were deposited through aeolian and fluvial processes during periods of volcanic quiescence. The subsurface pathway modeling for the CERCLA assessment has been conducted in phases utilizing the results of characterization activities. The most recent model for the SDA used an equivalent porous continuum approach in a three-dimensional domain to represent movement of water and contaminants in the subsurface. Given the complexity of the subsurface at this site, the simulation results were acknowledged to be uncertain. This presentation will provide an overview of the current modeling effort for the SDA and how conceptual uncertainty was addressed by modeling different scenarios. These scenarios included assignment of infiltration boundary conditions, the effect of superimposing gaps in the interbeds, the effect within the vadose zone of Big Lost River water discharged to the spreading areas approximately 1 km away, and a simplistic approximation to represent facilitated transport. Parametric sensitivity simulations were used to determine possible effects from assigned transport parameters, such as partition coefficients and solubility limits, that can vary widely with presumed geochemical conditions. Comparisons of simulated transport results to measured field concentrations in both the vadose zone and in the underlying Snake River Plain aquifer were made to determine the representativeness of the model results. Results of the SDA subsurface transport modeling have been used in part to guide additional field characterization.

  19. Proposal for the utilization of the total cross section covariances and its correlations with channel reactions for sensitivity and uncertainty analysis

    SciTech Connect

    Sabouri, P.; Bidaud, A.

    2012-07-01

    An alternate method for the estimation of the global uncertainty on criticality, using the total cross section and its covariances, is proposed. Application of the method with currently available covariance data leads to an unrealistically large prediction of the global uncertainty on criticality. New covariances for the total cross section and individual reactions are proposed. Analysis with the proposed covariance matrices is found to result in a global uncertainty on criticality consistent with the traditional method. Recommendations are made to evaluators for providing total cross section covariances. (authors)
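    Global uncertainties of the kind discussed here are conventionally propagated with the first-order "sandwich rule", var(R) = SᵀCS, where S is the vector of sensitivities of the response (e.g., criticality) to the cross sections and C is their covariance matrix. A minimal numerical illustration; the sensitivities and covariances below are invented, not evaluated nuclear data:

```python
import numpy as np

# hypothetical relative sensitivities of the response to three cross sections
S = np.array([0.8, -0.3, 0.1])

# hypothetical relative covariance matrix of those cross sections
C = 1e-4 * np.array([[4.0, 1.0, 0.0],
                     [1.0, 9.0, 0.5],
                     [0.0, 0.5, 1.0]])

# sandwich rule: propagated relative variance of the response is S^T C S
rel_variance = S @ C @ S
rel_uncertainty = np.sqrt(rel_variance)  # roughly 1.7% for these numbers
```

    The off-diagonal terms of C are exactly the channel-to-channel (and total-to-channel) correlations the abstract argues must be supplied consistently; dropping them changes SᵀCS and hence the quoted global uncertainty.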

  20. SU-E-J-166: Sensitivity of Clinically Relevant Dosimetric Parameters to Contouring Uncertainty During Post Implant Dosimetry of Prostate Permanent Seed Implants

    SciTech Connect

    Mashouf, S; Ravi, A; Morton, G; Song, W

    2015-06-15

    Purpose: There is strong evidence relating post-implant dosimetry for permanent seed prostate brachytherapy to local control rates. The delineation of the prostate on CT images, however, represents a challenge, as it is difficult to confidently identify the prostate borders from the soft tissue surrounding it. This study aims at quantifying the sensitivity of clinically relevant dosimetric parameters to prostate contouring uncertainty. Methods: The post-implant CT images and plans for a cohort of 43 patients, who have received I–125 permanent prostate seed implant in our centre, were exported to the MIM Symphony LDR brachytherapy treatment planning system (MIM Software Inc., Cleveland, OH). The prostate contours in post-implant CT images were expanded/contracted uniformly for margins of ±1.00mm, ±2.00mm, ±3.00mm, ±4.00mm and ±5.00mm (±0.01mm). The values for V100 and D90 were extracted from Dose Volume Histograms for each contour and compared. Results: The mean value of V100 and D90 was obtained as 92.3±8.4% and 108.4±12.3% respectively (Rx=145Gy). V100 was reduced by −3.2±1.5%, −7.2±3.0%, −12.8±4.0%, −19.0±4.8%, − 25.5±5.4% for expanded contours of prostate with margins of +1mm, +2mm, +3mm, +4mm, and +5mm, respectively, while it was increased by 1.6±1.2%, 2.4±2.4%, 2.7±3.2%, 2.9±4.2%, 2.9±5.1% for the contracted contours. D90 was reduced by −6.9±3.5%, −14.5±6.1%, −23.8±7.1%, − 33.6±8.5%, −40.6±8.7% and increased by 4.1±2.6%, 6.1±5.0%, 7.2±5.7%, 8.1±7.3% and 8.1±7.3% for the same set of contours. Conclusion: Systematic expansion errors of more than 1mm may likely render a plan sub-optimal. Conversely, contraction errors may result in a plan likely being labeled as optimal. The use of MRI images to contour the prostate should result in better delineation of the prostate, which increases the predictive value of post-op plans. Since observers tend to overestimate the prostate volume on CT, compared with MRI, the impact of the

  1. Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded. Estimating measurement uncertainty is often not trivial; several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities for taking uncertainty into account in compliance assessment are explained.

  2. The Ocean Floor

    NASA Astrophysics Data System (ADS)

    Fox, Paul J.

    Over a relatively short period of time Bruce Heezen made significant, imaginative and timely contributions to our understanding of the processes that govern the origin and evolution of oceanic crust in space and time. It is certainly fitting that someone of Heezen's stature be honored by a memorial volume and the collection of papers in The Ocean Floor were gathered together for this purpose. Bruce was a gifted scientist with a wide-ranging appetite for all facets of earth science, and in this respect he would have appreciated the pot pourri of marine geological topics covered in the book (e.g., continental margin investigations, sedimentological processes, plate tectonic models). Unfortunately, the book does not have an overall impact that measures up to the man that it commemorates. Too many of the papers read as if the authors, after having agreed to contribute to the volume, reached deep into their files to dredge up a neglected manuscript on one subject or another. As a consequence, many of the papers lack zest and fail to stimulate interest beyond their narrowly focused themes.

  3. Waterproof Raised Floor Makes Utility Lines Accessible

    NASA Technical Reports Server (NTRS)

    Cohen, M. M.

    1984-01-01

    Floor for laboratories, hospitals and factories waterproof yet allows access to subfloor utilities. Elevated access floor system designed for installations with multitude of diverse utility systems routed under and up through floor and requirement of separation of potentially conflicting utility services. Floor covered by continuous sheet of heat resealable vinyl. Floor system cut open when changes are made in utility lines and ducts. After modifications, floor covering resealed to protect subfloor utilities from spills and leaks.

  4. Analysis of the Sensitivity and Uncertainty in 2-Stage Clonal Growth Models for Formaldehyde with Relevance to Other Biologically-Based Dose Response (BBDR) Models

    EPA Science Inventory

    The National Center for Environmental Assessment (NCEA) has conducted and supported research addressing uncertainties in 2-stage clonal growth models for cancer as applied to formaldehyde. In this report, we summarized publications resulting from this research effort, discussed t...

  5. Low floor mass transit vehicle

    DOEpatents

    Emmons, J. Bruce; Blessing, Leonard J.

    2004-02-03

    A mass transit vehicle includes a frame structure that provides an efficient and economical approach to providing a low floor bus. The inventive frame includes a stiff roof panel and a stiff floor panel. A plurality of generally vertical pillars extend between the roof and floor panels. A unique bracket arrangement is disclosed for connecting the pillars to the panels. Side panels are secured to the pillars and carry the shear stresses on the frame. A unique seating assembly that can be advantageously incorporated into the vehicle taking advantage of the load distributing features of the inventive frame is also disclosed.

  6. 21. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    21. VIEW OF THE FIRST FLOOR PLAN. THE FIRST FLOOR WAS USED FOR DEPLETED AND ENRICHED URANIUM FABRICATION. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, Uranium Rolling & Forming Operations, Southeast section of plant, southeast quadrant of intersection of Central Avenue & Eighth Street, Golden, Jefferson County, CO

  7. 22. VIEW OF THE SECOND FLOOR PLAN. THE SECOND FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    22. VIEW OF THE SECOND FLOOR PLAN. THE SECOND FLOOR CONTAINS THE AIR PLENUM AND SOME OFFICE SPACE. THE ORIGINAL DRAWING HAS BEEN ARCHIVED ON MICROFILM. THE DRAWING WAS REPRODUCED AT THE BEST QUALITY POSSIBLE. LETTERS AND NUMBERS IN THE CIRCLES INDICATE FOOTER AND/OR COLUMN LOCATIONS. - Rocky Flats Plant, Uranium Rolling & Forming Operations, Southeast section of plant, southeast quadrant of intersection of Central Avenue & Eighth Street, Golden, Jefferson County, CO

  8. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL{sub 95%}) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL{sub 95%}) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL{sub 95%} was based entirely on the six current scrape sample results (each averaged across three analytical determinations).

  9. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    SciTech Connect

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by an UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
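    The UCL95% used in these tank reports is the classical one-sided upper confidence limit on a mean, computed from the sample count, average, and standard deviation as x̄ + t(0.95, n−1)·s/√n. A minimal sketch for n = 6 samples; the concentrations below are made-up placeholders, not Tank 18F/19F data:

```python
import math
import statistics

def ucl95(results, t_crit):
    """One-sided upper 95% confidence limit on the mean:
    xbar + t(0.95, n-1) * s / sqrt(n)."""
    n = len(results)
    xbar = statistics.mean(results)
    s = statistics.stdev(results)  # sample standard deviation (n - 1 denominator)
    return xbar + t_crit * s / math.sqrt(n)

# six hypothetical per-sample concentrations (each imagined as already averaged
# over three analytical determinations), in arbitrary units
samples = [10.2, 9.8, 11.1, 10.5, 9.9, 10.7]
t95_df5 = 2.015  # one-sided 95% Student's t critical value, 5 degrees of freedom
limit = ucl95(samples, t95_df5)
```

    The limit always sits above the sample mean, and shrinks as more samples are pooled, which is why the reports check whether the Mantis and scrape results can be combined before quantifying each analyte.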

  10. Adjoint-Based Uncertainty Quantification with MCNP

    SciTech Connect

    Seifried, Jeffrey E.

    2011-09-01

    This work serves to quantify the instantaneous uncertainties in neutron transport simulations born from nuclear data and statistical counting uncertainties. Perturbation and adjoint theories are used to derive implicit sensitivity expressions. These expressions are transformed into forms that are convenient for construction with MCNP6, creating the ability to perform adjoint-based uncertainty quantification with MCNP6. These new tools are exercised on the depleted-uranium hybrid LIFE blanket, quantifying its sensitivities and uncertainties to important figures of merit. Overall, these uncertainty estimates are small (< 2%). Having quantified the sensitivities and uncertainties, physical understanding of the system is gained and some confidence in the simulation is acquired.
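
    The propagation of nuclear-data uncertainty through sensitivities described above is commonly summarized by the "sandwich rule": the relative variance of a response is S·C·S, with S the vector of relative sensitivities and C the relative covariance matrix of the data. A minimal sketch, with illustrative numbers rather than LIFE blanket values:

```python
# Sandwich-rule uncertainty propagation: relative variance of a response
# R is S^T C S, where S holds relative sensitivities (dR/R per dp/p) and
# C is the relative covariance matrix of the underlying data.
# All numbers below are hypothetical placeholders.
import numpy as np

S = np.array([0.8, -0.3, 0.1])  # hypothetical sensitivities to 3 parameters

C = np.array([                  # hypothetical relative covariance matrix
    [0.0004, 0.0001, 0.0],
    [0.0001, 0.0009, 0.0],
    [0.0,    0.0,    0.0001],
])

rel_variance = S @ C @ S        # sandwich rule
rel_uncert = np.sqrt(rel_variance)
print(f"relative uncertainty: {100 * rel_uncert:.2f}%")  # → 1.70%
```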

  11. Channel Floor Yardangs

    NASA Technical Reports Server (NTRS)

    2004-01-01

    [figure removed for brevity, see original site]

    Released 19 July 2004. The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    The yardangs in this image are forming in channel floor deposits. The channel itself is funneling the wind to cause the erosion.

    Image information: VIS instrument. Latitude 4.5, Longitude 229.7 East (133.3 West). 19 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are

  12. Tangential Floor in a Classroom Setting

    ERIC Educational Resources Information Center

    Marti, Leyla

    2012-01-01

    This article examines floor management in two classroom sessions: a task-oriented computer lesson and a literature lesson. Recordings made in the computer lesson show the organization of floor when a task is given to students. Temporary or "incipient" side floors (Jones and Thornborrow, 2004) emerge beside the main floor. In the literature lesson,…

  13. 49 CFR 393.84 - Floors.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 5 2011-10-01 2011-10-01 false Floors. 393.84 Section 393.84 Transportation Other... Miscellaneous Parts and Accessories § 393.84 Floors. The flooring in all motor vehicles shall be substantially... fumes, exhaust gases, or fire. Floors shall not be permeated with oil or other substances likely...

  14. 49 CFR 393.84 - Floors.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 5 2010-10-01 2010-10-01 false Floors. 393.84 Section 393.84 Transportation Other... Miscellaneous Parts and Accessories § 393.84 Floors. The flooring in all motor vehicles shall be substantially... fumes, exhaust gases, or fire. Floors shall not be permeated with oil or other substances likely...

  15. Sensitivity of the Residual Circulation Diagnosed from the UARS Data to the Uncertainties in the Input Fields and to the Inclusion of Aerosols

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, J.; Crisp, D.; Granger, R. G.; Lambert, A.; Roche, A. E.; Kumer, J. B.; Mergenthaler, J. L.

    1996-01-01

    The simultaneous measurements of temperature, aerosol extinction, and the radiatively active gases by several instruments onboard the Upper Atmosphere Research Satellite permit an assessment of the uncertainties in the diagnosed stratospheric heating rates and in the resulting residual circulation.

  16. 17 CFR 1.62 - Contract market requirement for floor broker and floor trader registration.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... for floor broker and floor trader registration. 1.62 Section 1.62 Commodity and Securities Exchanges....62 Contract market requirement for floor broker and floor trader registration. (a)(1) Each contract... granted a temporary license as a floor broker; or (ii) Purchase or sell solely for such person's...

  17. 17 CFR 1.62 - Contract market requirement for floor broker and floor trader registration.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... for floor broker and floor trader registration. 1.62 Section 1.62 Commodity and Securities Exchanges....62 Contract market requirement for floor broker and floor trader registration. (a)(1) Each contract... granted a temporary license as a floor broker; or (ii) Purchase or sell solely for such person's...

  18. Functional anatomy of pelvic floor.

    PubMed

    Rocca Rossetti, Salvatore

    2016-03-31

    Descriptions of the pelvic floor are generally discordant, owing to the complexity of its structures and of their pathological disorders; most accounts are sectorial, covering only the muscles, fascial developments, ligaments, and so on. To understand the nature and function of the pelvic floor completely, however, it must be studied as a unified whole and in its most global aspect, taking into account embryology, phylogeny, anthropological development, and its many activities beyond the urological, gynaecological, and intestinal ones. Recent work has clarified many aspects of pelvic floor activity, and its musculature has been investigated through electromyography, sonography, magnetic resonance imaging, histology, histochemistry, and molecular research. Drawing on recent research concerning not only urinary and gynaecologic aspects but also the statics and dynamics of the pelvis and its floor, it is now possible to study this important body part as a unit; that is, to consider its place in the whole body economy, in which maintaining an upright position, walking, and physical conduct matter no less than urinary, genital, and intestinal functions. The pelvic floor can today be regarded as a musculofascial unit, with synergic and antagonistic activity of its more or less interlaced muscular bundles, serving multiple functions beyond the closure of the pelvic cup.

  19. An uncertainty and sensitivity analysis applied to the prioritisation of pharmaceuticals as surface water contaminants from wastewater treatment plant direct emissions.

    PubMed

    Morais, Sérgio Alberto; Delerue-Matos, Cristina; Gabarrell, Xavier

    2014-08-15

    In this study, the concentration probability distributions of 82 pharmaceutical compounds detected in the effluents of 179 European wastewater treatment plants were computed and inserted into a multimedia fate model. The comparative ecotoxicological impact of the direct emission of these compounds from wastewater treatment plants on freshwater ecosystems, based on a potentially affected fraction (PAF) of species approach, was assessed to rank compounds based on priority. As many pharmaceuticals are acids or bases, the multimedia fate model accounts for regressions to estimate pH-dependent fate parameters. An uncertainty analysis was performed by means of Monte Carlo analysis, which included the uncertainty of fate and ecotoxicity model input variables, as well as the spatial variability of landscape characteristics on the European continental scale. Several pharmaceutical compounds were identified as being of greatest concern, including 7 analgesics/anti-inflammatories, 3 β-blockers, 3 psychiatric drugs, and 1 each of 6 other therapeutic classes. The fate and impact modelling relied extensively on estimated data, given that most of these compounds have little or no experimental fate or ecotoxicity data available, as well as a limited reported occurrence in effluents. The contribution of estimated model input variables to the variance of freshwater ecotoxicity impact, as well as the lack of experimental abiotic degradation data for most compounds, helped in establishing priorities for further testing. Generally, the effluent concentration and the ecotoxicity effect factor were the model input variables with the most significant effect on the uncertainty of output results.
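
    The Monte Carlo step described above can be sketched as follows, assuming a toy impact expression in place of the paper's multimedia fate model; the lognormal distributions and fixed dilution factor are hypothetical placeholders.

```python
# Illustrative Monte Carlo uncertainty propagation: sample the uncertain
# inputs (effluent concentration, ecotoxicity effect factor), push each
# draw through a toy impact expression, and summarize the resulting
# output distribution. All distributions and factors are hypothetical.
import random
import statistics

random.seed(42)

def sample_impact():
    # Lognormal effluent concentration (ug/L) and effect factor (PAF per ug/L):
    conc = random.lognormvariate(mu=0.0, sigma=1.0)
    effect_factor = random.lognormvariate(mu=-2.0, sigma=0.5)
    dilution = 10.0  # fixed toy dilution from effluent to receiving water
    return conc / dilution * effect_factor

draws = sorted(sample_impact() for _ in range(10_000))
print("median impact:", statistics.median(draws))
print("95th percentile:", draws[int(0.95 * len(draws))])
```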

  20. Uncertainty of modelled urban peak O3 concentrations and its sensitivity to input data perturbations based on the Monte Carlo analysis

    NASA Astrophysics Data System (ADS)

    Pineda Rojas, Andrea L.; Venegas, Laura E.; Mazzeo, Nicolás A.

    2016-09-01

    A simple urban air quality model [MODelo de Dispersión Atmosférica Urbana - Generic Reaction Set (DAUMOD-GRS)] was recently developed. One-hour peak O3 concentrations in the Metropolitan Area of Buenos Aires (MABA) during the summer estimated with the DAUMOD-GRS model have shown values lower than 20 ppb (the regional background concentration) in the urban area and levels greater than 40 ppb in its surroundings. Due to the lack of measurements outside the MABA, these relatively high ozone modelled concentrations constitute the only estimate for the area. In this work, a methodology based on the Monte Carlo analysis is implemented to evaluate the uncertainty in these modelled concentrations associated with possible errors in the model input data. Results show that the larger 1-h peak O3 levels in the MABA during the summer present larger uncertainties (up to 47 ppb). On the other hand, multiple linear regression analysis is applied at selected receptors in order to identify the variables explaining most of the obtained variance. Although their relative contributions vary spatially, the uncertainty of the regional background O3 concentration dominates at all the analysed receptors (34.4-97.6%), indicating that its estimation could be improved to enhance the ability of the model to simulate peak O3 concentrations in the MABA.
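
    The variance-attribution step described above (a Monte Carlo run followed by multiple linear regression of the output on the perturbed inputs) can be sketched as below; the linear response and input spreads are hypothetical stand-ins, not DAUMOD-GRS.

```python
# After a Monte Carlo run, regress the model output on the perturbed
# inputs; squared standardized regression coefficients apportion the
# explained output variance among the inputs. The toy response and input
# distributions below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Perturbed inputs: background O3 (ppb) and two emission scaling factors.
background = rng.normal(20.0, 5.0, n)
nox = rng.normal(1.0, 0.2, n)
voc = rng.normal(1.0, 0.1, n)

# Toy peak-O3 response (ppb) with a small unexplained residual:
peak_o3 = background + 15.0 * voc - 5.0 * nox + rng.normal(0.0, 1.0, n)

# Standardize, fit by least squares, then convert coefficients to shares.
X = np.column_stack([background, nox, voc])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (peak_o3 - peak_o3.mean()) / peak_o3.std()
beta, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
shares = beta**2 / np.sum(beta**2)
for name, s in zip(["background", "NOx", "VOC"], shares):
    print(f"{name}: {100 * s:.1f}% of explained variance")
```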

  1. 26. VIEW OF CUT AWAY FLOOR BUILDING 23 2ND FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    26. VIEW OF CUT AWAY FLOOR BUILDING 23 2ND FLOOR SHOWING TYPICAL MILL CONSTRUCTION (SECTION OF FLOOR CONTAMINATED WITH HAZARDOUS MATERIAL WAS REMOVED FOR DISPOSAL) - Bryant Electric Company, 1421 State Street, Bridgeport, Fairfield County, CT

  2. Ploughing the deep sea floor.

    PubMed

    Puig, Pere; Canals, Miquel; Company, Joan B; Martín, Jacobo; Amblas, David; Lastras, Galderic; Palanques, Albert

    2012-09-13

    Bottom trawling is a non-selective commercial fishing technique whereby heavy nets and gear are pulled along the sea floor. The direct impact of this technique on fish populations and benthic communities has received much attention, but trawling can also modify the physical properties of seafloor sediments, water–sediment chemical exchanges and sediment fluxes. Most of the studies addressing the physical disturbances of trawl gear on the seabed have been undertaken in coastal and shelf environments, however, where the capacity of trawling to modify the seafloor morphology coexists with high-energy natural processes driving sediment erosion, transport and deposition. Here we show that on upper continental slopes, the reworking of the deep sea floor by trawling gradually modifies the shape of the submarine landscape over large spatial scales. We found that trawling-induced sediment displacement and removal from fishing grounds causes the morphology of the deep sea floor to become smoother over time, reducing its original complexity as shown by high-resolution seafloor relief maps. Our results suggest that in recent decades, following the industrialization of fishing fleets, bottom trawling has become an important driver of deep seascape evolution. Given the global dimension of this type of fishery, we anticipate that the morphology of the upper continental slope in many parts of the world’s oceans could be altered by intensive bottom trawling, producing comparable effects on the deep sea floor to those generated by agricultural ploughing on land. PMID:22951970

  3. Flooring for Schools: Unsightly Walkways

    ERIC Educational Resources Information Center

    Baxter, Mark

    2011-01-01

    Many mattress manufacturers recommend that consumers rotate their mattresses at least twice a year to help prevent soft spots from developing and increase the product's life span. It's unfortunate that the same kind of treatment can't be applied to flooring for schools, such as carpeting, especially in hallways. Being able to flip or turn a carpet…

  5. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  6. On the Sensitivity of Massive Star Nucleosynthesis and Evolution to Solar Abundances and to Uncertainties in Helium-Burning Reaction Rates

    NASA Astrophysics Data System (ADS)

    Tur, Clarisse; Heger, Alexander; Austin, Sam M.

    2007-12-01

    We explore the dependence of presupernova evolution and supernova nucleosynthesis yields on the uncertainties in helium-burning reaction rates. Using the revised solar abundances of Lodders for the initial stellar composition, instead of those of Anders and Grevesse, changes the supernova yields and limits the constraints that those yields place on the 12C(α,γ)16O reaction rate. The production factors of medium-weight elements (A=16-40) were found to be in reasonable agreement with observed solar ratios within the current experimental uncertainties in the triple-α reaction rate. Simultaneous variations by the same amount in both reaction rates or in either of them separately, however, can induce significant changes in the central 12C abundance at core carbon ignition and in the mass of the supernova remnant. It therefore remains important to have experimental determinations of the helium-burning rates so that their ratio and absolute values are known with an accuracy of 10% or better.

  7. Raise the Floor When Remodeling Science Labs

    ERIC Educational Resources Information Center

    Nation's Schools, 1972

    1972-01-01

    A new remodeling idea adopts the concept of raised floor covering gas, water, electrical, and drain lines. The accessible floor has removable panels set into an adjustable support frame 24 inches above a concrete subfloor. (Author)

  8. Sea-Floor Spreading and Transform Faults

    ERIC Educational Resources Information Center

    Armstrong, Ronald E.; And Others

    1978-01-01

    Presents the Crustal Evolution Education Project (CEEP) instructional module on Sea-Floor Spreading and Transform Faults. The module includes activities and materials required, procedures, summary questions, and extension ideas for teaching Sea-Floor Spreading. (SL)

  9. 49 CFR 38.59 - Floor surfaces.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 1 2012-10-01 2012-10-01 false Floor surfaces. 38.59 Section 38.59 Transportation Office of the Secretary of Transportation AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY SPECIFICATIONS FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 38.59 Floor surfaces. Floor surfaces on aisles, places for standees, and...

  10. 14 CFR 25.793 - Floor surfaces.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Floor surfaces. 25.793 Section 25.793 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Design and Construction Personnel and Cargo Accommodations § 25.793 Floor surfaces. The floor surface of...

  11. 49 CFR 38.59 - Floor surfaces.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 1 2013-10-01 2013-10-01 false Floor surfaces. 38.59 Section 38.59 Transportation Office of the Secretary of Transportation AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY SPECIFICATIONS FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 38.59 Floor surfaces. Floor surfaces on aisles, places for standees, and...

  12. 49 CFR 38.59 - Floor surfaces.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 1 2014-10-01 2014-10-01 false Floor surfaces. 38.59 Section 38.59 Transportation Office of the Secretary of Transportation AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY SPECIFICATIONS FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 38.59 Floor surfaces. Floor surfaces on aisles, places for standees, and...

  13. 7 CFR 2902.39 - Floor strippers.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Floor strippers. (a) Definition. Products that are formulated to loosen waxes, resins, or varnishes from floor surfaces. They can be in either liquid or gel form, and may also be used with or without... 7 Agriculture 15 2010-01-01 2010-01-01 false Floor strippers. 2902.39 Section 2902.39...

  14. 7 CFR 993.107 - Floor inspection.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Floor inspection. 993.107 Section 993.107 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Administrative Rules and Regulations Definitions § 993.107 Floor inspection. Floor inspection means inspection...

  15. 7 CFR 993.505 - Floor inspection.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Floor inspection. 993.505 Section 993.505 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Pack Specification as to Size Definitions § 993.505 Floor inspection. Floor inspection means...

  16. 49 CFR 38.59 - Floor surfaces.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 1 2011-10-01 2011-10-01 false Floor surfaces. 38.59 Section 38.59 Transportation Office of the Secretary of Transportation AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY SPECIFICATIONS FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 38.59 Floor surfaces. Floor...

  17. 7 CFR 993.505 - Floor inspection.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Floor inspection. 993.505 Section 993.505 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Pack Specification as to Size Definitions § 993.505 Floor inspection. Floor inspection means...

  18. 14 CFR 25.793 - Floor surfaces.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Floor surfaces. 25.793 Section 25.793 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS... Floor surfaces. The floor surface of all areas which are likely to become wet in service must have...

  19. 7 CFR 993.107 - Floor inspection.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 8 2011-01-01 2011-01-01 false Floor inspection. 993.107 Section 993.107 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Administrative Rules and Regulations Definitions § 993.107 Floor inspection. Floor inspection means inspection...

  20. 36 CFR 1192.59 - Floor surfaces.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Floor surfaces. 1192.59 Section 1192.59 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE... Rail Vehicles and Systems § 1192.59 Floor surfaces. Floor surfaces on aisles, places for standees,...

  1. 14 CFR 25.793 - Floor surfaces.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Floor surfaces. 25.793 Section 25.793 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF TRANSPORTATION AIRCRAFT AIRWORTHINESS... Floor surfaces. The floor surface of all areas which are likely to become wet in service must have...

  2. 36 CFR 1192.59 - Floor surfaces.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 36 Parks, Forests, and Public Property 3 2010-07-01 2010-07-01 false Floor surfaces. 1192.59 Section 1192.59 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE... Rail Vehicles and Systems § 1192.59 Floor surfaces. Floor surfaces on aisles, places for standees,...

  3. 49 CFR 38.59 - Floor surfaces.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false Floor surfaces. 38.59 Section 38.59 Transportation Office of the Secretary of Transportation AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY SPECIFICATIONS FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 38.59 Floor surfaces. Floor...

  4. Credible Computations: Standard and Uncertainty

    NASA Technical Reports Server (NTRS)

    Mehta, Unmeel B.; VanDalsem, William (Technical Monitor)

    1995-01-01

    The discipline of computational fluid dynamics (CFD) is at a crossroad. Most of the significant advances related to computational methods have taken place. The emphasis is now shifting from methods to results. Significant efforts are made in applying CFD to solve design problems. The value of CFD results in design depends on the credibility of computed results for the intended use. The process of establishing credibility requires a standard so that there is a consistency and uniformity in this process and in the interpretation of its outcome. The key element for establishing the credibility is the quantification of uncertainty. This paper presents salient features of a proposed standard and a procedure for determining the uncertainty. A customer of CFD products - computer codes and computed results - expects the following: A computer code in terms of its logic, numerics, and fluid dynamics and the results generated by this code are in compliance with specified requirements. This expectation is fulfilled through verification and validation of these requirements. The verification process assesses whether the problem is solved correctly and the validation process determines whether the right problem is solved. Standards for these processes are recommended. There is always some uncertainty, even if one uses validated models and verified computed results. The value of this uncertainty is important in the design process. This value is obtained by conducting a sensitivity-uncertainty analysis. Sensitivity analysis is generally defined as the procedure for determining the sensitivities of output parameters to input parameters. This analysis is a necessary step in the uncertainty analysis, and the results of this analysis highlight which computed quantities and integrated quantities in computations need to be determined accurately and which quantities do not require such attention. Uncertainty analysis is generally defined as the analysis of the effect of the uncertainties
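
    The sensitivity analysis defined above can be illustrated with central finite differences on a toy function standing in for a CFD code; the function, its inputs, and the analytic checks are all hypothetical.

```python
# Normalized sensitivities d(ln f)/d(ln x_i) of a computed output to its
# input parameters, estimated by central finite differences. The "model"
# is an arbitrary toy stand-in for a CFD-computed quantity.
def model(mach, alpha):
    """Hypothetical stand-in for a CFD-computed lift coefficient."""
    return 0.1 * alpha * (1.0 + 0.5 * mach**2)

def sensitivity(f, args, index, rel_step=1e-6):
    """Normalized sensitivity d(ln f)/d(ln x_i) via central differences."""
    x = list(args)
    h = x[index] * rel_step
    x[index] += h
    up = f(*x)
    x[index] -= 2 * h
    down = f(*x)
    return (up - down) / (2 * h) * args[index] / f(*args)

base = (0.8, 2.0)  # Mach 0.8, 2 degrees angle of attack
print("d(ln f)/d(ln Mach): ", sensitivity(model, base, 0))
print("d(ln f)/d(ln alpha):", sensitivity(model, base, 1))
```

For this toy model the values can be checked analytically: the alpha sensitivity is exactly 1 (linear dependence) and the Mach sensitivity is M^2 / (1 + 0.5 M^2).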

  5. DAKOTA, A Multilevel Parallel Object-Oriented Framework for Design Optimization, Parameter Estimation, Uncertainty Quantification, and Sensitivity Analysis Version 3.0 Developers Manual (title change from electronic posting)

    SciTech Connect

    ELDRED, MICHAEL S.; GIUNTA, ANTHONY A.; VAN BLOEMEN WAANDERS, BART G.; WOJTKIEWICZ JR., STEVEN F.; HART, WILLIAM E.; ALLEVA, MARIO

    2002-04-01

    The DAKOTA (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. DAKOTA contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, analytic reliability, and stochastic finite element methods; parameter estimation with nonlinear least squares methods; and sensitivity analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the DAKOTA toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a developers manual for the DAKOTA software and describes the DAKOTA class hierarchies and their interrelationships. It derives directly from annotation of the actual source code and provides detailed class documentation, including all member functions and attributes.

  6. [Functional aspects of pelvic floor surgery].

    PubMed

    Wagenlehner, F M E; Gunnemann, A; Liedl, B; Weidner, W

    2009-11-01

    Pelvic floor dysfunctions are frequently seen in females. The human pelvic floor is a complex structure and heavily stressed throughout female life. Recent findings in the functional anatomy of the pelvic floor have led to a much better understanding, on the basis of which enormous improvements in the therapeutic options have arisen. The pelvic floor activity is regulated by three main muscular forces that are responsible for vaginal tension and suspension of the pelvic floor organs, bladder and rectum. For different reasons laxity in the vagina or its supporting ligaments as a result of altered connective tissue can distort this functional anatomy. A variety of symptoms can derive from these pelvic floor dysfunctions, such as urinary urge and stress incontinence, abnormal bladder emptying, faecal incontinence, obstructive bowel disease syndrome and pelvic pain. Pelvic floor reconstruction is nowadays driven by the concept that in the case of pelvic floor symptoms restoration of the anatomy will translate into restoration of the physiology and ultimately improve the patients' symptoms. The exact surgical reconstruction of the anatomy is therefore almost exclusively focused on the restoration of the lax pelvic floor ligaments. An exact identification of the anatomic lesions preoperatively is eminently necessary, to allow for an exact anatomic reconstruction with respect to the muscular forces of the pelvic floor.

  7. Scaling on a limestone flooring

    NASA Astrophysics Data System (ADS)

    Carmona-Quiroga, P. M.; Blanco-Varela, M. T.; Martínez-Ramírez, S.

    2012-04-01

    Natural stone can be used on nearly every surface, inside and outside buildings, but decay is more commonly reported for stone exposed to aggressive outdoor conditions. This study, by contrast, is an example of limestone weathering of uncertain origin in the interior of a residential building. The stone, used as flooring, began to exhibit loss of material in the form of scaling. The damage was observed before the building, located in the south of Spain (Málaga), was inhabited. Moreover, according to the supplier the limestone satisfies the European standards for flooring UNE-EN 1341:2002, UNE-EN 1343:2003 and UNE-EN 12058:2004. Under these circumstances the main objective of this study was to assess the causes of the phenomenon. To this end the composition of the mortar was determined and the stone was characterized from a mineralogical and petrological point of view. The stone, a fossiliferous limestone from Egypt with natural fissure lines, is mainly composed of calcite, with quartz, kaolinite and apatite as minor phases. Samples of the weathered tiles, taken directly from the building, and of unweathered limestone tiles were examined with different spectroscopic and microscopic techniques (FTIR, micro-Raman, SEM-EDX, etc.), and a new mineralogical phase, trona, was identified in the scaled areas, which are connected with the natural veins of the stone. Indeed, BSE mapping detected the presence of sodium in these veins. This soluble sodium carbonate would have been dissolved in the natural waters from which the limestone was precipitated; it would then have migrated with the ascending capillary humidity and crystallized near the surface of the stone, initiating the scaling phenomenon, which in historic masonry could be very damaging. The weathering of the limestone is therefore related to the hygroscopic behaviour of this salt, and not to the construction methods used. This also makes the limestone unsuitable for use in restoration.

  8. Remote sensing of cirrus cloud optical thickness and effective particle size for the National Polar-orbiting Operational Environmental Satellite System Visible/Infrared Imager Radiometer Suite: sensitivity to instrument noise and uncertainties in environmental parameters.

    PubMed

    Ou, Szu-Cheng; Takano, Yoshihide; Liou, K N; Higgins, Glenn J; George, Adrian; Slonaker, Richard

    2003-12-20

    We describe sensitivity studies on the remote sensing of cirrus cloud optical thickness and effective particle size using the National Polar-orbiting Operational Environmental Satellite System Visible/Infrared Imager Radiometer Suite 0.67-, 1.24-, 1.61-, and 2.25-microm reflectances and thermal IR 3.70- and 10.76-microm radiances. To investigate the accuracy and precision of the solar and IR retrieval methods subject to instrument noise and uncertainties in environmental parameters, we carried out signal-to-noise ratio tests as well as an error budget study, in which we used the University of California at Los Angeles line-by-line equivalent radiative transfer model to generate radiance tables for synthetic retrievals. The methodology and results of these error analyses are discussed. PMID:14717300

  9. Sensitivity analysis for critical control points determination and uncertainty analysis to link FSO and process criteria: application to Listeria monocytogenes in soft cheese made from pasteurized milk.

    PubMed

    Lamboni, Matieyendou; Sanaa, Moez; Tenenhaus-Aziza, Fanny

    2014-04-01

    Microbiological food safety is an important economic and health issue in the context of globalization and presents food business operators with new challenges in providing safe foods. The hazard analysis and critical control point approach involves identifying the main steps in food processing and the physical and chemical parameters that have an impact on the safety of foods. In the risk-based approach, as defined in the Codex Alimentarius, controlling these parameters in such a way that the final products meet a food safety objective (FSO), fixed by the competent authorities, is a big challenge and of great interest to food business operators. Process risk models, derived from the quantitative microbiological risk assessment framework, provide useful tools in this respect. We propose a methodology, called multivariate factor mapping (MFM), for establishing a link between process parameters and compliance with a FSO. For a stochastic and dynamic process risk model of Listeria monocytogenes in soft cheese made from pasteurized milk with many uncertain inputs, multivariate sensitivity analysis and MFM are combined to (i) identify the critical control points (CCPs) for L. monocytogenes throughout the food chain and (ii) compute the critical limits of the most influential process parameters, located at the CCPs, with regard to the specific process implemented in the model. Due to certain forms of interaction among parameters, the results show some new possibilities for the management of microbiological hazards when a FSO is specified. PMID:24168722
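    The general idea of mapping compliant input regions back to critical limits can be illustrated with a generic Monte Carlo sketch. Everything below is hypothetical: the toy growth model, the parameter ranges and the FSO value are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical uncertain process parameters (illustrative ranges only):
# storage temperature (deg C) and ripening time (days).
temp = rng.uniform(2.0, 12.0, n)
days = rng.uniform(10.0, 60.0, n)

# Toy log-growth model: final log10 CFU/g rises with temperature and time.
log_n0 = -1.0
final = log_n0 + 0.012 * (temp - 2.0) * days

fso = 2.0  # hypothetical food safety objective, log10 CFU/g at consumption
comply = final <= fso

# Factor mapping: compare input distributions of complying vs failing runs.
for name, x in [("temp", temp), ("days", days)]:
    print(f"{name}: comply mean={x[comply].mean():.1f}, fail mean={x[~comply].mean():.1f}")

# A candidate critical limit for temperature can be read off as a high
# quantile of the complying sub-sample.
print(f"temp critical limit ~ {np.quantile(temp[comply], 0.95):.1f} C")
```

    In a real process risk model the binary split above would be replaced by the stochastic model's compliance probability, and the mapping done jointly over all influential parameters.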

  11. Crash Tests of Protective Airplane Floors

    NASA Technical Reports Server (NTRS)

    Carden, H. D.

    1986-01-01

    Energy-absorbing floors reduce structural buckling and impact forces on occupants. 56-page report discusses crash tests of energy-absorbing aircraft floors. Describes test facility and procedures; airplanes, structural modifications, and seats; crash dynamics; floor and seat behavior; and responses of anthropometric dummies seated in airplanes. Also presents plots of accelerations, photographs and diagrams of test facility, and photographs and drawings of airplanes before, during, and after testing.

  12. Pressure Sensitive Paints

    NASA Technical Reports Server (NTRS)

    Liu, Tianshu; Bencic, T.; Sullivan, J. P.

    1999-01-01

    This article reviews new advances and applications of pressure sensitive paints in aerodynamic testing. Emphasis is placed on important technical aspects of pressure sensitive paint including instrumentation, data processing, and uncertainty analysis.

  13. Pelvic floor muscle rehabilitation using biofeedback.

    PubMed

    Newman, Diane K

    2014-01-01

    Pelvic floor muscle exercises have been recommended for urinary incontinence since first described by obstetrician gynecologist Dr. Arnold Kegel more than six decades ago. These exercises are performed to strengthen pelvic floor muscles, provide urethral support to prevent urine leakage, and suppress urgency. In clinical urology practice, expert clinicians also teach patients how to relax the muscle to improve bladder emptying and relieve pelvic pain caused by muscle spasm. When treating lower urinary tract symptoms, an exercise training program combined with biofeedback therapy has been recommended as first-line treatment. This article provides clinical application of pelvic floor muscle rehabilitation using biofeedback as a technique to enhance pelvic floor muscle training.

  14. Side Elevation; 1/4 Plans of Floor Framing, Floor Planking, Roof ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Side Elevation; 1/4 Plans of Floor Framing, Floor Planking, Roof Framing and Roof; Longitudinal Section, Cross Section, End Elevation - Eames Covered Bridge, Spanning Henderson Creek, Oquawka, Henderson County, IL

  15. 17. 4th floor roof, view south, 4th and 5th floor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    17. 4th floor roof, view south, 4th and 5th floor setback to left and atrium structure to right - Sheffield Farms Milk Plant, 1075 Webster Avenue (southwest corner of 166th Street), Bronx, Bronx County, NY

  16. Eastern Floor of Holden Crater

    NASA Technical Reports Server (NTRS)

    2002-01-01

    (Released 15 April 2002) The Science: Today's THEMIS image covers territory on the eastern floor of Holden Crater, which is located in a region of the southern hemisphere called Noachis Terra. Holden Crater is 154 km in diameter and named after American astronomer Edward Holden (1846-1914). This image shows a mottled surface with channels, hills, ridges and impact craters. The largest crater seen in this image is 5 km in diameter. This crater has gullies and what appear to be horizontal layers in its walls. The Story: With its beautiful symmetry and gullies radially streaming down to the floor, the dominant crater in this image is an impressive focal point. Yet, it is really just a small crater within a much larger one named Holden Crater. Take a look at the context image to the right to see just how much bigger Holden Crater is. Then come back to the image strip that shows the mottled surface of Holden Crater's eastern floor in greater detail, and count how many hills, ridges, channels, and small impact craters can be seen. No perfectly smooth terrain abounds there, that's for sure. The textured terrain of Holden Crater has been particularly intriguing ever since the Mars Orbital Camera on the Mars Global Surveyor spacecraft found evidence of sedimentary rock layers there that might have formed in lakes or shallow seas in Mars' ancient past. This finding suggests that Mars may have been more like Earth long ago, with water on its surface. Holden Crater might even have held a lake long ago. No one knows for sure, but it's an exciting possibility. Why? If water was once on the surface of Mars long enough to form sedimentary materials, maybe it was there long enough for microbial life to have developed too. (Life as we know it just isn't possible without the long-term presence of liquid water.) The question of life on the red planet is certainly tantalizing, but scientists will need to engage in a huge amount of further investigation to begin to know the answer.

  17. 16. THIRD FLOOR BLDG. 28A, DETAIL CUTOUT IN FLOOR FOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    16. THIRD FLOOR BLDG. 28A, DETAIL CUTOUT IN FLOOR FOR WOOD BLOCK FLOORING LOOKING EAST. - Fafnir Bearing Plant, Bounded on North side by Myrtle Street, on South side by Orange Street, on East side by Booth Street & on West side by Grove Street, New Britain, Hartford County, CT

  18. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    NASA Technical Reports Server (NTRS)

    Novack, Steven D.; Rogers, Jim; Al Hassan, Mohammad; Hark, Frank

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk, and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods, such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty, are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper describes how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
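    The kind of underestimation described here can be reproduced with a toy two-component example. The sketch below illustrates the general effect only; it is not NASA's PRA model, and the lognormal parameters are assumed. When two basic events share the same state-of-knowledge (epistemic) distribution, sampling them independently lets high and low draws cancel, artificially narrowing the apparent uncertainty of the top event.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical epistemic uncertainty on a component failure probability:
# lognormal with median 1e-3 and an error factor of ~3.
mu, sigma = np.log(1e-3), np.log(3.0) / 1.645

# Two identical redundant components (AND gate: both must fail).
# Coupled: one shared state of knowledge, so sample p once per trial.
p_shared = np.exp(rng.normal(mu, sigma, n))
top_coupled = p_shared ** 2

# Naive: sample each component's p independently, which narrows the
# epistemic spread of the top event.
p1 = np.exp(rng.normal(mu, sigma, n))
p2 = np.exp(rng.normal(mu, sigma, n))
top_indep = p1 * p2

for label, t in [("coupled", top_coupled), ("independent", top_indep)]:
    lo, hi = np.quantile(t, [0.05, 0.95])
    print(f"{label}: 5th={lo:.2e} 95th={hi:.2e} ratio={hi/lo:.1f}")
```

    The coupled 5th-95th percentile band comes out substantially wider than the independently-sampled one, which is the direction of bias the abstract attributes to standard Monte Carlo propagation.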

  20. Uncertainties in climate stabilization

    SciTech Connect

    Wigley, T. M.; Clarke, Leon E.; Edmonds, James A.; Jacoby, H. D.; Paltsev, S.; Pitcher, Hugh M.; Reilly, J. M.; Richels, Richard G.; Sarofim, M. C.; Smith, Steven J.

    2009-11-01

    We explore the atmospheric composition, temperature and sea level implications of new reference and cost-optimized stabilization emissions scenarios produced using three different Integrated Assessment (IA) models for U.S. Climate Change Science Program (CCSP) Synthesis and Assessment Product 2.1a. We also consider an extension of one of these sets of scenarios out to 2300. Stabilization is defined in terms of radiative forcing targets for the sum of gases potentially controlled under the Kyoto Protocol. For the most stringent stabilization case (“Level 1” with CO2 concentration stabilizing at about 450 ppm), peak CO2 emissions occur close to today, implying a need for immediate CO2 emissions abatement if we wish to stabilize at this level. In the extended reference case, CO2 stabilizes at 1000 ppm in 2200 – but even to achieve this target requires large and rapid CO2 emissions reductions over the 22nd century. Future temperature changes for the Level 1 stabilization case show considerable uncertainty even when a common set of climate model parameters is used (a result of different assumptions for non-Kyoto gases). Uncertainties are about a factor of three when climate sensitivity uncertainties are accounted for. We estimate the probability that warming from pre-industrial times will be less than 2°C to be about 50%. For one of the IA models, warming in the Level 1 case is greater out to 2050 than in the reference case, due to the effect of decreasing SO2 emissions that occur as a side effect of the policy-driven reduction in CO2 emissions. Sea level rise uncertainties for the Level 1 case are very large, with increases ranging from 12 to 100 cm over 2000 to 2300.

  1. The Relationship of Cultural Similarity, Communication Effectiveness and Uncertainty Reduction.

    ERIC Educational Resources Information Center

    Koester, Jolene; Olebe, Margaret

    To investigate the relationship of cultural similarity/dissimilarity, communication effectiveness, and communication variables associated with uncertainty reduction theory, a study examined two groups of students--a multinational group living on an "international floor" in a dormitory at a state university and an unrelated group of U.S. students…

  2. [Functional rehabilitation of the pelvic floor].

    PubMed

    Minschaert, M

    2003-09-01

    Pelvic floor rehabilitation aims to preserve perineal functions such as pelvic statics, urinary continence and sexual harmony. The treatment includes preventive and curative actions and is based on the muscular and neuromuscular properties of the pelvic floor. The different steps are: information, local muscular work, behavioural education, biofeedback, functional electrostimulation and intra-abdominal pressure control. Treatment is continued only if clinical improvement is demonstrated after 10 sessions.

  3. 9 CFR 91.26 - Concrete flooring.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 1 2014-01-01 2014-01-01 false Concrete flooring. 91.26 Section 91.26... LIVESTOCK FOR EXPORTATION Inspection of Vessels and Accommodations § 91.26 Concrete flooring. (a) Pens aboard an ocean vessel shall have a 3 inch concrete pavement, proportioned and mixed to give 2000...

  4. 9 CFR 91.26 - Concrete flooring.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 1 2010-01-01 2010-01-01 false Concrete flooring. 91.26 Section 91.26... LIVESTOCK FOR EXPORTATION Inspection of Vessels and Accommodations § 91.26 Concrete flooring. (a) Pens aboard an ocean vessel shall have a 3 inch concrete pavement, proportioned and mixed to give 2000...

  5. Floor Time: Rethinking Play in the Classroom

    ERIC Educational Resources Information Center

    Kordt-Thomas, Chad; Lee, Ilene M.

    2006-01-01

    Floor time is a play-based, one-to-one approach to helping children develop relationships, language, and thinking. Developed by child psychiatrist Stanley Greenspan, floor time is helpful not only for children with special needs but also for children who are developing typically. It can be used by teachers, caregivers, and families in brief…

  6. Learning4Life on the Exhibit Floor

    ERIC Educational Resources Information Center

    Sullivan, Margaret

    2009-01-01

    The exhibit floor is a wealth of knowledge. One can read, view, and listen to information presented in many formats. Somewhere on the exhibit floor there are experts on every topic, ready and waiting for one's questions. But like any research topic, frequently a structured search is required to find the best answers. This article discusses how to…

  7. 36 CFR 910.60 - Gross floor area.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 36 Parks, Forests, and Public Property 3 2011-07-01 2011-07-01 false Gross floor area. 910.60... DEVELOPMENT AREA Glossary of Terms § 910.60 Gross floor area. Gross floor area is defined in section 1201... of the several floors from the ground floor up of all buildings of a development occurring on a...

  8. The maintenance of uncertainty

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    Table of contents: Introduction; Preliminaries; State-space dynamics; Linearized dynamics of infinitesimal uncertainties; Instantaneous infinitesimal dynamics; Finite-time evolution of infinitesimal uncertainties; Lyapunov exponents and predictability; The Baker's apprentice map; Infinitesimals and predictability; Dimensions; The Grassberger-Procaccia algorithm; Towards a better estimate from Takens' estimators; Space-time-separation diagrams; Intrinsic limits to the analysis of geometry; Takens' theorem; The method of delays; Noise; Prediction, prophecy, and pontification; Introduction; Simulations, models and physics; Ground rules; Data-based models: dynamic reconstructions; Analogue prediction; Local prediction; Global prediction; Accountable forecasts of chaotic systems; Evaluating ensemble forecasts; The annulus; Prophecies; Aids for more reliable nonlinear analysis; Significant results: surrogate data, synthetic data and self-deception; Surrogate data and the bootstrap; Surrogate predictors: Is my model any good?; Hints for the evaluation of new techniques; Avoiding simple straw men; Feasibility tests for the identification of chaos; On detecting "tiny" data sets; Building models consistent with the observations; Cost functions; ι-shadowing: Is my model any good? (reprise); Casting infinitely long shadows (out-of-sample); Distinguishing model error and system sensitivity; Forecast error and model sensitivity; Accountability; Residual predictability; Deterministic or stochastic dynamics?; Using ensembles to distinguish the expectation from the expected; Numerical Weather Prediction; Probabilistic prediction with a deterministic model; The analysis; Constructing and interpreting ensembles; The outlook(s) for today; Conclusion; Summary

  9. Male pelvic floor: history and update.

    PubMed

    Dorey, Grace

    2005-08-01

    Our understanding of the male pelvic floor has evolved over more than 2,000 years. Gradually medical science has sought to dispel ancient myths and untruths. The male pelvic floor has many diverse functions. Importantly, it helps to support the abdominal contents, maintains urinary and fecal continence, and plays a major role in gaining and maintaining penile erection. Weakness of the male pelvic floor muscles may cause urinary and fecal incontinence and erectile dysfunction. Function may be restored in each of these areas by a comprehensive pelvic floor muscle training program. Spasm of the pelvic floor muscles may produce pain and require relaxation techniques. Additional research is needed to add further evidence to our knowledge base.

  10. Quantifying uncertainty in LCA-modelling of waste management systems

    SciTech Connect

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H.

    2012-12-15

    Highlights: • Uncertainty in LCA-modelling of waste management is significant. • Model, scenario and parameter uncertainties contribute. • A sequential procedure for quantifying uncertainty is proposed. • Application of the procedure is illustrated by a case-study. - Abstract: Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
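    Step 3 of such a sequence, the uncertainty contribution analysis, can be sketched for a toy waste-incineration impact model. All input distributions below are illustrative assumptions, not values from the paper, and the squared-correlation approximation to variance contributions is only valid for near-linear models.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Hypothetical waste-LCA inputs (all values illustrative):
# kg CO2-eq per tonne of waste = incineration emissions - energy credit.
ef_incin = rng.normal(415.0, 40.0, n)   # direct emission factor
eff = rng.normal(0.22, 0.03, n)         # net electricity efficiency
ef_elec = rng.normal(500.0, 100.0, n)   # marginal electricity, kg CO2/MWh
lhv = rng.normal(10.0, 1.0, n)          # lower heating value, GJ/t

mwh_per_t = lhv * eff / 3.6             # GJ -> MWh
impact = ef_incin - mwh_per_t * ef_elec

# Contribution of each input to the output variance, approximated by the
# squared correlation between input samples and output samples.
for name, x in [("ef_incin", ef_incin), ("eff", eff),
                ("ef_elec", ef_elec), ("lhv", lhv)]:
    share = np.corrcoef(x, impact)[0, 1] ** 2
    print(f"{name}: ~{100 * share:.0f}% of output variance")
print(f"overall: mean={impact.mean():.0f}, sd={impact.std():.0f} kg CO2-eq/t")
```

    When the shares do not add up to roughly one, interactions or nonlinearities matter and a variance-based method such as Sobol indices would be the more defensible choice.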

  11. Ultrasonic Inspection Of The LTAB Floor

    SciTech Connect

    Thomas, G

    2001-07-31

    The National Ignition Facility's (NIF) floor is damaged by transporter operations. Two basic operations, rotating the wheels in place and traversing the floor numerous times, can cause failure in the grout layer. The floor is composed of a top wear surface (Stonhard) and an osmotic grout layer on top of concrete, Fig. 1. An ultrasonic technique was implemented to assess the condition of the floor as part of a study to determine the damage mechanisms. The study considered damage scenarios and ways to avoid the damage. A possible solution is to install thin steel plates where the transporter traverses the floor. These tests were conducted with a fully loaded transporter that applies up to 1300 psi loads to the floor. A contact ultrasonic technique evaluated the condition of the grout layer in NIF's floor. Figure 1 displays the configuration of the ultrasonic transducer on the floor. We inspected the floor after wheel rotation damage and after wheel traversal damage. Figures 2a and 2b are photographs of the portable ultrasonic system and data acquisition. We acquired ultrasonic signals in a known pristine area and a damaged area to calibrate the inspection. Figure 3 is a plot of the typical ultrasonic response from an undamaged area (black) overlapped with a signal (red) from a damaged area. The damaged-area data were acquired at a location next to a hole in the floor that was caused by the transporter. Five-megahertz pulses are propagated from the transducer through a Plexiglas buffer rod into the floor. The ultrasonic pulse reflects from each discontinuity in the floor: the top surface, the Stonhard-to-grout interface, and the grout-to-concrete interface. We expect to see reflections from each of these interfaces in an undamaged floor. If the grout layer pulverizes, the high-frequency signal cannot traverse the layer and the grout-to-concrete interface signal will decrease or vanish. The more damage to the grout, the weaker this interface signal becomes.
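    The expected echo sequence from such a layered floor follows directly from round-trip travel times. The layer thicknesses and longitudinal sound speeds below are assumed for illustration; they are not the actual NIF floor properties.

```python
# Expected pulse-echo arrival times for a layered floor, assuming
# illustrative thicknesses and longitudinal sound speeds (hypothetical
# values, not measurements from the NIF floor).
layers = [
    ("Stonhard wear surface", 6e-3, 2500.0),   # thickness (m), speed (m/s)
    ("osmotic grout", 10e-3, 3000.0),
]

t = 0.0
for name, thickness, speed in layers:
    t += 2.0 * thickness / speed   # round trip through this layer
    print(f"echo from bottom of {name}: {t * 1e6:.1f} us")

# A 5 MHz pulse resolves these microsecond-scale echoes; in a damaged area
# a pulverized grout layer scatters the pulse, so the final
# (grout-to-concrete) echo weakens or disappears.
```

    In practice the inspection compares echo amplitudes at these expected arrival times against a calibration signal from a known pristine area, as the abstract describes.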

  12. The Floor Projection Maze: A novel behavioral apparatus for presenting visual stimuli to rats

    PubMed Central

    Furtak, Sharon C.; Cho, Christine E.; Kerr, Kristin M.; Barredo, Jennifer L.; Alleyne, Janelle E.; Patterson, Yolanda R.; Burwell, Rebecca D.

    2010-01-01

    There is a long tradition of studying visual learning in rats by presenting stimuli vertically on cards or monitors. The procedures are often labor intensive and the rate of acquisition can be prohibitively low. Available evidence suggests that rats process visual information presented in the lower visual hemifield more effectively than information presented in the upper visual hemifield. We capitalized on these findings by developing a novel apparatus, the Floor Projection Maze, for presenting visual information directly to the floor of an exploratory maze. Two-dimensional (2D) visual stimuli were presented on the floor by back-projecting an image from a standard digital projector to the semi-transparent underside of the floor of an open maze. Long-Evans rats rapidly acquired easy 2D visual discriminations (Experiment 1). Rats were also able to learn a more difficult shape discrimination in dramatically fewer trials than previously reported for the same discrimination when presented vertically (Experiment 2). The two choice discrimination task was adapted to determine contrast sensitivity thresholds in a naïve group of rats (Experiment 3). Contrast sensitivity thresholds were uniform across three subjects, demonstrating that the Floor Projection Maze can be used for visual psychophysics in rats. Our findings demonstrate that rats can rapidly acquire visual tasks when stimuli are presented horizontally on the floor, suggesting that this novel behavioral apparatus will provide a powerful behavioral paradigm in the future. PMID:19422855

  13. 41. Ground level photograph of two floors of skeleton complete ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    41. Ground level photograph of two floors of skeleton complete with 3rd and 4th floors being started,upper floors of county bldg visible - Chicago City Hall, 121 North LaSalle Street, Chicago, Cook County, IL

  14. Typical Newel Post, First Floor Newel Post, Typical Baluster, Typical ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Typical Newel Post, First Floor Newel Post, Typical Baluster, Typical Nosing, First Floor Stringer Profile, Second Floor Stringer Profile - National Home for Disabled Volunteer Soldiers - Battle Mountain Sanitarium, Treasurer's Quarters, 500 North Fifth Street, Hot Springs, Fall River County, SD

  15. Analysis of roof and pillar failure associated with weak floor at a limestone mine

    PubMed Central

    Murphy, Michael M.; Ellenberger, John L.; Esterhuizen, Gabriel S.; Miller, Tim

    2016-01-01

    A limestone mine in Ohio has had instability problems that have led to massive roof falls extending to the surface. This study focuses on the role that weak, moisture-sensitive floor has in the instability issues. Previous NIOSH research related to this subject did not include analysis for weak floor or weak bands and recommended that when such issues arise they should be investigated further using a more advanced analysis. Therefore, to further investigate the observed instability occurring on a large scale at the Ohio mine, FLAC3D numerical models were employed to demonstrate the effect that a weak floor has on roof and pillar stability. This case study will provide important information to limestone mine operators regarding the impact of weak floor causing the potential for roof collapse, pillar failure, and subsequent subsidence of the ground surface. PMID:27088041

  16. 3MRA UNCERTAINTY AND SENSITIVITY ANALYSIS

    EPA Science Inventory

    This presentation discusses the Multimedia, Multipathway, Multireceptor Risk Assessment (3MRA) modeling system. The outline of the presentation is: modeling system overview - 3MRA versions; 3MRA version 1.0; national-scale assessment dimensionality; SuperMUSE: windows-based super...

  17. Pelvic floor and sexual male dysfunction.

    PubMed

    Pischedda, Antonella; Fusco, Ferdinando; Curreli, Andrea; Grimaldi, Giovanni; Pirozzi Farina, Furio

    2013-04-19

    The pelvic floor is a complex multifunctional structure that corresponds to the genito-urinary-anal area and consists of muscle and connective tissue. It supports the urinary, fecal, sexual and reproductive functions and pelvic statics. The symptoms caused by pelvic floor dysfunction often affect the quality of life of those who are afflicted, significantly worsening many aspects of daily life. In fact, in addition to providing support to the pelvic organs, the deep floor muscles support urinary continence and intestinal emptying, whereas the superficial floor muscles are involved in the mechanism of erection and ejaculation. Conditions of muscle hypotonia or hypertonicity may therefore affect the efficiency of the pelvic floor, altering the functionality of both the deep and superficial floor muscles. Given this evolving knowledge, it is possible to imagine how rehabilitation techniques for the pelvic floor muscles, when these muscles are altered and contribute to a voiding, evacuative or sexual dysfunction, may have a role in improving health and quality of life.

  18. Dunes in a Crater Floor

    NASA Technical Reports Server (NTRS)

    2003-01-01

    [figure removed for brevity, see original site]

    Released 6 August 2003

    This image shows the floor of a crater just north of the Argyre basin in the southern hemisphere. Dark dunes have been pushed up against the northeastern interior rim of the crater, indicating that the prevailing winds blow from the southwest.

    Image information: VIS instrument. Latitude -35.7, Longitude 324.1 East (35.9 West). 19 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  19. Ultrasound Imaging of the Pelvic Floor.

    PubMed

    Stone, Daniel E; Quiroz, Lieschen H

    2016-03-01

    This article discusses the background and appraisal of endoluminal ultrasound of the pelvic floor. It provides a detailed anatomic assessment of the muscles and surrounding organs of the pelvic floor. Different anatomic variability and pathology, such as prolapse, fecal incontinence, urinary incontinence, vaginal wall cysts, synthetic implanted material, and pelvic pain, are easily assessed with endoluminal vaginal ultrasound. With pelvic organ prolapse in particular, not only is the prolapse itself seen, but the underlying cause related to the anatomic and functional abnormalities of the pelvic floor muscle structures is also visualized.

  20. 9 CFR 91.26 - Concrete flooring.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... composite material diagonally scored one-half inch deep may be used on iron decks instead of wooden flooring... aft with flat side down, and so placed as to provide in-between spaces of 12, 14, 26, and 14...

  1. 9 CFR 91.26 - Concrete flooring.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... composite material diagonally scored one-half inch deep may be used on iron decks instead of wooden flooring... aft with flat side down, and so placed as to provide in-between spaces of 12, 14, 26, and 14...

  2. 9 CFR 91.26 - Concrete flooring.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... composite material diagonally scored one-half inch deep may be used on iron decks instead of wooden flooring... aft with flat side down, and so placed as to provide in-between spaces of 12, 14, 26, and 14...

  3. Magnetic Resonance Imaging (MRI): Dynamic Pelvic Floor

    MedlinePlus

    Dynamic pelvic floor MRI uses a powerful magnetic field, radio frequency pulses and a computer to produce detailed pictures of the pelvic floor and of the surrounding organs and soft tissues.

  4. Modeling uncertainty: quicksand for water temperature modeling

    USGS Publications Warehouse

    Bartholow, John M.

    2003-01-01

    Uncertainty has been a hot topic relative to science generally, and modeling specifically. Modeling uncertainty comes in various forms: measured data, limited model domain, model parameter estimation, model structure, sensitivity to inputs, modelers themselves, and users of the results. This paper will address important components of uncertainty in modeling water temperatures, and discuss several areas that need attention as the modeling community grapples with how to incorporate uncertainty into modeling without getting stuck in the quicksand that prevents constructive contributions to policy making. The material, and in particular the reference, are meant to supplement the presentation given at this conference.

  5. Communication and Uncertainty Management.

    ERIC Educational Resources Information Center

    Brashers, Dale E.

    2001-01-01

    Suggests the fundamental challenge for refining theories of communication and uncertainty is to abandon the assumption that uncertainty will produce anxiety. Outlines and extends a theory of uncertainty management and reviews current theory and research. Concludes that people want to reduce uncertainty because it is threatening, but uncertainty…

  6. Floor Fractured Craters around Syrtis Major, Mars

    NASA Astrophysics Data System (ADS)

    Bamberg, M.; Jaumann, R.; Asche, H.

    2012-04-01

    Craters around Syrtis Major are eroded and/or refilled. Syrtis Major is one of the large Hesperian-aged volcanic regions on Mars. Basaltic deposits originating from nearby Syrtis Major cover the floors of impact craters. In particular, some craters exhibit a fractured floor. Floor fractured craters can be divided into types, which differ in their grade of erosion and in the geologic process that formed the crater. Type 1: Crater floor affected by pit chains or narrow crevices which are sometimes discontinuous. Type 2: Denser, more developed networks of crevices than in type 1. Crevices are wide and deep enough to be detected. A circular moat starts to develop as crevices concentrate along the rim. Type 3: Mainly distinguished from type 2 by the presence of a fully developed circular moat. The flat central part is divided into several blocks by crevices. Type 4: These also show a continuous moat along the rim, but the central part consists of many flat-top blocks and small conical mounds. Type 5: The crater floor has many mounds of irregular sizes, but flat-top blocks are absent. It should be noted that the knobby surface shows typical characteristics of chaotic terrains and could alternatively be classified as such. Type 6: Crater without a circular moat; crevices are not fully developed, and flat-top blocks are present. The fractured floors could have been reshaped by geologic processes. Floor fractured craters are found in three different areas. The first area is located in the south-eastern part of Syrtis Major, bordering the highlands. Volcanic features like lava flow fronts, lava flows and wrinkle ridges dominate this region. The crater floors are separated into sharp-edged plates and the interiors seem to be flooded by basaltic material. The second area is in the north of Syrtis Major and transitions to the chaotic terrain further north. Near the Martian dichotomy boundary, fluvial activity was the decisive process. The crater rims are highly eroded, channels are cutting

  7. [Epidermoid cyst of the mouth floor].

    PubMed

    Sanjuán Rodríguez, S; Morán Penco, J M; Ruiz Orpez, A; Santamaria Ossorio, J I; Berchi García, F J

    2003-07-01

    Epidermoid cysts are frequent during childhood; location in the floor of the mouth, however, is very unusual and makes diagnosis and therapeutic approach more difficult. We present a 5-year-old male, symptom-free until a week before presentation, when his parents noticed a well-defined mass in the floor of the mouth. Physical examination led to the diagnosis of a possible epidermoid cyst. The tumor was excised through an intraoral approach. A review of the different diagnostic means and of surgical management is undertaken.

  8. [Functional rehabilitation of the pelvic floor].

    PubMed

    Minschaert, M

    2003-09-01

    Pelvic floor rehabilitation aims to preserve perineal functions such as pelvic statics, urinary continence and sexual harmony. The therapy includes preventive and curative actions, and is based upon the muscular and neuromuscular properties of the pelvic floor. The different steps are: information, local muscular work, behavioral education, biofeedback, functional electrostimulation, and intra-abdominal pressure control. The therapy is continued only if clinical improvement is demonstrated after 10 sessions. PMID:14606287

  9. Physical therapy for female pelvic floor disorders.

    PubMed

    Bourcier, A P

    1994-08-01

    Non-surgical, non-pharmacological treatment for female pelvic floor dysfunction is represented by rehabilitation in urogynecology. Since Kegel, in 1948, who proposed the concept of functional restoration of the perineal muscles, no specific term has actually been established. Owing to the number of specialists involved in the management of female pelvic floor disorders (such as gynecologists, urologists, coloproctologists, and neurologists) and the different types of health care providers concerned (such as physicians, physical therapists, nurses, and midwives), it is difficult to make the proper choice between 'physical therapy for pelvic floor', 'pelvic floor rehabilitation', 'pelvic muscle re-education', and 'pelvic floor training'. Because muscle re-education is under the control of physical therapists, we have chosen the term of physical therapy for female pelvic floor disorders. Muscle re-education has an important role in the primary treatment of lower urinary tract dysfunction. A multidisciplinary collaboration may be of particular interest, and a thorough evaluation is useful for a proper selection of patients.

  10. ETRA, TRA642. ON BASEMENT FLOOR. IBEAM COLUMNS SUPPORTING CONSOLE FLOOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETRA, TRA-642. ON BASEMENT FLOOR. I-BEAM COLUMNS SUPPORTING CONSOLE FLOOR HAVE BEEN SURROUNDED BY CONCRETE IN RECTANGULAR PILLARS. BASEMENT FLOOR IS BEING PREPARED FOR PLACEMENT OF CONCRETE. ABOVE CEILING IS CONSOLE FLOOR, IN WHICH CUT-OUT HAS PRESERVED SPACE FOR REACTOR AND ITS SHIELDING. CIRCULAR FORM IN REACTOR AREA IS CONCRETE FORMING. NOTE VERTICAL CONDUIT AT INTERVALS AROUND REACTOR PITS. INL NEGATIVE NO. 56-1237. Jack L. Anderson, Photographer, 4/17/1956 - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  11. Coupled semivariogram uncertainty of hydrogeological and geophysical data on capture zone uncertainty analysis

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.

    2008-01-01

    This study investigates capture zone uncertainty as it relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in the structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of the structural parameters. The probability distributions of the structural parameters were further updated through the Bayesian approach with Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and a pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. © 2008 ASCE.
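    The structural-parameter updating described in this abstract can be sketched as a grid-based Bayesian update: a beta prior on the semivariogram range, rescaled to a plausible interval, is combined with a Gaussian likelihood of the empirical semivariogram. The exponential model, the synthetic lag/semivariance data, and all numeric choices below are illustrative assumptions, not the paper's actual configuration.

    ```python
    import numpy as np

    def exp_semivariogram(h, rng, sill=1.0, nugget=0.1):
        """Exponential semivariogram model gamma(h) with practical range rng."""
        return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))

    # Empirical semivariogram (lag, gamma) pairs -- synthetic, for illustration only
    lags = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
    gamma_obs = np.array([0.35, 0.55, 0.78, 0.93, 0.99])

    # Beta(2, 2) prior on the range parameter, rescaled to [100, 1000] m
    grid = np.linspace(100.0, 1000.0, 181)
    u = (grid - 100.0) / 900.0
    prior = u * (1.0 - u)                 # unnormalised Beta(2,2) kernel

    # Gaussian likelihood of the observed semivariogram for each candidate range
    sigma = 0.05
    resid = gamma_obs[None, :] - exp_semivariogram(lags[None, :], grid[:, None])
    loglik = -0.5 * np.sum((resid / sigma) ** 2, axis=1)

    # Posterior over the range parameter (normalised on the grid)
    post = prior * np.exp(loglik - loglik.max())
    post /= post.sum()
    print("posterior-mean range (m):", (grid * post).sum())
    ```

    The same scheme extends to the sill and nugget by replacing the 1-D grid with a joint grid over all three structural parameters.
    
    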

  12. 2. VIEW OF LOWER MILL FLOOR FOUNDATION, SHOWING, LEFT TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    2. VIEW OF LOWER MILL FLOOR FOUNDATION, SHOWING, LEFT TO RIGHT, EDGE OF MILLING FLOOR, TABLE FLOOR, VANNING FLOOR, LOADING LEVEL, TAILINGS POND IN RIGHT BACKGROUND. VIEW IS LOOKING FROM THE NORTHWEST - Mountain King Gold Mine & Mill, 4.3 Air miles Northwest of Copperopolis, Copperopolis, Calaveras County, CA

  13. Indirect orbital floor fractures: a meta-analysis.

    PubMed

    Gonzalez, Mithra O; Durairaj, Vikram D

    2010-04-01

    Orbit fractures are common in the context of orbital trauma. Fractures of the orbital floor without orbital rim involvement are known as indirect orbital floor fractures, pure internal floor fractures, and orbital blowout fractures. In this paper, we have reported a meta-analysis of orbital floor fractures focusing on indications and timing of surgical repair, outcomes, and complications. PMID:20616920

  14. 76 FR 7098 - Dealer Floor Plan Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-09

    ... ADMINISTRATION 13 CFR Parts 120 and 121 Dealer Floor Plan Pilot Program AGENCY: U.S. Small Business... Dealer Floor Plan Pilot Program to make available 7(a) loan guaranties for lines of credit that provide floor plan financing. This new Dealer Floor Plan Pilot Program was created in the Small Business...

  15. Uncertainty as knowledge

    PubMed Central

    Lewandowsky, Stephan; Ballard, Timothy; Pancost, Richard D.

    2015-01-01

    This issue of Philosophical Transactions examines the relationship between scientific uncertainty about climate change and knowledge. Uncertainty is an inherent feature of the climate system. Considerable effort has therefore been devoted to understanding how to effectively respond to a changing, yet uncertain climate. Politicians and the public often appeal to uncertainty as an argument to delay mitigative action. We argue that the appropriate response to uncertainty is exactly the opposite: uncertainty provides an impetus to be concerned about climate change, because greater uncertainty increases the risks associated with climate change. We therefore suggest that uncertainty can be a source of actionable knowledge. We survey the papers in this issue, which address the relationship between uncertainty and knowledge from physical, economic and social perspectives. We also summarize the pervasive psychological effects of uncertainty, some of which may militate against a meaningful response to climate change, and we provide pointers to how those difficulties may be ameliorated. PMID:26460108

  16. Using multiple barometers to detect the floor location of smart phones with built-in barometric sensors for indoor positioning.

    PubMed

    Xia, Hao; Wang, Xiaogang; Qiao, Yanyou; Jian, Jun; Chang, Yuanfei

    2015-03-31

    Following the popularity of smart phones and the development of the mobile Internet, the demands for accurate indoor positioning have grown rapidly in recent years. Previous indoor positioning methods focused on plane locations on a floor and did not provide accurate floor positioning. In this paper, we propose a method that uses multiple barometers as references for the floor positioning of smart phones with built-in barometric sensors. Some related studies used the barometric formula to estimate the altitude of mobile devices and compared the altitude with the heights of the floors in a building to obtain the floor number. These studies assume that the accurate height of each floor is known, which is not always the case. They also did not consider the difference in the barometric-pressure pattern at different floors, which may lead to errors in the altitude computation. Our method does not require knowledge of the accurate heights of buildings and stories. It is robust, less sensitive to factors such as temperature and humidity, and considers the difference in the barometric-pressure change trends at different floors. We performed a series of experiments to validate the effectiveness of this method. The results are encouraging.
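    The baseline approach the authors critique can be sketched in a few lines: convert pressure to altitude with the standard hypsometric form of the barometric formula, then divide by a nominal per-floor height. The reference pressure, temperature, and 3 m floor height below are hypothetical values, and the naive rounding step is exactly the known-floor-height assumption the paper relaxes.

    ```python
    import math

    def barometric_altitude(p_hpa, p0_hpa=1013.25, temp_k=288.15):
        """Hypsometric formula: altitude (m) above the reference pressure p0,
        assuming a uniform air-column temperature."""
        R, g, M = 8.31446, 9.80665, 0.0289644  # gas constant, gravity, molar mass of air
        return (R * temp_k) / (g * M) * math.log(p0_hpa / p_hpa)

    def floor_from_pressure(p_hpa, ground_p_hpa, floor_height_m=3.0):
        """Naive floor index: altitude above ground divided by a nominal
        per-floor height, rounded to the nearest storey."""
        h = barometric_altitude(p_hpa, p0_hpa=ground_p_hpa)
        return round(h / floor_height_m)

    # Near sea level, roughly 1 hPa of pressure drop corresponds to ~8.5 m of ascent
    print(floor_from_pressure(1012.2, 1013.25))  # → 3 (about 9 m above the reference)
    ```

    Small temperature or weather-driven pressure errors shift the computed altitude by metres, which is why the paper instead compares pressure change trends against multiple reference barometers.
    
    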

  17. Comparison of slime-producing coagulase-negative Staphylococcus colonization rates on vinyl and ceramic tile flooring materials.

    PubMed

    Yazgi, H; Uyanik, M H; Ayyildiz, A

    2009-01-01

    This study investigated the colonization of slime-producing coagulase-negative Staphylococcus (CoNS) in 80 patient wards in Turkey (40 vinyl and 40 ceramic tile floors). A total of 480 samples that included 557 CoNS isolates were obtained. Slime production was investigated with the Christensen method and methicillin-susceptibility was tested by the disk-diffusion method. There was a significant difference in the percentage of slime-producing CoNS isolates on vinyl (12.4%) versus ceramic tile flooring (4.4%). From vinyl flooring, the percentage of slime producing methicillin-resistant CoNS (MRCoNS) (8.9%) was significantly higher than for methicillin-sensitive CoNS (MSCoNS) (3.6%), whereas there was no difference from ceramic tile flooring (2.5% MRCoNS versus 1.8% MSCoNS). The most commonly isolated slime-producing CoNS species was S. epidermidis on both types of flooring. It is concluded that vinyl flooring seems to be a more suitable colonization surface for slime-producing CoNS than ceramic tile floors. Further studies are needed to investigate bacterial strains colonized on flooring materials, which are potential pathogens for nosocomial infections. PMID:19589249

  18. Using Multiple Barometers to Detect the Floor Location of Smart Phones with Built-in Barometric Sensors for Indoor Positioning

    PubMed Central

    Xia, Hao; Wang, Xiaogang; Qiao, Yanyou; Jian, Jun; Chang, Yuanfei

    2015-01-01

    Following the popularity of smart phones and the development of the mobile Internet, the demands for accurate indoor positioning have grown rapidly in recent years. Previous indoor positioning methods focused on plane locations on a floor and did not provide accurate floor positioning. In this paper, we propose a method that uses multiple barometers as references for the floor positioning of smart phones with built-in barometric sensors. Some related studies used the barometric formula to estimate the altitude of mobile devices and compared the altitude with the heights of the floors in a building to obtain the floor number. These studies assume that the accurate height of each floor is known, which is not always the case. They also did not consider the difference in the barometric-pressure pattern at different floors, which may lead to errors in the altitude computation. Our method does not require knowledge of the accurate heights of buildings and stories. It is robust, less sensitive to factors such as temperature and humidity, and considers the difference in the barometric-pressure change trends at different floors. We performed a series of experiments to validate the effectiveness of this method. The results are encouraging. PMID:25835189

  19. Using multiple barometers to detect the floor location of smart phones with built-in barometric sensors for indoor positioning.

    PubMed

    Xia, Hao; Wang, Xiaogang; Qiao, Yanyou; Jian, Jun; Chang, Yuanfei

    2015-01-01

    Following the popularity of smart phones and the development of the mobile Internet, the demands for accurate indoor positioning have grown rapidly in recent years. Previous indoor positioning methods focused on plane locations on a floor and did not provide accurate floor positioning. In this paper, we propose a method that uses multiple barometers as references for the floor positioning of smart phones with built-in barometric sensors. Some related studies used the barometric formula to estimate the altitude of mobile devices and compared the altitude with the heights of the floors in a building to obtain the floor number. These studies assume that the accurate height of each floor is known, which is not always the case. They also did not consider the difference in the barometric-pressure pattern at different floors, which may lead to errors in the altitude computation. Our method does not require knowledge of the accurate heights of buildings and stories. It is robust, less sensitive to factors such as temperature and humidity, and considers the difference in the barometric-pressure change trends at different floors. We performed a series of experiments to validate the effectiveness of this method. The results are encouraging. PMID:25835189

  20. Ozone Uncertainties Study Algorithm (OUSA)

    NASA Technical Reports Server (NTRS)

    Bahethi, O. P.

    1982-01-01

    An algorithm to carry out sensitivity, uncertainty, and overall imprecision studies with respect to a set of input parameters for a one-dimensional steady-state ozone photochemistry model is described. The algorithm can be used to evaluate steady-state perturbations due to point-source or distributed injection of H2O, ClX, and NOx, as well as variations in the incident solar flux. The algorithm is operational on the IBM OS/360-91 computer at NASA/Goddard Space Flight Center's Science and Applications Computer Center (SACC).
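    The kind of parameter-sensitivity bookkeeping such an algorithm performs can be illustrated with a one-at-a-time perturbation scheme. The toy "ozone model" and its parameter names below are invented for illustration and bear no relation to OUSA's actual chemistry; only the perturbation pattern is the point.

    ```python
    def toy_steady_state_ozone(params):
        """Stand-in for a 1-D steady-state photochemistry model: returns a
        scalar 'ozone column' from a few input parameters (purely illustrative)."""
        return params["solar_flux"] * params["k_prod"] / params["k_loss"]

    def one_at_a_time_sensitivity(model, base, rel_step=0.01):
        """Normalised sensitivity d(ln y)/d(ln x) of the model output to each
        input, estimated by a one-sided relative perturbation."""
        y0 = model(base)
        sens = {}
        for name in base:
            perturbed = dict(base)
            perturbed[name] = base[name] * (1.0 + rel_step)
            sens[name] = (model(perturbed) - y0) / y0 / rel_step
        return sens

    base = {"k_prod": 2.0e-3, "k_loss": 5.0e-4, "solar_flux": 1.0}
    print(one_at_a_time_sensitivity(toy_steady_state_ozone, base))
    ```

    For this toy model the output is linear in production and flux (sensitivity ≈ 1) and inversely proportional to loss (sensitivity ≈ -1); combining such per-parameter sensitivities with parameter uncertainties gives the overall imprecision estimate the abstract refers to.
    
    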

  1. ETR, TRA642. FLOOR PLAN UNDER BALCONY ON CONSOLE FLOOR. MOTORGENERATOR ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR, TRA-642. FLOOR PLAN UNDER BALCONY ON CONSOLE FLOOR. MOTOR-GENERATOR SETS AND OTHER ELECTRICAL EQUIPMENT. PHILLIPS PETROLEUM COMPANY ETR-D-1781, 7/1960. INL INDEX NO. 532-0642-00-706-020384, REV. 1. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  2. Total pelvic floor ultrasound for pelvic floor defaecatory dysfunction: a pictorial review.

    PubMed

    Hainsworth, Alison J; Solanki, Deepa; Schizas, Alexis M P; Williams, Andrew B

    2015-01-01

    Total pelvic floor ultrasound is used for the dynamic assessment of pelvic floor dysfunction and allows multicompartmental anatomical and functional assessment. Pelvic floor dysfunction includes defaecatory, urinary and sexual dysfunction, pelvic organ prolapse and pain. It is common, increasingly recognized and associated with increasing age and multiparity. Other options for assessment include defaecation proctography and defaecation MRI. Total pelvic floor ultrasound is a cheap, safe imaging tool, which may be performed as a first-line investigation in outpatients. It allows dynamic assessment of the entire pelvic floor, essential for treatment planning for females who often have multiple diagnoses, where treatment should address all aspects of dysfunction to yield optimal results. Transvaginal scanning using a rotating single crystal probe provides sagittal views of bladder neck support anteriorly. Posterior transvaginal ultrasound may reveal rectocoele, enterocoele or intussusception whilst bearing down. The vaginal probe is also used to acquire a 360° cross-sectional image to allow anatomical visualization of the pelvic floor and provides information regarding levator plate integrity and pelvic organ alignment. Dynamic transperineal ultrasound using a conventional curved array probe provides a global view of the anterior, middle and posterior compartments and may show cystocoele, enterocoele, sigmoidocoele or rectocoele. This pictorial review provides an atlas of normal and pathological images required for global pelvic floor assessment in females presenting with defaecatory dysfunction. Total pelvic floor ultrasound may be used with complementary endoanal ultrasound to assess the sphincter complex, but this is beyond the scope of this review. PMID:26388109

  3. ETR ELECTRICAL BUILDING, TRA648. FLOOR PLANS FOR FIRST FLOOR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    ETR ELECTRICAL BUILDING, TRA-648. FLOOR PLANS FOR FIRST FLOOR AND BASEMENT. SECTIONS. KAISER ETR-5528-MTR-648-A-2, 12/1955. INL INDEX NO. 532-0648-00-486-101402, REV. 6. - Idaho National Engineering Laboratory, Test Reactor Area, Materials & Engineering Test Reactors, Scoville, Butte County, ID

  4. 24. FIFTH FLOOR BLDG. 28B, DETAIL WOOD BLOCK FLOORING LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    24. FIFTH FLOOR BLDG. 28B, DETAIL WOOD BLOCK FLOORING LOOKING NORTH. - Fafnir Bearing Plant, Bounded on North side by Myrtle Street, on South side by Orange Street, on East side by Booth Street & on West side by Grove Street, New Britain, Hartford County, CT

  5. 23. FIFTH FLOOR BLDG. 28B, DETAIL WOOD BLOCK FLOORING LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. FIFTH FLOOR BLDG. 28B, DETAIL WOOD BLOCK FLOORING LOOKING WEST. - Fafnir Bearing Plant, Bounded on North side by Myrtle Street, on South side by Orange Street, on East side by Booth Street & on West side by Grove Street, New Britain, Hartford County, CT

  6. LOFT, TAN650. Service building preamp tower, top three floors. Floor ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    LOFT, TAN-650. Service building pre-amp tower, top three floors. Floor plan, cable mazes, duct labyrinth. Borated water tank enclosure on roof. Kaiser engineers 6413-11-STEP/LOFT-650-A-3. Date: October 1964. INEEL index code no. 036-650-00-486-122215 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  7. Evaluation of cage floor systems for production of commercial broilers.

    PubMed

    Akpobome, G O; Fanguy, R C

    1992-02-01

    Flooring materials evaluated consisted of three types of mesh (wire, steel, and plastic), three types of perforated floor (wood, styrofoam, and plastic), and three types of doweling (rigid, rotating, and padded). A solid wood floor with wood shavings litter served as a control. Parameters measured included body weight at 4, 6, and 8 wk and dressed carcass weight. Breast blisters, feather soilage, broken bones, feed consumption, percentage abdominal fat, and mortality rate for each floor type were also evaluated. Birds grown on wire mesh floors experienced a significant reduction in live body weight at 6 and 8 wk of age when compared with all other floor types tested. The remaining experimental floor types were comparable to the litter floor control group when using body weight as the performance criterion. The mesh floors experienced the highest incidence of breast blisters and the padded dowel group experienced the least. Feather soilage was a problem only with the perforated wood and styrofoam floor systems. Abdominal fat did not seem to be related to experimental floor type. The incidence of wing breakage during processing was significantly greater than leg breakage for all floor systems tested. Mortality was only a problem with the birds reared on wire mesh floors. The overall data suggest that a padded dowel floor system can be used to produce cage broilers about 2,500 g in weight without leg or breast damage and that these birds will be equivalent to those currently produced by the industry on a litter floor system.

  8. Simulating the Formation of Lunar Floor-Fracture Craters Using Elastoviscoplastic Relaxation

    NASA Technical Reports Server (NTRS)

    Dombard, A. J.; Gillis, J. J.

    1999-01-01

    summation of the elastic, creep, and plastic strains. In relaxation phenomena in general, the system takes advantage of any means possible to eliminate deviatoric stresses by relaxing away the topography. Previous analyses have only modeled the viscous response. Comparatively, the elastic response in our model can augment the relaxation, to a point. This effect decreases as the elastic response becomes stiffer; indeed, in the limit of infinite elastic Young's modulus (and with no plasticity), the solution converges on the purely viscous solution. Igneous rocks common to the lunar near-surface have Young's moduli in the range of 10-100 GPa. To maximize relaxation, we use a Young's modulus of 10 GPa. (There is negligible sensitivity to the other elastic modulus, Poisson's ratio; we use 0.25.) For the viscous response, we use a flow law for steady-state creep in thoroughly dried Columbia diabase, because the high plagioclase (about 70 vol%) and orthopyroxene (about 17 vol%) content is similar to the composition of the lunar highland crust as described by remote sensing and sample studies: noritic anorthosite. This flow law is highly non-Newtonian, i.e., the viscosity is highly stress dependent. That, and the variability with temperature, stands in strong contrast to previous examinations of lunar floor-fracture crater relaxation. To model discrete, brittle faulting, we assume "Byerlee's rule," a standard geodynamical technique. We implement this "rule" with an angle of internal friction of about 40 deg and a higher-than-normal cohesion of about 3.2 MPa (to approximate the breaking of unfractured rock). The actual behavior of geologic materials is more complex than in our rheological model, so the uncertainties in the plasticity do not represent the state-of-the-art error. Additional information is contained in the original.
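    The "Byerlee's rule" plasticity described above is conventionally implemented as a Mohr-Coulomb yield condition; the form below is the standard statement, not taken verbatim from the paper, with the abstract's quoted values of cohesion c ≈ 3.2 MPa and friction angle φ ≈ 40° limiting the shear stress τ on a plane carrying normal stress σₙ:

    ```latex
    \tau = c + \sigma_n \tan\phi
    ```

    In the relaxation model, material points where the shear stress would exceed this envelope deform plastically instead of elastically, which is how discrete brittle faulting enters the continuum calculation.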

  9. Interoceptive Ability Predicts Survival on a London Trading Floor.

    PubMed

    Kandasamy, Narayanan; Garfinkel, Sarah N; Page, Lionel; Hardy, Ben; Critchley, Hugo D; Gurnell, Mark; Coates, John M

    2016-01-01

    Interoception is the sensing of physiological signals originating inside the body, such as hunger, pain and heart rate. People with greater sensitivity to interoceptive signals, as measured by, for example, tests of heart beat detection, perform better in laboratory studies of risky decision-making. However, there has been little field work to determine if interoceptive sensitivity contributes to success in real-world, high-stakes risk taking. Here, we report on a study in which we quantified heartbeat detection skills in a group of financial traders working on a London trading floor. We found that traders are better able to perceive their own heartbeats than matched controls from the non-trading population. Moreover, the interoceptive ability of traders predicted their relative profitability, and strikingly, how long they survived in the financial markets. Our results suggest that signals from the body - the gut feelings of financial lore - contribute to success in the markets. PMID:27641692

  10. Interoceptive Ability Predicts Survival on a London Trading Floor

    PubMed Central

    Kandasamy, Narayanan; Garfinkel, Sarah N.; Page, Lionel; Hardy, Ben; Critchley, Hugo D.; Gurnell, Mark; Coates, John M.

    2016-01-01

    Interoception is the sensing of physiological signals originating inside the body, such as hunger, pain and heart rate. People with greater sensitivity to interoceptive signals, as measured by, for example, tests of heart beat detection, perform better in laboratory studies of risky decision-making. However, there has been little field work to determine if interoceptive sensitivity contributes to success in real-world, high-stakes risk taking. Here, we report on a study in which we quantified heartbeat detection skills in a group of financial traders working on a London trading floor. We found that traders are better able to perceive their own heartbeats than matched controls from the non-trading population. Moreover, the interoceptive ability of traders predicted their relative profitability, and strikingly, how long they survived in the financial markets. Our results suggest that signals from the body - the gut feelings of financial lore - contribute to success in the markets. PMID:27641692

  11. Synthetic biomaterials for pelvic floor reconstruction.

    PubMed

    Karlovsky, Matthew E; Kushner, Leslie; Badlani, Gopal H

    2005-09-01

    Pelvic organ prolapse and stress urinary incontinence increase with age. The growing proportion of older women in the population is likely to result in increased demand for care of pelvic floor prolapse and incontinence. Experimental evidence suggests that altered connective tissue metabolism may predispose women to pelvic floor dysfunction, supporting the use of biomaterials, such as synthetic mesh, to correct pelvic fascial defects. Re-establishing pelvic support and continence calls for a biomaterial that is inert, flexible, and durable while minimizing the risk of infection and erosion. Mesh as a biomaterial has evolved considerably over the past half century to the current generation, which combines ease of use with good outcomes and minimal risk. This article explores the biochemical basis for pelvic floor attenuation and reviews various pelvic reconstructive mesh materials and their successes, failures, complications, and management.

  12. Pathophysiology of pelvic floor hypertonic disorders.

    PubMed

    Butrick, Charles W

    2009-09-01

    The pelvic floor represents the neuromuscular unit that provides support and functional control for the pelvic viscera. Its integrity, both anatomic and functional, is the key in some of the basic functions of life: storage of urine and feces, evacuation of urine and feces, support of pelvic organs, and sexual function. When this integrity is compromised, the results lead to many of the problems seen by clinicians. Pelvic floor dysfunction can involve weakness and result in stress incontinence, fecal incontinence, and pelvic organ prolapse. Pelvic floor dysfunction can also involve the development of hypertonic, dysfunctional muscles. This article discusses the pathophysiology of hypertonic disorders that often result in elimination problems, chronic pelvic pain, and bladder disorders that include bladder pain syndromes, retention, and incontinence. The hypertonic disorders are very common and are often not considered in the evaluation and management of patients with these problems.

  13. Fission Spectrum Related Uncertainties

    SciTech Connect

    G. Aliberti; I. Kodeli; G. Palmiotti; M. Salvatores

    2007-10-01

    The paper presents a preliminary uncertainty analysis related to potential uncertainties in the fission spectrum data. Consistent results are shown for a reference fast reactor design configuration and for experimental thermal configurations. However, the results obtained indicate the need for further analysis, in particular in terms of fission spectrum uncertainty data assessment.

  14. Development, Testing, and Sensitivity and Uncertainty Analyses of a Transport and Reaction Simulation Engine (TaRSE) for Spatially Distributed Modeling of Phosphorus in South Florida Peat Marsh Wetlands

    USGS Publications Warehouse

    Jawitz, James W.; Munoz-Carpena, Rafael; Muller, Stuart; Grace, Kevin A.; James, Andrew I.

    2008-01-01

    in the phosphorus cycling mechanisms were simulated in these case studies using different combinations of phosphorus reaction equations. Changes in water column phosphorus concentrations observed under the controlled conditions of laboratory incubations, and mesocosm studies were reproduced with model simulations. Short-term phosphorus flux rates and changes in phosphorus storages were within the range of values reported in the literature, whereas unknown rate constants were used to calibrate the model output. In STA-1W Cell 4, the dominant mechanism for phosphorus flow and transport is overland flow. Over many life cycles of the biological components, however, soils accrue and become enriched in phosphorus. Inflow total phosphorus concentrations and flow rates for the period between 1995 and 2000 were used to simulate Cell 4 phosphorus removal, outflow concentrations, and soil phosphorus enrichment over time. This full-scale application of the model successfully incorporated parameter values derived from the literature and short-term experiments, and reproduced the observed long-term outflow phosphorus concentrations and increased soil phosphorus storage within the system. A global sensitivity and uncertainty analysis of the model was performed using modern techniques such as a qualitative screening tool (Morris method) and the quantitative, variance-based, Fourier Amplitude Sensitivity Test (FAST) method. These techniques allowed an in-depth exploration of the effect of model complexity and flow velocity on model outputs. Three increasingly complex levels of possible application to southern Florida were studied corresponding to a simple soil pore-water and surface-water system (level 1), the addition of plankton (level 2), and of macrophytes (level 3). In the analysis for each complexity level, three surface-water velocities were considered that each correspond to residence times for the selected area (1-kilometer long) of 2, 10, and 20
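    The Morris screening step mentioned above ranks parameters by their mean elementary effect on the model output. A minimal sketch of the idea, using a hypothetical two-parameter phosphorus-flux stand-in (not the TaRSE engine or its actual rate equations):

    ```python
    import random

    # Toy stand-in for a phosphorus flux response; the real model and its
    # rate constants are described in the report above.
    def model(k_uptake, k_release):
        return 3.0 * k_uptake - 1.5 * k_release + 0.5 * k_uptake * k_release

    def elementary_effects(n_trajectories=50, delta=0.1, seed=42):
        """Morris-style one-at-a-time screening: perturb each parameter in
        turn and record the finite-difference (elementary) effect."""
        rng = random.Random(seed)
        effects = {"k_uptake": [], "k_release": []}
        for _ in range(n_trajectories):
            p = {"k_uptake": rng.random(), "k_release": rng.random()}
            base = model(**p)
            for name in p:
                stepped = dict(p)
                stepped[name] += delta
                effects[name].append((model(**stepped) - base) / delta)
        # Mean absolute elementary effect ranks parameter influence.
        return {k: sum(abs(e) for e in v) / len(v) for k, v in effects.items()}

    ranks = elementary_effects()
    # Here k_uptake has the larger mean |EE| and would be screened as the
    # more influential parameter.
    ```

    The variance-based FAST method used for the quantitative stage decomposes output variance by parameter instead of using finite differences, but the screening intuition is the same.
    
    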

  15. Uncertainty and cognitive control.

    PubMed

    Mushtaq, Faisal; Bland, Amy R; Schaefer, Alexandre

    2011-01-01

    A growing body of neuroimaging, behavioral, and computational research has investigated the topic of outcome uncertainty in decision-making. Although evidence to date indicates that humans are very effective in learning to adapt to uncertain situations, the nature of the specific cognitive processes involved in the adaptation to uncertainty is still a matter of debate. In this article, we reviewed evidence suggesting that cognitive control processes are at the heart of uncertainty in decision-making contexts. Available evidence suggests that: (1) There is a strong conceptual overlap between the constructs of uncertainty and cognitive control; (2) There is a remarkable overlap between the neural networks associated with uncertainty and the brain networks subserving cognitive control; (3) The perception and estimation of uncertainty might play a key role in monitoring processes and the evaluation of the "need for control"; (4) Potential interactions between uncertainty and cognitive control might play a significant role in several affective disorders.

  16. A&M. TAN607 floor plans. Shows three floor levels of pool, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    A&M. TAN-607 floor plans. Shows three floor levels of pool, hot shop, and warm shop. Includes view of pool vestibule, personnel labyrinth, location of floor rails, and room numbers of office areas, labs, instrument rooms, and stairways. This drawing was re-drawn to show as-built features in 1993. Ralph M. Parsons 902-3-ANP-607-A 96. Date of original: December 1952. Approved by INEEL Classification Office for public release. INEEL index code no. 034-0607-00-693-106748 - Idaho National Engineering Laboratory, Test Area North, Scoville, Butte County, ID

  17. Experimental uncertainty estimation and statistics for data having interval uncertainty.

    SciTech Connect

    Kreinovich, Vladik (Applied Biomathematics, Setauket, New York); Oberkampf, William Louis (Applied Biomathematics, Setauket, New York); Ginzburg, Lev (Applied Biomathematics, Setauket, New York); Ferson, Scott (Applied Biomathematics, Setauket, New York); Hajagos, Janos (Applied Biomathematics, Setauket, New York)

    2007-05-01

    This report addresses the characterization of measurements that include epistemic uncertainties in the form of intervals. It reviews the application of basic descriptive statistics to data sets which contain intervals rather than exclusively point estimates. It describes algorithms to compute various means, the median and other percentiles, variance, interquartile range, moments, confidence limits, and other important statistics and summarizes the computability of these statistics as a function of sample size and characteristics of the intervals in the data (degree of overlap, size and regularity of widths, etc.). It also reviews the prospects for analyzing such data sets with the methods of inferential statistics such as outlier detection and regressions. The report explores the tradeoff between measurement precision and sample size in statistical results that are sensitive to both. It also argues that an approach based on interval statistics could be a reasonable alternative to current standard methods for evaluating, expressing and propagating measurement uncertainties.
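    For interval-valued data, some of the statistics discussed above are directly computable: the sample mean of n intervals is itself an interval, obtained endpoint-wise, while quantities such as the variance generally require optimization over the intervals. A minimal sketch of the mean bounds, with hypothetical measurements (not data from the report):

    ```python
    # Sketch: the sample mean of interval-valued measurements [lo_i, hi_i]
    # is the interval of endpoint-wise means.
    def interval_mean(intervals):
        n = len(intervals)
        lo = sum(a for a, _ in intervals) / n
        hi = sum(b for _, b in intervals) / n
        return lo, hi

    # Hypothetical measurements, each reported only to within an interval.
    measurements = [(9.8, 10.2), (9.5, 10.5), (10.0, 10.4)]
    mean_lo, mean_hi = interval_mean(measurements)
    # The true sample mean lies somewhere in [mean_lo, mean_hi].
    ```

    The width of the resulting interval makes the report's precision-versus-sample-size tradeoff concrete: wider measurement intervals propagate directly into a wider bound on the mean.
    
    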

  18. Sea Floor off San Diego, California

    USGS Publications Warehouse

    Dartnell, Peter; Gibbons, Helen

    2009-01-01

    Ocean-floor image generated from multibeam-bathymetry data acquired by the U.S. Geological Survey (USGS); Woods Hole Oceanographic Institution; Scripps Institution of Oceanography; California State University, Monterey Bay; and Fugro Pelagos. To learn more, visit http://pubs.usgs.gov/sim/2007/2959/.

  19. 36 CFR 1192.59 - Floor surfaces.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 36 Parks, Forests, and Public Property 3 2013-07-01 2012-07-01 true Floor surfaces. 1192.59 Section 1192.59 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE BOARD AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY GUIDELINES FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 1192.59...

  20. 36 CFR 1192.59 - Floor surfaces.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 36 Parks, Forests, and Public Property 3 2012-07-01 2012-07-01 false Floor surfaces. 1192.59 Section 1192.59 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE BOARD AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY GUIDELINES FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 1192.59...

  1. 36 CFR 1192.59 - Floor surfaces.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 36 Parks, Forests, and Public Property 3 2014-07-01 2014-07-01 false Floor surfaces. 1192.59 Section 1192.59 Parks, Forests, and Public Property ARCHITECTURAL AND TRANSPORTATION BARRIERS COMPLIANCE BOARD AMERICANS WITH DISABILITIES ACT (ADA) ACCESSIBILITY GUIDELINES FOR TRANSPORTATION VEHICLES Rapid Rail Vehicles and Systems § 1192.59...

  2. Lead exposures from varnished floor refinishing.

    PubMed

    Schirmer, Joseph; Havlena, Jeff; Jacobs, David E; Dixon, Sherry; Ikens, Robert

    2012-01-01

    We evaluated the presence of lead in varnish, factors predicting lead exposure from floor refinishing, and inexpensive dust-suppression control methods. Lead in varnish, settled dust, and air were measured using XRF, laboratory analysis of scrape and wipe samples, and National Institute for Occupational Safety and Health (NIOSH) Method 7300, respectively, during refinishing (n = 35 homes). Data were analyzed using step-wise logistic regression. Compared with federal standards, no lead-in-varnish samples exceeded 1.0 mg/cm², but 52% exceeded 5000 ppm, and 70% of settled dust samples after refinishing exceeded 40 μg/ft². Refinishing pre-1930 dwellings or stairs predicted high lead dust on floors. Laboratory analysis of lead in varnish was significantly correlated with airborne lead (r = 0.23, p = 0.014). Adding dust collection bags to drum sanders and HEPA vacuums to edgers and buffers reduced mean floor lead dust by 8293 μg Pb/ft² (p < 0.05) and reduced most airborne lead exposures to less than 50 μg/m³. Refinishing varnished surfaces in older housing produces high but controllable lead exposures. PMID:22494405

  3. Seeing Results in Flooring for Schools

    ERIC Educational Resources Information Center

    Simmons, Brian

    2011-01-01

    Operations staffs at education facilities of all sizes are tasked with selecting a hard floor cleaning program that is cost-effective, efficient and highly productive. With an increased focus on the sustainability of an environment, facility managers also must select a program that meets sustainability goals while maintaining a healthful, safe…

  4. Building Trades. Block III. Floor Framing.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This document contains three units of a course on floor framing to be used as part of a building trades program. Each unit consists, first, of an informational lesson, with complete lesson plan for the teacher's use. Included in each lesson plan are the lesson aim; lists of teaching aids, materials, references, and prerequisites for students;…

  5. Concentric Crater Floor Deposits in Daedalia Planum

    NASA Technical Reports Server (NTRS)

    2003-01-01

    [figure removed for brevity, see original site]

    Released 3 September 2003

    Concentric crater floor deposits in Daedalia Planum. Lava flows appear to be converging on this crater from the northeast as well as on the crater floor. The concentric floor deposits may be the result of exposed and eroded layers of sediment that make up the crater floor.

    Image information: VIS instrument. Latitude -22.3, Longitude 221.5 East (138.5 West). 19 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically or geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  6. Performance Support on the Shop Floor.

    ERIC Educational Resources Information Center

    Kasvi, Jyrki J. J.; Vartiainen, Matti

    2000-01-01

    Discussion of performance support on the shop floor highlights four support systems for assembly lines that incorporate personal computer workstations in local area networks and use multimedia documents. Considers new customer-focused production paradigms; organizational learning; knowledge development; and electronic performance support systems…

  7. Nontraumatic orbital floor fracture after nose blowing.

    PubMed

    Sandhu, Ranjit S; Shah, Akash D

    2016-03-01

    A 40-year-old woman with no history of trauma or prior surgery presented to the emergency department with headache and left eye pain after nose blowing. Noncontrast maxillofacial computed tomography examination revealed an orbital floor fracture that ultimately required surgical repair. There are nontraumatic causes of orbital blowout fractures, and imaging should be obtained irrespective of trauma history. PMID:26973725

  8. Experimental and analytical studies on the vibration serviceability of pre-stressed cable RC truss floor systems

    NASA Astrophysics Data System (ADS)

    Zhou, Xuhong; Cao, Liang; Chen, Y. Frank; Liu, Jiepeng; Li, Jiang

    2016-01-01

    The developed pre-stressed cable reinforced concrete truss (PCT) floor system is a relatively new floor structure that can be applied to various long-span structures such as buildings, stadiums, and bridges. Because of its lighter mass and longer span, floor vibration is a serviceability concern for such systems. In this paper, field testing and theoretical analysis of the PCT floor system were conducted. Specifically, heel-drop impact and walking tests were performed to capture the dynamic properties, including natural frequencies, mode shapes, damping ratios, and acceleration response. The PCT floor system was found to be a low-frequency (<10 Hz), low-damping (damping ratio < 2%) structural system. Nevertheless, comparison of the experimental results with the AISC limiting values indicates that the investigated PCT system exhibits satisfactory vibration perceptibility. The analytical solution obtained from the weighted residual method agrees well with the experimental results and thus validates the proposed analytical expression. Sensitivity studies using the analytical solution were also conducted to investigate the vibration performance of the PCT floor system.
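    Damping ratios from heel-drop tests like these are commonly extracted from the free-decay record via the logarithmic decrement. A generic sketch of that standard identification step (illustrative peak values, not the authors' data or code):

    ```python
    import math

    def damping_ratio(peak1, peak2):
        """Estimate the damping ratio from two successive free-vibration
        peak amplitudes using the logarithmic decrement method."""
        delta = math.log(peak1 / peak2)              # logarithmic decrement
        return delta / math.sqrt(4 * math.pi**2 + delta**2)

    # Illustrative peaks: each cycle decays to 90% of the previous amplitude.
    zeta = damping_ratio(1.00, 0.90)
    # This decay rate corresponds to roughly 1.7% damping, consistent with
    # the "low damping (< 2%)" classification reported above.
    ```

    For small damping the expression reduces to the familiar approximation zeta ≈ delta / (2π).
    
    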

  9. The uncertainty of errors: Intolerance of uncertainty is associated with error-related brain activity.

    PubMed

    Jackson, Felicia; Nelson, Brady D; Hajcak, Greg

    2016-01-01

    Errors are unpredictable events that have the potential to cause harm. The error-related negativity (ERN) is the electrophysiological index of errors and has been posited to reflect sensitivity to threat. Intolerance of uncertainty (IU) is the tendency to perceive uncertain events as threatening. In the present study, 61 participants completed a self-report measure of IU and a flanker task designed to elicit the ERN. Results indicated that IU subscales were associated with the ERN in opposite directions. Cognitive distress in the face of uncertainty (Prospective IU) was associated with a larger ERN and slower reaction time. Inhibition in response to uncertainty (Inhibitory IU) was associated with a smaller ERN and faster reaction time. This study suggests that sensitivity to the uncertainty of errors contributes to the magnitude of the ERN. Furthermore, these findings highlight the importance of considering the heterogeneity of anxiety phenotypes in relation to measures of threat sensitivity. PMID:26607441

  11. FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP601), SECOND FLOOR SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP-601), SECOND FLOOR SHOWING PROCESS MAKEUP AREA AND EIGHTEEN CELLS AND ADJOINING REMOTE ANALYTICAL FACILITY (CPP-627) SHOWING COLD LAB, DECONTAMINATION ROOM, MULTICURIE CELL ROOM, AND OFFICES. TO LEFT ARE LABORATORY BUILDING (CPP-602) AND MAINTENANCE BUILDING (CPP-630). INL DRAWING NUMBER 200-0601-00-706-051980. ALTERNATE ID NUMBER CPP-E-1980. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  12. FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP601), FIRST FLOOR SHOWING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    FLOOR PLAN OF MAIN PROCESSING BUILDING (CPP-601), FIRST FLOOR SHOWING SAMPLE CORRIDORS AND EIGHTEEN CELLS AND ADJOINING REMOTE ANALYTICAL FACILITY (CPP-627) SHOWING REMOTE ANALYTICAL FACILITIES LAB, DECONTAMINATION ROOM, AND MULTICURIE CELL ROOM. TO LEFT ARE LABORATORY BUILDING (CPP-602) AND MAINTENANCE BUILDING (CPP-630). INL DRAWING NUMBER 200-0601-00-706-051979. ALTERNATE ID NUMBER CPP-E-1979. - Idaho National Engineering Laboratory, Idaho Chemical Processing Plant, Fuel Reprocessing Complex, Scoville, Butte County, ID

  13. Hydrological model uncertainty assessment in southern Africa

    NASA Astrophysics Data System (ADS)

    Hughes, D. A.; Kapangaziwiri, E.; Sawunyama, T.

    2010-06-01

    The importance of hydrological uncertainty analysis has been emphasized in recent years and there is an urgent need to incorporate uncertainty estimation into water resources assessment procedures used in the southern Africa region. The region is characterized by a paucity of accurate data and limited human resources, but the need for informed development decisions is critical to social and economic development. One of the main sources of uncertainty is related to the estimation of the parameters of hydrological models. This paper proposes a framework for establishing parameter values, exploring parameter inter-dependencies and setting parameter uncertainty bounds for a monthly time-step rainfall-runoff model (Pitman model) that is widely used in the region. The method is based on well-documented principles of sensitivity and uncertainty analysis, but recognizes the limitations that exist within the region (data scarcity and accuracy, model user attitudes, etc.). Four example applications taken from different climate and physiographic regions of South Africa illustrate that the methods are appropriate for generating behavioural stream flow simulations which include parameter uncertainty. The parameters that dominate the model response and their degree of uncertainty vary between regions. Some of the results suggest that the uncertainty bounds will be too wide for effective water resources decision making. Further work is required to reduce some of the subjectivity in the methods and to investigate other approaches for constraining the uncertainty. The paper recognizes that probability estimates of uncertainty and methods to include input climate data uncertainties need to be incorporated into the framework in the future.
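    Propagating parameter uncertainty bounds into an ensemble of behavioural simulations, as the framework above does, can be sketched generically. This is a toy runoff response with invented coefficients, not the Pitman model or its parameterization:

    ```python
    import random

    def runoff(rain, infiltration_cap, runoff_coeff):
        """Toy monthly runoff response (stand-in for a rainfall-runoff model)."""
        return max(0.0, rain - infiltration_cap) * runoff_coeff

    def ensemble(rain=120.0, n=1000, seed=1):
        """Sample each parameter uniformly within assumed uncertainty bounds
        and return the 5-95% band of the simulated runoff."""
        rng = random.Random(seed)
        sims = []
        for _ in range(n):
            cap = rng.uniform(40.0, 80.0)    # assumed parameter bounds
            coeff = rng.uniform(0.5, 0.9)
            sims.append(runoff(rain, cap, coeff))
        sims.sort()
        return sims[int(0.05 * n)], sims[int(0.95 * n)]

    low, high = ensemble()
    # (low, high) is the uncertainty band a decision-maker would see; if it
    # is too wide, the bounds must be constrained further, as the paper notes.
    ```

    In practice the bounds would be set from physical-property estimates and regional calibration, and the ensemble would be filtered against observed flows to retain only behavioural simulations.
    
    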

  14. 27 CFR 46.233 - Payment of floor stocks tax.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Filing Requirements § 46.233 Payment of floor stocks tax....

  15. 27 CFR 46.231 - Floor stocks tax return.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette....28T09, 2009 Floor Stocks Tax Return—Tobacco Products and Cigarette Papers and Tubes, is available...

  16. 27 CFR 46.231 - Floor stocks tax return.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette....28T09, 2009 Floor Stocks Tax Return—Tobacco Products and Cigarette Papers and Tubes, is available...

  17. 27 CFR 46.233 - Payment of floor stocks tax.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Filing Requirements § 46.233 Payment of floor stocks tax....

  18. 27 CFR 46.231 - Floor stocks tax return.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette....28T09, 2009 Floor Stocks Tax Return—Tobacco Products and Cigarette Papers and Tubes, is available...

  19. 27 CFR 46.233 - Payment of floor stocks tax.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Filing Requirements § 46.233 Payment of floor stocks tax....

  20. 27 CFR 46.231 - Floor stocks tax return.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette....28T09, 2009 Floor Stocks Tax Return—Tobacco Products and Cigarette Papers and Tubes, is available...

  1. 27 CFR 46.233 - Payment of floor stocks tax.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... PRODUCTS AND CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette Tubes Held for Sale on April 1, 2009 Filing Requirements § 46.233 Payment of floor stocks tax....

  2. 27 CFR 46.231 - Floor stocks tax return.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... CIGARETTE PAPERS AND TUBES Floor Stocks Tax on Certain Tobacco Products, Cigarette Papers, and Cigarette....28T09, 2009 Floor Stocks Tax Return—Tobacco Products and Cigarette Papers and Tubes, is available...

  3. 5. EAST SECTION OF BUILDING, FIRST FLOOR, WEST ROOM. NOTE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. EAST SECTION OF BUILDING, FIRST FLOOR, WEST ROOM. NOTE OVEN AT LEFT. All construction original except wood flooring, plumbing and electricity. - Ralph Izard House, Kitchen Building, 110 Broad Street, Charleston, Charleston County, SC

  4. 12. TRIPLE WINDOW, FIRST FLOOR, SOUTH SIDE. Typical for all ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    12. TRIPLE WINDOW, FIRST FLOOR, SOUTH SIDE. Typical for all triple windows on first and second floors. Note single swing jib door - John Joyner Smith House, 400 Wilmington Street, Beaufort, Beaufort County, SC

  5. 11. BUILDING 1: FIRST FLOOR (Center Section), WEST AND NORTH ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. BUILDING 1: FIRST FLOOR (Center Section), WEST AND NORTH WALLS, SHOWING TWO TIERS OF COLUMNS WITH SECOND FLOOR REMOVED - Boston Beer Company, 225-249 West Second Street, South Boston, Suffolk County, MA

  6. Portable flooring protects finished surfaces, is easily moved

    NASA Technical Reports Server (NTRS)

    Carmody, R. J.

    1964-01-01

    To protect a curved, finished surface and provide support for workmen, portable flooring has been made from rigid plastic foam blocks faced with aluminum strips. Held together by nylon webbing, the flooring can be rolled up for easy carrying.

  7. 23. GRAINELEVATOR SECTION, FIRST FLOOR, INTERIOR, DETAIL OF FRAMING IN ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    23. GRAIN-ELEVATOR SECTION, FIRST FLOOR, INTERIOR, DETAIL OF FRAMING IN SOUTHWEST CORNER SUPPORTING GRAIN STORAGE CRIBS ON FLOORS ABOVE, LOOKING SOUTHWEST - Standard Mill, 116-118 Portland Avenue South, Minneapolis, Hennepin County, MN

  8. Refrigeration Plant, North Elevation, Second Floor Plan, East Elevation, Ground ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    Refrigeration Plant, North Elevation, Second Floor Plan, East Elevation, Ground Floor Plan, Section A-A - Kennecott Copper Corporation, On Copper River & Northwestern Railroad, Kennicott, Valdez-Cordova Census Area, AK

  9. 15. View northeast, interior, second floor, central hallway (door to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    15. View northeast, interior, second floor, central hallway (door to northwest bedroom at left, doorway to second floor porch at right in photograph) - Abraham Cyrus Farmstead, Farmhouse, 3271 Cyrus Road (County Road 1/6), Cyrus, Wayne County, WV

  10. 25. Interior view, second floor, showing numerous spouts and Simpson ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    25. Interior view, second floor, showing numerous spouts and Simpson Rotex sifter (Orville Simpson, Co; Cincinnati) on floor in middle-foreground. - Fisher-Fallgatter Mill, Waupaca, Waupaca County, WI

  11. 30. GENERAL TEST ROOM IN 1946 ADDITION, FOURTH FLOOR, LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    30. GENERAL TEST ROOM IN 1946 ADDITION, FOURTH FLOOR, LOOKING WEST. ORIGINALLY HAD SUSPENDED ACOUSTICAL CEILINGS WITH FLUORESCENT LIGHTING AND ASPHALT MASTIC TILE FLOORS - Underwriters' Laboratories, 207-231 East Ohio Street, Chicago, Cook County, IL

  12. CAR MACHINE SHOP, SECOND FLOOR, PAINT SPRAY ROOM EXTERIOR AND ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    CAR MACHINE SHOP, SECOND FLOOR, PAINT SPRAY ROOM EXTERIOR AND ATTIC FLOOR SUPPORT COLUMNS AND BEAMS, LOOKING WEST. - Southern Pacific, Sacramento Shops, Car Machine Shop, 111 I Street, Sacramento, Sacramento County, CA

  13. INTERIOR VIEW OF THE SECOND FLOOR HALL. SHOWING THE IRON ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF THE SECOND FLOOR HALL. SHOWING THE IRON RAILING. NOTE THE TONGUE-AND-GROOVE WOOD FLOORING AND SINGLE PANEL DOORS. VIEW FACING SOUTH. - Hickam Field, Officers' Housing Type C, 208 Second Street, Honolulu, Honolulu County, HI

  14. INTERIOR VIEW OF THE SECOND FLOOR STAIR HALL. NOTE THE ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    INTERIOR VIEW OF THE SECOND FLOOR STAIR HALL. NOTE THE TONGUE-AND-GROOVE WOOD FLOORING AND THE WINDOW ABOVE THE STAIR LANDING. VIEW FACING SOUTH. - Hickam Field, Officers' Housing Type D, 111 Beard Avenue, Honolulu, Honolulu County, HI

  15. 3. MILK BARN, INTERIOR VIEW OF GROUND FLOOR, LOOKING 132 ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. MILK BARN, INTERIOR VIEW OF GROUND FLOOR, LOOKING 132 DEGREES SOUTHEAST, SHOWING RAISED FLOOR OF CENTRAL AISLE. - Hudson-Cippa-Wolf Ranch, Milk Barn, Sorento Road, Sacramento, Sacramento County, CA

  16. 50. Ground floor, looking northwest at former location of ground ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    50. Ground floor, looking northwest at former location of ground floor (bottom) level of milk room - Sheffield Farms Milk Plant, 1075 Webster Avenue (southwest corner of 166th Street), Bronx, Bronx County, NY

  17. 27. INTERIOR, FIRST FLOOR, SOUTH ENTRANCE, SOUTH LOBBY, DETAIL OF ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    27. INTERIOR, FIRST FLOOR, SOUTH ENTRANCE, SOUTH LOBBY, DETAIL OF BRONZE SEAL IN FLOOR (4" x 5" negative; 8" x 10" print) - U.S. Department of the Interior, Eighteenth & C Streets Northwest, Washington, District of Columbia, DC

  18. 3. FIRST FLOOR, FRONT SOUTHWEST CORNER ROOM WITH STAIRWAY TO ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    3. FIRST FLOOR, FRONT SOUTHWEST CORNER ROOM WITH STAIRWAY TO SECOND FLOOR - Penn School Historic District, Benezet House, 1 mile South of Frogmore, Route 37, St Helena Island, Frogmore, Beaufort County, SC

  19. 33. Third floor, looking north, elevator and central stair to ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    33. Third floor, looking north, elevator and central stair to the right (original ice manufacturing floor) - Sheffield Farms Milk Plant, 1075 Webster Avenue (southwest corner of 166th Street), Bronx, Bronx County, NY

  20. 71. FIRST FLOOR, SENATE OFFICE HALLWAY IN 1902 ADDITION, LOOKING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    71. FIRST FLOOR, SENATE OFFICE HALLWAY IN 1902 ADDITION, LOOKING NORTHWEST, SHOWING ITALIAN MARBLE TILE FLOOR, PILASTERS, WAINSCOTING, DOOR SURROUND AND PLASTER CORNICE - Maryland State House, State Circle, Annapolis, Anne Arundel County, MD