Sample records for integrated uncertainty analysis

  1. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process that take lifecycle uncertainties into account. It recognizes that discrepancy between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty and to combine these with the analytical tools used in integrated modeling. In this manner, system uncertainty analysis becomes part of the design process and can motivate redesign. The specific program objectives were: 1. To incorporate uncertainty modeling, propagation, and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions. 2. To apply modern optimization tools to guide the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing subsystem reliability and redundancy. Results for both objectives are discussed, with emphasis on the first: uncertainty modeling, propagation, and synthesis with integrated modeling.

  2. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, hydrologic response, and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range and possible effects of uncertainty in water resources infrastructure and management. This work presents a holistic framework for analyzing climate, hydrologic, and water-management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin on the border of the United States and Canada.

  3. Integrating model behavior, optimization, and sensitivity/uncertainty analysis: overview and application of the MOUSE software toolbox

    USDA-ARS?s Scientific Manuscript database

    This paper provides an overview of the Model Optimization, Uncertainty, and SEnsitivity Analysis (MOUSE) software application, an open-source, Java-based toolbox of visual and numerical analysis components for the evaluation of environmental models. MOUSE is based on the OPTAS model calibration syst...

  4. Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.

    PubMed

    Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng

    2010-01-01

    Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Combining discrete sample analysis with error propagation analysis, this study quantified the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads arising from flow measurement, sample collection, storage, and laboratory analysis. The results showed that the uncertainties due to sample collection, storage, and laboratory analysis of COD from stormwater runoff were 13.99%, 19.48%, and 12.28%, respectively. Flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties in event flow volume, COD EMC, and COD event loads were quantified as 7.03%, 10.26%, and 18.47%, respectively.
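
    The law of propagation of uncertainties used above can be sketched as a root-sum-square combination of independent relative uncertainties. A minimal illustration with values from the abstract; note that quadrature of the two terms shown gives about 12.4%, so the reported 18.47% for event loads evidently includes additional error terms:

```python
from math import sqrt

def combined_relative_uncertainty(*rel_uncertainties):
    """Root-sum-square (quadrature) combination of independent
    relative uncertainties, per the law of propagation of
    uncertainty for products and quotients."""
    return sqrt(sum(u * u for u in rel_uncertainties))

# Illustrative: relative uncertainty of an event load L = V * EMC
u_volume = 7.03   # % uncertainty in event flow volume
u_emc = 10.26     # % uncertainty in COD event mean concentration
u_load = combined_relative_uncertainty(u_volume, u_emc)
```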

  5. Detailed modeling of the statistical uncertainty of Thomson scattering measurements

    NASA Astrophysics Data System (ADS)

    Morton, L. A.; Parke, E.; Den Hartog, D. J.

    2013-11-01

    The uncertainty of electron density and temperature fluctuation measurements is determined by statistical uncertainty introduced by multiple noise sources. In order to quantify these uncertainties precisely, a simple but comprehensive model was made of the noise sources in the MST Thomson scattering system and of the resulting variance in the integrated scattered signals. The model agrees well with experimental and simulated results. The signal uncertainties are then used by our existing Bayesian analysis routine to find the most likely electron temperature and density, with confidence intervals. In the model, photonic noise from scattered light and plasma background light is multiplied by the noise enhancement factor (F) of the avalanche photodiode (APD). Electronic noise from the amplifier and digitizer is added. The amplifier response function shapes the signal and induces correlation in the noise. The data analysis routine fits a characteristic pulse to the digitized signals from the amplifier, giving the integrated scattered signals. A finite digitization rate loses information and can cause numerical integration error. We find a formula for the variance of the scattered signals in terms of the background and pulse amplitudes, and three calibration constants. The constants are measured easily under operating conditions, resulting in accurate estimation of the scattered signals' uncertainty. We measure F ≈ 3 for our APDs, in agreement with other measurements for similar APDs. This value is wavelength-independent, simplifying analysis. The correlated noise we observe is reproduced well using a Gaussian response function. Numerical integration error can be made negligible by using an interpolated characteristic pulse, allowing digitization rates as low as the detector bandwidth. The effect of background noise is also determined.
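
    The noise model described above can be caricatured as photonic (shot) noise scaled by the APD excess noise factor, plus an additive electronic term. This is a generic sketch, not the paper's actual formula; the constants below are illustrative placeholders for the three measured calibration constants:

```python
def signal_variance(s_scattered, s_background, F=3.0,
                    c_photon=1.0, c_electronic=0.5):
    """Generic APD noise model: photonic noise from scattered plus
    background light, enhanced by the excess noise factor F, on top
    of an additive electronic noise floor.  All constants are
    hypothetical stand-ins, not the paper's calibration values."""
    return c_photon * F * (s_scattered + s_background) + c_electronic
```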

  6. Incorporating climate-system and carbon-cycle uncertainties in integrated assessments of climate change. (Invited)

    NASA Astrophysics Data System (ADS)

    Rogelj, J.; McCollum, D. L.; Reisinger, A.; Knutti, R.; Riahi, K.; Meinshausen, M.

    2013-12-01

    The field of integrated assessment draws from a large body of knowledge across a range of disciplines to gain robust insights about possible interactions, trade-offs, and synergies. Integrated assessment of climate change, for example, uses knowledge from the fields of energy system science, economics, geophysics, demography, climate change impacts, and many others. Each of these fields comes with its associated caveats and uncertainties, which should be taken into account when assessing any results. The geophysical system and its associated uncertainties are often represented by models of reduced complexity in integrated assessment modelling frameworks. Such models include simple representations of the carbon cycle and climate system, and are often based on the global energy balance equation. A prominent example of such a model is the 'Model for the Assessment of Greenhouse Gas Induced Climate Change', MAGICC. Here we show how a model like MAGICC can be used to represent geophysical uncertainties. Its strengths, weaknesses, and limitations are discussed and illustrated by means of an analysis that attempts to integrate socio-economic and geophysical uncertainties. Uncertainties in the geophysical response of the Earth system to greenhouse gases remain key for estimating the cost of greenhouse gas emission mitigation scenarios. We look at uncertainties in four dimensions: geophysical, technological, social, and political. Our results indicate that while geophysical uncertainties are an important factor influencing projections of mitigation costs, political choices that delay mitigation by one or two decades have a much more pronounced effect.

  7. A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules

    NASA Astrophysics Data System (ADS)

    Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.

    2012-08-01

    Integrating economic and groundwater models for groundwater management can help improve understanding of the trade-offs between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and of decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used, including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling, and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
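
    Block bootstrap time-series sampling, one of the methods listed above, can be sketched as follows (a fixed-length, non-circular variant; the block length and RNG seed are arbitrary choices, not the study's settings):

```python
import random

def block_bootstrap(series, block_len, rng=None):
    """Fixed-length block bootstrap: resample contiguous blocks of the
    original series so that short-range autocorrelation is preserved
    in each resampled replicate."""
    rng = rng or random.Random(0)
    n = len(series)
    out = []
    while len(out) < n:
        start = rng.randrange(n - block_len + 1)   # random block start
        out.extend(series[start:start + block_len])
    return out[:n]                                 # trim to original length
```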

  8. Uncertainty of climate change impact on groundwater reserves - Application to a chalk aquifer

    NASA Astrophysics Data System (ADS)

    Goderniaux, Pascal; Brouyère, Serge; Wildemeersch, Samuel; Therrien, René; Dassargues, Alain

    2015-09-01

    Recent studies have evaluated the impact of climate change on groundwater resources in different geographical and climatic contexts. However, most studies have either not estimated the uncertainty around projected impacts or have limited the analysis to the uncertainty related to climate models. In this study, the uncertainties around impact projections from several sources (climate models, natural variability of the weather, hydrological model calibration) are calculated and compared for the Geer catchment (465 km²) in Belgium. We use a surface-subsurface integrated model implemented with the finite element code HydroGeoSphere, coupled with climate change scenarios (2010-2085) and the UCODE_2005 inverse model, to assess the uncertainty related to the calibration of the hydrological model. This integrated model provides a more realistic representation of the water exchanges between surface and subsurface domains and further constrains the calibration through the use of both surface and subsurface observations. Sensitivity and uncertainty analyses were performed on the predictions. The linear uncertainty analysis is approximate for this nonlinear system, but it provides some measure of uncertainty for computationally demanding models. Results show that, for the Geer catchment, the most important uncertainty is related to the calibration of the hydrological model. The total uncertainty associated with the prediction of groundwater levels remains large. By the end of the century, however, the uncertainty becomes smaller than the predicted decline in groundwater levels.
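
    The linear uncertainty analysis mentioned above propagates the parameter covariance obtained from calibration through first-order prediction sensitivities, var = gᵀCg. A minimal sketch of that propagation (the matrices below are hypothetical, not outputs of UCODE_2005):

```python
def linear_prediction_variance(sensitivities, param_cov):
    """First-order propagation of calibrated-parameter uncertainty to
    a model prediction: var = g^T C g, with g the sensitivities of
    the prediction to each parameter and C the parameter covariance.
    Exact only for linear models; approximate otherwise, as the
    abstract notes."""
    n = len(sensitivities)
    return sum(sensitivities[i] * param_cov[i][j] * sensitivities[j]
               for i in range(n) for j in range(n))
```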

  9. An integrated uncertainty analysis and data assimilation approach for improved streamflow predictions

    NASA Astrophysics Data System (ADS)

    Hogue, T. S.; He, M.; Franz, K. J.; Margulis, S. A.; Vrugt, J. A.

    2010-12-01

    The current study presents an integrated uncertainty analysis and data assimilation approach to improve streamflow predictions while simultaneously providing meaningful estimates of the associated uncertainty. Study models include the National Weather Service (NWS) operational snow model (SNOW17) and rainfall-runoff model (SAC-SMA). The proposed approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm to simultaneously estimate uncertainties in model parameters, forcing, and observations. An ensemble Kalman filter (EnKF) is configured with the DREAM-identified uncertainty structure and applied to assimilate snow water equivalent data into the SNOW17 model for improved snowmelt simulations. Snowmelt estimates then serve as input to the SAC-SMA model to provide streamflow predictions at the basin outlet. The robustness and usefulness of the approach are evaluated for a snow-dominated watershed in the northern Sierra Mountains. This presentation describes the implementation of DREAM and the EnKF in the coupled SNOW17 and SAC-SMA models and summarizes study results and findings.
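
    The EnKF update at the heart of such an approach can be sketched for a scalar state like snow water equivalent. This is a textbook perturbed-observation update, not the study's implementation:

```python
import random
import statistics

def enkf_update(ensemble, obs, obs_var, rng=None):
    """Ensemble Kalman filter update for a scalar state: each member
    is nudged toward a perturbed observation by the Kalman gain
    K = P / (P + R), where P is the ensemble (forecast) variance and
    R the observation error variance."""
    rng = rng or random.Random(1)
    P = statistics.variance(ensemble)   # forecast error variance
    K = P / (P + obs_var)               # Kalman gain
    return [x + K * (obs + rng.gauss(0.0, obs_var ** 0.5) - x)
            for x in ensemble]
```

An accurate observation (small R) pulls the ensemble mean almost entirely to the observation; a very noisy one leaves the forecast essentially unchanged.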

  10. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    NASA Astrophysics Data System (ADS)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
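
    Constraint (a) above follows from Jensen's inequality whenever damages are a convex function of warming: a mean-preserving spread of the sensitivity distribution raises expected damages. A numeric illustration with a purely hypothetical quadratic damage function:

```python
def expected_damage(values, probs, damage=lambda t: t ** 2):
    """Expected damages under a probability distribution over climate
    sensitivity, with a convex damage function (quadratic here,
    purely for illustration)."""
    return sum(p * damage(t) for t, p in zip(values, probs))

# Two distributions with the same mean sensitivity (3.0) but
# different spread; the wider one yields larger expected damages.
narrow = expected_damage([2.5, 3.0, 3.5], [0.25, 0.5, 0.25])
wide = expected_damage([1.5, 3.0, 4.5], [0.25, 0.5, 0.25])
```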

  11. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools, including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches focus mainly on maintaining the inter-dependency between multiple geographically related areas, and are applied to the cross-correlated load time series as well as their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
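
    The flavor of generating multiple prediction realizations from a fitted time-series model can be sketched with a toy AR(1) process; this is a stand-in for the study's ARIMA and sequential Gaussian simulation machinery, with arbitrary parameter values:

```python
import random

def ar1_forecast_ensemble(last, phi, sigma, steps, n_paths=500, rng=None):
    """Monte Carlo forecast ensemble for a toy AR(1) load model
    x_t = phi * x_{t-1} + eps_t.  The spread of the returned
    final-step values across paths gives a prediction interval."""
    rng = rng or random.Random(42)
    finals = []
    for _ in range(n_paths):
        x = last
        for _ in range(steps):
            x = phi * x + rng.gauss(0.0, sigma)
        finals.append(x)
    return finals
```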

  12. Micropollutants throughout an integrated urban drainage model: Sensitivity and uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    2017-11-01

    The paper presents the sensitivity and uncertainty analysis of an integrated urban drainage model that includes micropollutants. Specifically, a bespoke integrated model developed in previous studies has been modified to include micropollutant assessment (namely, sulfamethoxazole, SMX). The model also takes into account the interactions between the three components of the system: the sewer system (SS), the wastewater treatment plant (WWTP), and the receiving water body (RWB). The analysis has been applied to an experimental catchment near Palermo (Italy): the Nocella catchment. Overall, five scenarios, each characterized by a different uncertainty combination of the sub-systems (i.e., SS, WWTP and RWB), have been considered, applying the Extended-FAST method for the sensitivity analysis in order to select the key factors affecting RWB quality and to design a reliable/useful experimental campaign. Results have demonstrated that sensitivity analysis is a powerful tool for increasing operator confidence in the modelling results. The approach adopted here can be used to fix (block) non-identifiable factors, thus wisely modifying the structure of the model and reducing the related uncertainty. The model factors related to the SS were found to be the most relevant factors affecting SMX modeling in the RWB when all model factors (scenario 1) or the SS model factors (scenarios 2 and 3) are varied. If only the factors related to the WWTP are varied (scenarios 4 and 5), the SMX concentration in the RWB is mainly influenced by the aerobic sorption coefficient (up to 95% of the total variance for S_SMX,max). A progressive uncertainty reduction from upstream to downstream was found for the soluble fraction of SMX in the RWB.

  13. Removal of Asperger's syndrome from the DSM V: community response to uncertainty.

    PubMed

    Parsloe, Sarah M; Babrow, Austin S

    2016-01-01

    The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.

  14. Rocket engine system reliability analyses using probabilistic and fuzzy logic techniques

    NASA Technical Reports Server (NTRS)

    Hardy, Terry L.; Rapp, Douglas C.

    1994-01-01

    The reliability of rocket engine systems was analyzed using probabilistic and fuzzy logic techniques. Fault trees were developed for integrated modular engine (IME) and discrete engine systems, and then used with the two techniques to quantify reliability. The IRRAS (Integrated Reliability and Risk Analysis System) computer code, developed for the U.S. Nuclear Regulatory Commission, was used for the probabilistic analyses, and FUZZYFTA (Fuzzy Fault Tree Analysis), a code developed at NASA Lewis Research Center, was used for the fuzzy logic analyses. Although both techniques provided estimates of the reliability of the IME and discrete systems, the probabilistic technique emphasized uncertainty resulting from randomness in the system, whereas the fuzzy logic technique emphasized uncertainty resulting from vagueness in the system. Because uncertainty can have both random and vague components, both techniques were found to be useful tools in the analysis of rocket engine system reliability.
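
    The probabilistic side of such a fault-tree quantification reduces to gate algebra over independent basic events, and the fuzzy side can be caricatured by propagating bounds instead of point probabilities. A minimal sketch, not the IRRAS or FUZZYFTA implementations:

```python
def or_gate(probs):
    """Top-event probability of an OR gate over independent basic
    events: 1 minus the product of the survival probabilities."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p

def and_gate(probs):
    """AND gate over independent basic events."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate_interval(bounds):
    """Crude interval analogue of the fuzzy treatment: propagate
    [low, high] probability bounds instead of point values, so
    vagueness in the inputs survives to the top event."""
    return (or_gate([lo for lo, _ in bounds]),
            or_gate([hi for _, hi in bounds]))
```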

  15. Operationalising uncertainty in data and models for integrated water resources management.

    PubMed

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use, to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  16. Robustness Analysis and Reliable Flight Regime Estimation of an Integrated Resilient Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine

    2008-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. As part of the validation process, this paper describes an analysis method for determining a reliable flight regime in the flight envelope, within which an integrated resilient control system can achieve the desired performance of tracking command signals and detecting additive faults in the presence of parameter uncertainty and unmodeled dynamics. To calculate a reliable flight regime, a structured singular value analysis method is applied to analyze the closed-loop system over the entire flight envelope. To use the structured singular value analysis method, a linear fractional transformation (LFT) model of the transport aircraft's longitudinal dynamics is developed over the flight envelope using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The developed LFT model can capture the original nonlinear dynamics over the flight envelope with the Δ block, which contains the key varying parameters (angle of attack and velocity) and real parameter uncertainties (aerodynamic coefficient uncertainty and moment of inertia uncertainty). Using the developed LFT model and a formal robustness analysis method, a reliable flight regime is calculated for the transport aircraft closed-loop system.

  17. PC-BASED SUPERCOMPUTING FOR UNCERTAINTY AND SENSITIVITY ANALYSIS OF MODELS

    EPA Science Inventory

    Evaluating uncertainty and sensitivity of multimedia environmental models that integrate assessments of air, soil, sediments, groundwater, and surface water is a difficult task. It can be an enormous undertaking even for simple, single-medium models (i.e. groundwater only) descr...

  18. The option to abandon: stimulating innovative groundwater remediation technologies characterized by technological uncertainty.

    PubMed

    Compernolle, T; Van Passel, S; Huisman, K; Kort, P

    2014-10-15

    Many studies on technology adoption demonstrate that uncertainty leads to a postponement of investments by integrating a wait option into the economic analysis. The aim of this study, however, is to demonstrate how investment in new technologies can be stimulated by integrating an option to abandon. Furthermore, this real options analysis not only considers the ex ante decision analysis of the investment in a new technology under uncertainty, but also allows for an ex post evaluation of the investment. Based on a case study on the adoption of an innovative groundwater remediation strategy, it is demonstrated that when the option to abandon the innovative technology is taken into account, the decision maker decides to invest in this technology, while at the same time determining an optimal time to abandon the technology if its operation proves to be inefficient. To reduce uncertainty about the effectiveness of groundwater remediation technologies, samples are taken. Our analysis shows that when the initial belief in an effective innovative technology is low, it is important that these samples provide correct information in order to justify the adoption of the innovative technology.

  19. Characterizing spatial uncertainty when integrating social data in conservation planning.

    PubMed

    Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C

    2014-12-01

    Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches.

  20. Decision Making Under Uncertainty and Complexity: A Model-Based Scenario Approach to Supporting Integrated Water Resources Management

    NASA Astrophysics Data System (ADS)

    Liu, Y.; Gupta, H.; Wagener, T.; Stewart, S.; Mahmoud, M.; Hartmann, H.; Springer, E.

    2007-12-01

    Some of the most challenging issues facing contemporary water resources management are those typified by complex coupled human-environmental systems with poorly characterized uncertainties. In other words, major decisions regarding water resources have to be made in the face of substantial uncertainty and complexity. It has been suggested that integrated models can be used to coherently assemble information from a broad set of domains, and can therefore serve as an effective means for tackling the complexity of environmental systems. Further, well-conceived scenarios can effectively inform decision making, particularly when high complexity and poorly characterized uncertainties make the problem intractable via traditional uncertainty analysis methods. This presentation discusses the integrated modeling framework adopted by SAHRA, an NSF Science & Technology Center, to investigate stakeholder-driven water sustainability issues within the semi-arid southwestern US. The multi-disciplinary, multi-resolution modeling framework incorporates a formal scenario approach to analyze the impacts of plausible (albeit uncertain) alternative futures to support adaptive management of water resources systems. Some of the major challenges involved in, and lessons learned from, this effort will be discussed.

  1. Risk Assessment of Groundwater Contamination: A Multilevel Fuzzy Comprehensive Evaluation Approach Based on DRASTIC Model

    PubMed Central

    Zhang, Yan; Zhong, Ming

    2013-01-01

    Groundwater contamination is a serious threat to water supply. Risk assessment of groundwater contamination is an effective way to protect the safety of groundwater resources. Groundwater is a complex and fuzzy system with many uncertainties, impacted by different geological and hydrological factors. In order to deal with the uncertainty in the risk assessment of groundwater contamination, we propose an approach that integrates the analytic hierarchy process (AHP) with fuzzy comprehensive evaluation. First, the risk factors of groundwater contamination are identified by the source-pathway-receptor-consequence method, and a corresponding index system for risk assessment based on the DRASTIC model is established. Because of the complexity of the transitions between possible pollution risks and the uncertainties of the factors, the analytic hierarchy process is applied to determine the weights of each factor, and fuzzy set theory is adopted to calculate the membership degrees of each factor. Finally, a case study is presented to illustrate and test the methodology. It is concluded that the proposed approach integrates the advantages of both the analytic hierarchy process and fuzzy comprehensive evaluation, providing a more flexible and reliable way to deal with linguistic and mechanism uncertainty in groundwater contamination without losing important information. PMID:24453883
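
    The two ingredients of the proposed approach can be sketched in a few lines: AHP weights as the principal eigenvector of a pairwise comparison matrix (via power iteration), and fuzzy comprehensive evaluation as a weighted composition of factor memberships. The matrices below are hypothetical, not the paper's data:

```python
def ahp_weights(pairwise, iters=100):
    """Principal-eigenvector factor weights from an AHP pairwise
    comparison matrix, computed by power iteration with
    renormalisation at each step."""
    n = len(pairwise)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(pairwise[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]
    return w

def fuzzy_evaluate(weights, membership):
    """Weighted-average fuzzy composition B = W * R, where
    membership[i][k] is the degree to which factor i supports
    risk class k."""
    classes = len(membership[0])
    return [sum(w * row[k] for w, row in zip(weights, membership))
            for k in range(classes)]
```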

  2. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis.

    PubMed

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, and hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable, and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis: we assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchy Process (AHP) and Ordered Weighted Averaging (OWA), implemented in GIS. The methodology is composed of three phases. First, weights are computed to express the relative importance of the factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of the landslide susceptibility is analyzed as a function of the weights, using Monte Carlo simulation and global sensitivity analysis. Finally, the results are validated using a landslide inventory database and by applying DST. Comparisons of the landslide susceptibility maps obtained with both MCDA techniques against known landslides show that AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.
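
    The Monte Carlo step described above, analyzing susceptibility as a function of the criteria weights, can be sketched for a single map cell. Weighted-sum aggregation is assumed, and the jitter size and values are illustrative:

```python
import random

def mc_weight_sensitivity(base_weights, criteria, n=2000, jitter=0.1, rng=None):
    """Monte Carlo sensitivity of a weighted-sum susceptibility score
    to its criterion weights: perturb the weights, renormalise them,
    and collect the score distribution for one cell (criteria holds
    that cell's criterion scores).  Returns (mean, std dev)."""
    rng = rng or random.Random(7)
    scores = []
    for _ in range(n):
        w = [max(0.0, b + rng.uniform(-jitter, jitter)) for b in base_weights]
        s = sum(w)
        w = [x / s for x in w]                     # renormalise to sum 1
        scores.append(sum(wi * ci for wi, ci in zip(w, criteria)))
    mean = sum(scores) / n
    var = sum((x - mean) ** 2 for x in scores) / (n - 1)
    return mean, var ** 0.5
```

A cell whose criteria all agree is insensitive to the weights; disagreement among criteria is what lets weight uncertainty propagate into the map.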

  3. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-03-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility are analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights.

  4. Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Guin, J.

    2005-12-01

    Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake-related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. The objective of this study is to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and spatial correlation among sites for typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuation and on regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. On the hazard side, there are uncertainties in the magnitudes, rates, and mechanisms of the seismic sources, in local site conditions, and in ground motion site amplifications. On the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. On the analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and in integrating thousands of loss distribution curves with different degrees of correlation.
    In this paper we address some of these issues by conducting loss analyses of a typical small portfolio in Southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on the USGS 2002 regional seismicity model.
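
One common way to capture spatial correlation in portfolio loss estimation is to sample correlated ground-motion residuals across sites and aggregate the site losses per simulation. The sketch below is illustrative only: the site layout, exposed values, correlation length, median intensity, and vulnerability curve are invented, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 4-site portfolio: replacement values and site coordinates (km).
values = np.array([2.0e6, 1.5e6, 3.0e6, 1.0e6])
coords = np.array([[0.0, 0.0], [2.0, 0.0], [3.0, 4.0], [8.0, 4.0]])

# Exponential spatial correlation of ground-motion residuals (10 km scale).
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
corr = np.exp(-dist / 10.0)
L = np.linalg.cholesky(corr)

# Monte Carlo: correlated log residuals -> site intensities -> damage -> loss.
n = 20000
eps = rng.standard_normal((n, 4)) @ L.T        # correlated residuals
im = 0.3 * np.exp(0.5 * eps)                   # site intensities (g), assumed
damage_ratio = np.clip((im - 0.1) / 0.6, 0.0, 1.0)  # toy vulnerability curve
losses = (damage_ratio * values).sum(axis=1)   # portfolio loss per simulation

mean_loss = losses.mean()
p99_loss = np.quantile(losses, 0.99)  # tail metric, sensitive to correlation
```

Rerunning with `corr = np.eye(4)` leaves the mean loss essentially unchanged but shrinks the tail quantile, which is why the degree of correlation matters for portfolio (rather than single-site) analysis.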

  5. Can integrative catchment management mitigate future water quality issues caused by climate change and socio-economic development?

    NASA Astrophysics Data System (ADS)

    Honti, Mark; Schuwirth, Nele; Rieckermann, Jörg; Stamm, Christian

    2017-03-01

    The design and evaluation of solutions for integrated surface water quality management requires an integrated modelling approach. Integrated models have to be comprehensive enough to cover the aspects relevant for management decisions, allowing for mapping of larger-scale processes such as climate change to the regional and local contexts. Besides this, models have to be sufficiently simple and fast to apply proper methods of uncertainty analysis, covering model structure deficits and error propagation through the chain of sub-models. Here, we present a new integrated catchment model satisfying both conditions. The conceptual iWaQa model was developed to support the integrated management of small streams. It can be used to predict traditional water quality parameters, such as nutrients and a wide set of organic micropollutants (plant and material protection products), by considering all major pollutant pathways in urban and agricultural environments. Due to its simplicity, the model allows for a full, propagative analysis of predictive uncertainty, including certain structural and input errors. The usefulness of the model is demonstrated by predicting future surface water quality in a small catchment with mixed land use in the Swiss Plateau. We consider climate change, population growth or decline, socio-economic development, and the implementation of management strategies to tackle urban and agricultural point and non-point sources of pollution. Our results indicate that input and model structure uncertainties are the most influential factors for certain water quality parameters. In these cases model uncertainty is already high for present conditions. Nevertheless, accounting for today's uncertainty makes management fairly robust to the foreseen range of potential changes in the next decades. The assessment of total predictive uncertainty allows for selecting management strategies that show small sensitivity to poorly known boundary conditions. 
    The identification of important sources of uncertainty helps to guide future monitoring efforts and pinpoints key indicators whose evolution should be closely followed to adapt management. The possible impact of climate change is clearly demonstrated by water quality changing substantially under individual climate model chains. However, when all climate trajectories are combined, human land use and management decisions have the larger influence on water quality over a time horizon of 2050 in the studied catchment.

  6. Method for estimating effects of unknown correlations in spectral irradiance data on uncertainties of spectrally integrated colorimetric quantities

    NASA Astrophysics Data System (ADS)

    Kärhä, Petri; Vaskuri, Anna; Mäntynen, Henrik; Mikkonen, Nikke; Ikonen, Erkki

    2017-08-01

    Spectral irradiance data are often used to calculate colorimetric properties, such as color coordinates and color temperatures of light sources by integration. The spectral data may contain unknown correlations that should be accounted for in the uncertainty estimation. We propose a new method for estimating uncertainties in such cases. The method goes through all possible scenarios of deviations using Monte Carlo analysis. Varying spectral error functions are produced by combining spectral base functions, and the distorted spectra are used to calculate the colorimetric quantities. Standard deviations of the colorimetric quantities at different scenarios give uncertainties assuming no correlations, uncertainties assuming full correlation, and uncertainties for an unfavorable case of unknown correlations, which turn out to be a significant source of uncertainty. With 1% standard uncertainty in spectral irradiance, the expanded uncertainty of the correlated color temperature of a source corresponding to the CIE Standard Illuminant A may reach as high as 37.2 K in unfavorable conditions, when calculations assuming full correlation give zero uncertainty, and calculations assuming no correlations yield the expanded uncertainties of 5.6 K and 12.1 K, with wavelength steps of 1 nm and 5 nm used in spectral integrations, respectively. We also show that there is an absolute limit of 60.2 K in the error of the correlated color temperature for Standard Illuminant A when assuming 1% standard uncertainty in the spectral irradiance. A comparison of our uncorrelated uncertainties with those obtained using analytical methods by other research groups shows good agreement. We re-estimated the uncertainties for the colorimetric properties of our 1 kW photometric standard lamps using the new method. The revised uncertainty of color temperature is a factor of 2.5 higher than the uncertainty assuming no correlations.
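
The effect of the correlation structure on a spectrally integrated quantity can be illustrated with a toy Monte Carlo in the spirit of the method described. Everything below is an assumption for illustration: the spectrum, the weighting function, and the 1% uncertainty level; no CIE data are used, and note that for an absolute integral like this one a fully correlated scale error does not cancel, unlike the scale-invariant CCT case in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative spectrum and weighting function (arbitrary units), 5 nm step.
wl = np.arange(380.0, 781.0, 5.0)
spectrum = np.exp(-((wl - 560.0) / 120.0) ** 2)  # toy spectral irradiance
weight = np.exp(-((wl - 555.0) / 50.0) ** 2)     # toy weighting function

def integrated(s):
    # Spectrally integrated quantity (rectangular rule, 5 nm step).
    return np.sum(s * weight) * 5.0

n = 5000
u = 0.01  # 1% standard uncertainty at each wavelength

# Uncorrelated: independent noise per wavelength partially averages out.
q_unc = np.array([integrated(spectrum * (1 + rng.normal(0, u, wl.size)))
                  for _ in range(n)])

# Fully correlated: one common scale error shifts the whole spectrum.
q_cor = np.array([integrated(spectrum * (1 + rng.normal(0, u)))
                  for _ in range(n)])

rel_std_unc = q_unc.std() / q_unc.mean()  # well below 1%
rel_std_cor = q_cor.std() / q_cor.mean()  # close to 1%
```

The same Monte Carlo machinery, with structured spectral error functions instead of these two extremes, is what lets the method bracket the unfavorable unknown-correlation case.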

  7. Landmark based localization in urban environment

    NASA Astrophysics Data System (ADS)

    Qu, Xiaozhi; Soheilian, Bahman; Paparoditis, Nicolas

    2018-06-01

    A landmark-based localization method with uncertainty analysis, relying on cameras and geo-referenced landmarks, is presented in this paper. The system is designed to accommodate different camera configurations for six-degree-of-freedom pose estimation. Local bundle adjustment is applied for optimization, and the geo-referenced landmarks are integrated to reduce drift. In particular, uncertainty is analyzed throughout: on the one hand, we estimate the uncertainties of poses to predict the precision of localization; on the other hand, uncertainty propagation is considered for matching, tracking, and landmark registration. The proposed method is evaluated on both the KITTI benchmark and data acquired by a mobile mapping system. In our experiments, decimeter-level accuracy can be reached.

  8. Assessment of BTEX-induced health risk under multiple uncertainties at a petroleum-contaminated site: An integrated fuzzy stochastic approach

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaodong; Huang, Guo H.

    2011-12-01

    Groundwater pollution has attracted increasing attention in recent decades. An assessment of groundwater contamination risk is needed to provide a sound basis for risk-based management decisions. The objective of this study is therefore to develop an integrated fuzzy stochastic approach to evaluate risks of BTEX-contaminated groundwater under multiple uncertainties. It consists of an integrated interval fuzzy subsurface modeling system (IIFMS) and an integrated fuzzy second-order stochastic risk assessment (IFSOSRA) model. The IIFMS is developed based on factorial design, interval analysis, and fuzzy set approaches to predict contaminant concentrations under hybrid uncertainties. Two input parameters (longitudinal dispersivity and porosity) are considered to be uncertain with known fuzzy membership functions, and intrinsic permeability is considered to be an interval number with unknown distribution information. A factorial design is conducted to evaluate interactive effects of the three uncertain factors on the modeling outputs through the developed IIFMS. The IFSOSRA model can systematically quantify variability and uncertainty, as well as their hybrids, presented as fuzzy, stochastic, and second-order stochastic parameters in health risk assessment. The developed approach has been applied to the management of a real-world petroleum-contaminated site in western Canada. The results indicate that multiple uncertainties, under a combination of information with various data-quality levels, can be effectively addressed to support the identification of proper remedial efforts. A unique contribution of this research is the development of an integrated fuzzy stochastic approach for handling various forms of uncertainties associated with simulation and risk assessment efforts.

  9. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    NASA Astrophysics Data System (ADS)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value of and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of model development, so as to quantify not only the uncertainty introduced by the addition of new environmental components, but also the effect that these new components have on existing components (interactions, non-linear responses).
    Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping points) in the face of environmental and anthropogenic change (Perz, Muñoz-Carpena, Kiker and Holt, 2013), and, through Monte Carlo mapping of potential management activities over the most important factors or processes, to influence the system towards behavioral (desirable) outcomes (Chu-Agor, Muñoz-Carpena et al., 2012).

  10. Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff

    NASA Astrophysics Data System (ADS)

    Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.

    2016-03-01

    Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis were performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
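
How the component uncertainties combine can be sketched with the standard root-sum-of-squares rule for independent random errors, with systematic biases kept separate since they do not average out. The percentage values below are placeholders for the example, not the study's estimates.

```python
import math

# Illustrative component standard uncertainties (+/- %) for a measured
# E. coli concentration; values are assumptions, not from the study.
components = {
    "sample collection": 15.0,
    "preservation/storage": 10.0,
    "laboratory analysis": 20.0,
}

# Root-sum-of-squares combination of independent random error sources,
# following the standard propagation rule for uncorrelated uncertainties.
combined_random = math.sqrt(sum(u ** 2 for u in components.values()))

# Systematic (bias) contributions are tracked separately and add linearly,
# e.g. die-off during storage before analysis (%).
biases = {"preservation/storage": -8.0}
combined_bias = sum(biases.values())
```

Keeping the random and systematic parts separate is what allows statements like finding (2) above, where one procedure dominates the random term and another the systematic term.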

  11. The behavior of integral abutment bridges.

    DOT National Transportation Integrated Search

    1999-01-01

    This report presents findings of a literature review, a field trip, and a finite element analysis pertaining to integral bridges. The purpose of the report is to identify problems and uncertainties, and to gain insight into the interactions between t...

  12. Linked Sensitivity Analysis, Calibration, and Uncertainty Analysis Using a System Dynamics Model for Stroke Comparative Effectiveness Research.

    PubMed

    Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B

    2016-11-01

    As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach to linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best-guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection.
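
The Morris screening step that triages parameters into "calibrate", "influential", and "fix at best guess" can be sketched minimally as below. The toy model and all settings are invented for illustration; real use would wrap the System Dynamics model and use its parameter ranges.

```python
import numpy as np

rng = np.random.default_rng(7)

def model(x):
    # Hypothetical stand-in for a simulation model output: one strong
    # linear term, one interaction term, and one inert parameter.
    return 3.0 * x[0] + x[1] * x[0] + 0.0 * x[2]

def morris_screening(f, n_params, n_traj=50, delta=0.1):
    """Minimal one-at-a-time Morris method: mu* measures overall
    importance, sigma flags interactions/non-linearity."""
    effects = np.zeros((n_traj, n_params))
    for t in range(n_traj):
        x = rng.random(n_params)          # random base point in [0, 1)
        fx = f(x)
        for i in range(n_params):
            xp = x.copy()
            xp[i] += delta                # perturb one parameter at a time
            effects[t, i] = (f(xp) - fx) / delta  # elementary effect
    mu_star = np.abs(effects).mean(axis=0)
    sigma = effects.std(axis=0)
    return mu_star, sigma

mu_star, sigma = morris_screening(model, 3)
# Rank parameters: high mu* -> calibrate or collect data, low mu* -> fix.
ranking = np.argsort(-mu_star)
```

Here parameter 0 dominates (large mu*, nonzero sigma from its interaction with parameter 1) while parameter 2 is inert and could safely be fixed, mirroring the 29/7/24 triage described above.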
© The Author(s) 2016.

  13. Data uncertainties in material flow analysis: Municipal solid waste management system in Maputo City, Mozambique.

    PubMed

    Dos Muchangos, Leticia Sarmento; Tokai, Akihiro; Hanashima, Atsuko

    2017-01-01

    Material flow analysis can effectively trace and quantify the flows and stocks of materials such as solid wastes in urban environments. However, the integrity of material flow analysis results is compromised by data uncertainties, an occurrence that is particularly acute in low- and middle-income study contexts. This article investigates the uncertainties in the input data and their effects in a material flow analysis study of municipal solid waste management in Maputo City, the capital of Mozambique. The analysis is based on data collected in 2007 and 2014. Initially, the uncertainties and their ranges were identified by the data classification model of Hedbrant and Sörme, followed by the application of sensitivity analysis. The average lower and upper bounds were 29% and 71%, respectively, in 2007, increasing to 41% and 96%, respectively, in 2014. This indicates higher data quality in 2007 than in 2014. Results also show not only that data are partially missing from established flows, such as waste generation to final disposal, but also that data are limited and inconsistent in emerging flows and processes, such as waste generation to material recovery (hence the wider variation in the 2014 parameters). The sensitivity analysis further clarified the most influential parameter, the degree of influence of each parameter on the waste flows, and the interrelations among the parameters. The findings highlight the need for an integrated municipal solid waste management approach to avoid transferring or worsening the negative impacts among the parameters and flows.

  14. Uncertainty analysis in vulnerability estimations for elements at risk - a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.

  15. SOARCA Peach Bottom Atomic Power Station Long-Term Station Blackout Uncertainty Analysis: Knowledge Advancement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.

    2014-02-01

    This paper describes the knowledge advancements from the uncertainty analysis for the State-of-the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences of each of the selected uncertain parameters, both individually and in interaction with other parameters. The results measure the model response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as the failure rate of the safety relief valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as the dependent effect of cesium chemical form for different accident progressions.

  16. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries.

    PubMed

    Sutton, Abigail M; Rudd, Murray A

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  17. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries

    NASA Astrophysics Data System (ADS)

    Sutton, Abigail M.; Rudd, Murray A.

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economic processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on `expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent `shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  18. Tolerance of uncertainty: Conceptual analysis, integrative model, and implications for healthcare.

    PubMed

    Hillen, Marij A; Gutheil, Caitlin M; Strout, Tania D; Smets, Ellen M A; Han, Paul K J

    2017-05-01

    Uncertainty tolerance (UT) is an important, well-studied phenomenon in health care and many other important domains of life, yet its conceptualization and measurement by researchers in various disciplines have varied substantially and its essential nature remains unclear. The objectives of this study were to: 1) analyze the meaning and logical coherence of UT as conceptualized by developers of UT measures, and 2) develop an integrative conceptual model to guide future empirical research regarding the nature, causes, and effects of UT. A narrative review and conceptual analysis of 18 existing measures of uncertainty and ambiguity tolerance was conducted, focusing on how measure developers in various fields have defined both the "uncertainty" and "tolerance" components of UT, both explicitly through their writings and implicitly through the items constituting their measures. Both explicit and implicit conceptual definitions of uncertainty and tolerance vary substantially and are often poorly and inconsistently specified. A logically coherent, unified understanding or theoretical model of UT is lacking. To address these gaps, we propose a new integrative definition and multidimensional conceptual model that construes UT as the set of negative and positive psychological responses (cognitive, emotional, and behavioral) provoked by the conscious awareness of ignorance about particular aspects of the world. This model synthesizes insights from various disciplines and provides an organizing framework for future research. We discuss how this model can facilitate further empirical and theoretical research to better measure and understand the nature, determinants, and outcomes of UT in health care and other domains of life. Uncertainty tolerance is an important and complex phenomenon requiring more precise and consistent definition.
    An integrative definition and conceptual model, intended as a tentative and flexible point of departure for future research, adds needed breadth, specificity, and precision to efforts to conceptualize and measure UT. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. An interval precise integration method for transient unbalance response analysis of rotor system with uncertainty

    NASA Astrophysics Data System (ADS)

    Fu, Chao; Ren, Xingmin; Yang, Yongfeng; Xia, Yebao; Deng, Wangqun

    2018-07-01

    A non-intrusive interval precise integration method (IPIM) is proposed in this paper to analyze the transient unbalance response of uncertain rotor systems. The transfer matrix method (TMM) is used to derive the deterministic equations of motion of a hollow-shaft overhung rotor. The uncertain transient dynamic problem is solved by combining Chebyshev approximation theory with the modified precise integration method (PIM). Transient response bounds are calculated by interval arithmetic on the expansion coefficients. A brief theoretical error analysis of the proposed method is provided, and its accuracy is further validated by comparison with the scanning method in simulations. Numerical results show that the IPIM maintains good accuracy in vibration prediction for the start-up transient process. Furthermore, the proposed method can also provide theoretical guidance for other transient dynamic mechanical systems with uncertainties.
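
The non-intrusive idea, running the deterministic solver only at Chebyshev nodes of the uncertain interval and bounding the response from the cheap polynomial surrogate, can be sketched as follows. A damped oscillator stands in for the hollow-shaft rotor model, and the interval damping ratio and node count are invented for the example.

```python
import numpy as np

def transient_response(t, zeta):
    # Deterministic stand-in for the rotor solver: free response of a
    # damped oscillator (illustrative, not the paper's TMM model).
    wn = 10.0
    wd = wn * np.sqrt(1.0 - zeta ** 2)
    return np.exp(-zeta * wn * t) * np.sin(wd * t)

# Uncertain damping ratio given only as an interval [0.02, 0.08].
lo, hi = 0.02, 0.08
n_nodes = 7

# Non-intrusive step: evaluate the solver at Chebyshev nodes in [lo, hi].
k = np.arange(n_nodes)
nodes = 0.5 * (lo + hi) + 0.5 * (hi - lo) * np.cos((2 * k + 1) * np.pi
                                                   / (2 * n_nodes))
t = np.linspace(0.0, 2.0, 201)
samples = np.array([transient_response(t, z) for z in nodes])  # (7, 201)

# Fit a Chebyshev surrogate per time step, then take response bounds from
# a dense sweep of the surrogate instead of rerunning the solver.
zz = np.linspace(lo, hi, 400)
lower = np.empty_like(t)
upper = np.empty_like(t)
for j in range(t.size):
    coeffs = np.polynomial.chebyshev.chebfit(
        2 * (nodes - lo) / (hi - lo) - 1, samples[:, j], n_nodes - 1)
    vals = np.polynomial.chebyshev.chebval(2 * (zz - lo) / (hi - lo) - 1,
                                           coeffs)
    lower[j] = vals.min()
    upper[j] = vals.max()
```

`lower` and `upper` form the transient response envelope; the expensive solver is called only `n_nodes` times, which is the advantage over the scanning method used for validation.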

  20. A GIS based spatially-explicit sensitivity and uncertainty analysis approach for multi-criteria decision analysis

    PubMed Central

    Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas

    2014-01-01

    GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We assess the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of the landslide susceptibility model is decomposed and attributed to the model's criteria weights. PMID:25843987
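
    The first two phases can be sketched as follows; the 3x3 pairwise-comparison matrix, criteria values, and perturbation range are illustrative assumptions, not the paper's landslide criteria.

```python
import numpy as np

# Hedged sketch: AHP criteria weights from a pairwise-comparison matrix
# (principal eigenvector), then Monte Carlo perturbation of the weights
# to probe output sensitivity. The matrix and numbers are illustrative.

A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()      # AHP criteria weights

# Saaty consistency ratio: CI / RI, with RI = 0.58 for n = 3
lam_max = eigvals.real.max()
CI = (lam_max - 3.0) / (3.0 - 1.0)
CR = CI / 0.58

# Monte Carlo over weights: jitter, renormalize, recompute a weighted
# suitability score for one hypothetical map cell with criteria values c.
rng = np.random.default_rng(0)
c = np.array([0.7, 0.4, 0.9])
scores = []
for _ in range(2000):
    w = weights * rng.uniform(0.8, 1.2, size=3)
    w /= w.sum()
    scores.append(w @ c)
scores = np.array(scores)
score_mean, score_std = scores.mean(), scores.std()
```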

  1. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints offers the potential to improve operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified, and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
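
    A minimal sketch of the Pareto-front identification step, assuming a synthetic cost model (a single "safety buffer" decision variable and Gaussian arrival-time jitter) in place of the paper's scheduler and heuristic intervention model:

```python
import numpy as np

# Hedged sketch: identify the Pareto front over two stochastic costs
# (expected delay, expected controller interventions), each estimated
# by Monte Carlo for a set of candidate schedules. The cost model is
# synthetic; nothing here reproduces the paper's optimization.

rng = np.random.default_rng(1)

def evaluate(buffer_s, n_mc=500):
    """Expected (delay, interventions) for a given safety buffer."""
    jitter = rng.normal(0.0, 30.0, n_mc)       # arrival-time jitter, s
    delay = buffer_s + np.maximum(0.0, -jitter).mean()
    interventions = (np.abs(jitter) > buffer_s).mean() * 10.0
    return delay, interventions

candidates = np.arange(0, 121, 10)             # buffer sizes, seconds
costs = np.array([evaluate(b) for b in candidates])

def pareto_mask(costs):
    """True for points not dominated in both objectives."""
    mask = np.ones(len(costs), dtype=bool)
    for i, ci in enumerate(costs):
        dominated = np.any(np.all(costs <= ci, axis=1) &
                           np.any(costs < ci, axis=1))
        mask[i] = not dominated
    return mask

front = costs[pareto_mask(costs)]
```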

  2. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context. © 2012 Society for Risk Analysis.
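
    The alpha-cut propagation underlying the possibilistic setting can be sketched for a two-event OR gate, where P_top = 1 - (1 - p1)(1 - p2); the triangular possibility distributions below are illustrative, not the article's case study values.

```python
import numpy as np

# Hedged sketch: alpha-cut interval propagation of possibilistic
# epistemic uncertainty on two basic-event probabilities through an
# OR-gate fault tree. Triangular distributions (min, mode, max) are
# illustrative.

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular possibility dist at level alpha."""
    a, mode, b = tri
    return (a + alpha * (mode - a), b - alpha * (b - mode))

p1 = (1e-3, 2e-3, 5e-3)
p2 = (2e-3, 4e-3, 8e-3)

alphas = np.linspace(0.0, 1.0, 11)
cuts = []
for al in alphas:
    lo1, hi1 = alpha_cut(p1, al)
    lo2, hi2 = alpha_cut(p2, al)
    # the OR gate is monotone increasing in both probabilities, so the
    # interval endpoints map directly through the gate
    top_lo = 1.0 - (1.0 - lo1) * (1.0 - lo2)
    top_hi = 1.0 - (1.0 - hi1) * (1.0 - hi2)
    cuts.append((top_lo, top_hi))

core_lo, core_hi = cuts[-1]       # alpha = 1: most-possible value
support_lo, support_hi = cuts[0]  # alpha = 0: full support
```

The nested family of intervals over alpha defines the possibility distribution of the top-event probability.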

  3. The integrated effects of future climate and hydrologic uncertainty on sustainable flood risk management

    NASA Astrophysics Data System (ADS)

    Steinschneider, S.; Wi, S.; Brown, C. M.

    2013-12-01

    Flood risk management performance is investigated within the context of integrated climate and hydrologic modeling uncertainty to explore system robustness. The research question investigated is whether structural and hydrologic parameterization uncertainties are significant relative to other uncertainties such as climate change when considering water resources system performance. Two hydrologic models are considered, a conceptual, lumped parameter model that preserves the water balance and a physically-based model that preserves both water and energy balances. In the conceptual model, parameter and structural uncertainties are quantified and propagated through the analysis using a Bayesian modeling framework with an innovative error model. Mean climate changes and internal climate variability are explored using an ensemble of simulations from a stochastic weather generator. The approach presented can be used to quantify the sensitivity of flood protection adequacy to different sources of uncertainty in the climate and hydrologic system, enabling the identification of robust projects that maintain adequate performance despite the uncertainties. The method is demonstrated in a case study for the Coralville Reservoir on the Iowa River, where increased flooding over the past several decades has raised questions about potential impacts of climate change on flood protection adequacy.

  4. Robustness Analysis of Integrated LPV-FDI Filters and LTI-FTC System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Khong, Thuan H.; Shin, Jong-Yeob

    2007-01-01

    This paper proposes an analysis framework for robustness analysis of a nonlinear dynamic system that can be represented by a polynomial linear parameter varying (PLPV) system with constant bounded uncertainty. The proposed analysis framework contains three key tools: 1) a function substitution method, which can convert a nonlinear system in polynomial form into a PLPV system, 2) a matrix-based linear fractional transformation (LFT) modeling approach, which can convert a PLPV system into an LFT system with a delta block that includes the key uncertainty and scheduling parameters, and 3) μ-analysis, a well-known robustness analysis tool for linear systems. The proposed analysis framework is applied to evaluating the performance of the LPV fault detection and isolation (FDI) filters of the closed-loop system of a transport aircraft in the presence of unmodeled actuator dynamics and sensor gain uncertainty. The robustness analysis results are compared with nonlinear time simulations.

  5. Integrating uncertainties for climate change mitigation

    NASA Astrophysics Data System (ADS)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    The target of keeping the global average temperature increase below 2°C emerged in the international climate debate more than a decade ago. In response, the scientific community has tried to estimate the costs of reaching such a target through modelling and scenario analysis. Producing such estimates remains a challenge, particularly because of relatively well-known but ill-quantified uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on one side, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other side, has worked on achieving an increasingly better understanding of the geophysical response of the Earth system to emissions of greenhouse gases (GHG). This geophysical response remains a key uncertainty for the cost of mitigation scenarios but has only been integrated with assessments of other uncertainties in a rudimentary manner, i.e., for equilibrium conditions. To bridge this gap between the two research communities, we generate distributions of the costs associated with limiting transient global temperature increase to below specific temperature limits, taking into account uncertainties in multiple dimensions: geophysical, technological, social and political. In other words, we account for uncertainties resulting from our incomplete knowledge about how the climate system precisely reacts to GHG emissions (geophysical uncertainties), about how society will develop (social uncertainties and choices), about which technologies will be available (technological uncertainty and choices), about when we choose to start acting globally on climate change (political choices), and about how much money we are or are not willing to spend to achieve climate change mitigation.
We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, future energy demand, and mitigation technology uncertainties. This provides central input for policy making, since it helps to clarify the relationship between mitigation costs and their potential to reduce the risk of exceeding 2°C, or other temperature limits such as 3°C or 1.5°C, under a wide range of scenarios.

  6. Uncertainty aggregation and reduction in structure-material performance prediction

    NASA Astrophysics Data System (ADS)

    Hu, Zhen; Mahadevan, Sankaran; Ao, Dan

    2018-02-01

    An uncertainty aggregation and reduction framework is presented for structure-material performance prediction. Different types of uncertainty sources, structural analysis model, and material performance prediction model are connected through a Bayesian network for systematic uncertainty aggregation analysis. To reduce the uncertainty in the computational structure-material performance prediction model, Bayesian updating using experimental observation data is investigated based on the Bayesian network. It is observed that the Bayesian updating results will have large error if the model cannot accurately represent the actual physics, and that this error will be propagated to the predicted performance distribution. To address this issue, this paper proposes a novel uncertainty reduction method by integrating Bayesian calibration with model validation adaptively. The observation domain of the quantity of interest is first discretized into multiple segments. An adaptive algorithm is then developed to perform model validation and Bayesian updating over these observation segments sequentially. Only information from observation segments where the model prediction is highly reliable is used for Bayesian updating; this is found to increase the effectiveness and efficiency of uncertainty reduction. A composite rotorcraft hub component fatigue life prediction model, which combines a finite element structural analysis model and a material damage model, is used to demonstrate the proposed method.
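
    A minimal sketch of the segment-wise validate-then-update idea, using a linear stand-in model with an artificial form error injected in one observation segment; nothing here reproduces the rotorcraft fatigue model or the paper's Bayesian network.

```python
import numpy as np

# Hedged sketch: observations are split into segments, and a segment's
# data enter the grid-based Bayesian update only if a crude validation
# test accepts the model there. The linear model and the bias in the
# last segment are illustrative assumptions.

rng = np.random.default_rng(2)
theta_true, sigma = 2.0, 0.1
grid = np.linspace(0.5, 3.5, 301)            # discretized parameter space
posterior = np.ones_like(grid) / grid.size   # flat prior
prior_max = posterior.max()

def model(theta, x):
    return theta * x

segments = [np.linspace(a, a + 1.0, 10) for a in (0.0, 1.0, 2.0, 3.0)]
used = []
for k, xs in enumerate(segments):
    bias = 3.0 if k == 3 else 0.0            # model form error in segment 3
    ys = model(theta_true, xs) + bias + rng.normal(0.0, sigma, xs.size)
    # validation: a within-segment least-squares fit should leave
    # residuals at the noise level if the model form is adequate
    theta_ls = (xs @ ys) / (xs @ xs)
    rms = np.sqrt(np.mean((ys - theta_ls * xs) ** 2))
    if rms < 2.0 * sigma:                    # accept segment for updating
        for x, y in zip(xs, ys):
            posterior *= np.exp(-0.5 * ((y - model(grid, x)) / sigma) ** 2)
            posterior /= posterior.sum()
        used.append(k)

theta_map = grid[np.argmax(posterior)]
```

Only the three unbiased segments pass validation, so the posterior concentrates near the true parameter instead of being corrupted by the invalid segment.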

  7. Integrated Modeling Activities for the James Webb Space Telescope (JWST): Structural-Thermal-Optical Analysis

    NASA Technical Reports Server (NTRS)

    Johnston, John D.; Parrish, Keith; Howard, Joseph M.; Mosier, Gary E.; McGinnis, Mark; Bluth, Marcel; Kim, Kevin; Ha, Hong Q.

    2004-01-01

    This is a continuation of a series of papers on modeling activities for JWST. The structural-thermal-optical, often referred to as "STOP", analysis process is used to predict the effect of thermal distortion on optical performance. The benchmark STOP analysis for JWST assesses the effect of an observatory slew on wavefront error. The paper begins with an overview of multi-disciplinary engineering analysis, or integrated modeling, which is a critical element of the JWST mission. The STOP analysis process is then described. This process consists of the following steps: thermal analysis, structural analysis, and optical analysis. Temperatures predicted using geometric and thermal math models are mapped to the structural finite element model in order to predict thermally-induced deformations. Motions and deformations at optical surfaces are input to optical models, and optical performance is predicted using either an optical ray trace or WFE estimation techniques based on prior ray traces or first-order optics. Following the discussion of the analysis process, results are presented based on models representing the design at the time of the System Requirements Review. In addition to baseline performance predictions, sensitivity studies are performed to assess modeling uncertainties. Of particular interest is the sensitivity of optical performance to uncertainties in temperature predictions and variations in material properties. The paper concludes with a discussion of modeling uncertainty as it pertains to STOP analysis.

  8. Integrated assessment of urban drainage system under the framework of uncertainty analysis.

    PubMed

    Dong, X; Chen, J; Zeng, S; Zhao, D

    2008-01-01

    Due to rapid urbanization and the presence of a large number of aging urban infrastructures in China, urban drainage systems are facing the dual pressure of construction and renovation nationwide. This leads to the need for an integrated assessment when an urban drainage system is being planned or re-designed. In this paper, an integrated assessment methodology is proposed based upon the approaches of analytic hierarchy process (AHP), uncertainty analysis, mathematical simulation of the urban drainage system, and fuzzy assessment. To illustrate this methodology, a case study in Shenzhen City of south China was carried out to evaluate and compare two different urban drainage system renovation plans, i.e., the distributed plan and the centralized plan. Comparing their water quality impacts, ecological impacts, technological feasibility, and economic costs, the integrated performance of the distributed plan is found to be both better and more robust. The proposed methodology is also found to be both effective and practical. (c) IWA Publishing 2008.

  9. Modelling ecological and human exposure to POPs in Venice lagoon - Part II: Quantitative uncertainty and sensitivity analysis in coupled exposure models.

    PubMed

    Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio

    2016-11-01

    The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high tier exposure modelling tool allowing propagation of uncertainty on the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions to describe the data input and the effect on model results by applying sensitivity analysis techniques (screening Morris method, regression analysis, and variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated feasibility of MERLIN-Expo to be successfully employed in integrated, high tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
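
    The Morris screening step mentioned above can be sketched as follows; the three-input toy model stands in for MERLIN-Expo, and the step size and trajectory count are arbitrary illustrative choices.

```python
import numpy as np

# Hedged sketch of Morris screening: estimate elementary effects of
# each input of a toy exposure model via one-at-a-time steps along
# random trajectories. The model f is illustrative, not MERLIN-Expo.

rng = np.random.default_rng(3)

def f(x):
    # toy model: the first two inputs dominate; x[2] is near-inert
    return 4.0 * x[0] + 2.0 * x[1] + 0.05 * x[2] + x[0] * x[1]

k, r, delta = 3, 20, 0.5          # inputs, trajectories, step size
effects = [[] for _ in range(k)]
for _ in range(r):
    x = rng.uniform(0.0, 0.5, k)  # base point, leaving room for the step
    for i in rng.permutation(k):  # one-at-a-time steps in random order
        x2 = x.copy()
        x2[i] += delta
        effects[i].append((f(x2) - f(x)) / delta)
        x = x2

mu_star = np.array([np.mean(np.abs(e)) for e in effects])  # importance
sigma = np.array([np.std(e) for e in effects])  # interaction/nonlinearity
ranking = np.argsort(mu_star)[::-1]
```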

  10. Transit light curves with finite integration time: Fisher information analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Ellen M.; Rogers, Leslie A.

    2014-10-10

    Kepler has revolutionized the study of transiting planets with its unprecedented photometric precision on more than 150,000 target stars. Most of the transiting planet candidates detected by Kepler have been observed as long-cadence targets with 30 minute integration times, and the upcoming Transiting Exoplanet Survey Satellite will record full frame images with a similar integration time. Integrations of 30 minutes affect the transit shape, particularly for small planets and in cases of low signal to noise. Using the Fisher information matrix technique, we derive analytic approximations for the variances and covariances on the transit parameters obtained from fitting light curve photometry collected with a finite integration time. We find that binning the light curve can significantly increase the uncertainties and covariances on the inferred parameters when comparing scenarios with constant total signal to noise (constant total integration time in the absence of read noise). Uncertainties on the transit ingress/egress time increase by a factor of 34 for Earth-size planets and 3.4 for Jupiter-size planets around Sun-like stars for integration times of 30 minutes compared to instantaneously sampled light curves. Similarly, uncertainties on the mid-transit time for Earth and Jupiter-size planets increase by factors of 3.9 and 1.4. Uncertainties on the transit depth are largely unaffected by finite integration times. While correlations among the transit depth, ingress duration, and transit duration all increase in magnitude with longer integration times, the mid-transit time remains uncorrelated with the other parameters. We provide code in Python and Mathematica for predicting the variances and covariances at www.its.caltech.edu/~eprice.
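
    The Fisher-matrix technique can be sketched on a toy Gaussian dip in place of a real transit model: the parameter covariance is the inverse of F = J^T J / sigma^2, and averaging the model over a finite integration window inflates the mid-"transit" time uncertainty. Cadence, noise, and dip parameters below are illustrative, not the paper's values.

```python
import numpy as np

# Hedged sketch: Fisher-information estimate of parameter uncertainties
# for a toy transit-like dip, with and without time binning. Not the
# paper's transit model; all numbers are illustrative.

def dip(t, t0=0.0, depth=0.01, w=0.05):
    return 1.0 - depth * np.exp(-0.5 * ((t - t0) / w) ** 2)

def binned_dip(t, t0, depth, w, tint, n_sub=15):
    """Average the model over an integration window of length tint."""
    offs = (np.arange(n_sub) + 0.5) / n_sub * tint - tint / 2.0
    return np.mean([dip(t + o, t0, depth, w) for o in offs], axis=0)

def sigma_t0(tint, sigma_flux=1e-4, h=1e-5):
    t = np.arange(-0.5, 0.5, 0.01)            # cadence grid
    theta = (0.0, 0.01, 0.05)                 # (t0, depth, w)
    J = np.empty((t.size, 3))                 # numerical Jacobian
    for j in range(3):
        hi, lo = list(theta), list(theta)
        hi[j] += h
        lo[j] -= h
        J[:, j] = (binned_dip(t, *hi, tint) - binned_dip(t, *lo, tint)) / (2 * h)
    F = J.T @ J / sigma_flux ** 2             # Fisher information matrix
    return np.sqrt(np.linalg.inv(F)[0, 0])    # sigma on the mid-dip time

sig_instant = sigma_t0(tint=1e-4)             # effectively instantaneous
sig_binned = sigma_t0(tint=0.2)               # window ~4x the dip width
inflation = sig_binned / sig_instant          # binning inflates sigma(t0)
```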

  11. The Uncertainties on the GIS Based Land Suitability Assessment for Urban and Rural Planning

    NASA Astrophysics Data System (ADS)

    Liu, H.; Zhan, Q.; Zhan, M.

    2017-09-01

    The majority of the research on the uncertainties of spatial data and spatial analysis focuses on specific data features or analysis tools. Few studies have addressed the uncertainties of the whole process of an application such as planning, leaving uncertainty research detached from practical applications. Drawing on a literature review, this paper discusses the uncertainties of geographical information systems (GIS) based land suitability assessment in planning. The uncertainties considered range from the establishment of the index system to the classification of the final result. Methods to reduce the uncertainties arising from the discretization of continuous raster data and from index weight determination are summarized. The paper analyzes the merits and demerits of the "Natural Breaks" method, which is broadly used by planners. It also explores other factors that impact the accuracy of the final classification, such as the selection of class numbers and intervals and the autocorrelation of the spatial data. In conclusion, the paper indicates that machine learning methods should be adapted to reflect the complexity of land suitability assessment. The work contributes to uncertainty research for spatial data and spatial analysis as applied to land suitability assessment, and supports more scientifically grounded planning and decision-making.
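
    The "Natural Breaks" criterion discussed above can be sketched by exhaustive search on a tiny sample (real GIS layers use the Fisher-Jenks dynamic program instead); the suitability scores below are invented for illustration.

```python
import numpy as np
from itertools import combinations

# Hedged sketch of natural-breaks classification: choose class breaks
# that minimize the total within-class squared deviation (the
# goodness-of-variance-fit criterion). Exhaustive search works only
# for tiny samples like this illustrative one.

scores = np.sort(np.array([0.11, 0.12, 0.15, 0.41, 0.44, 0.46,
                           0.48, 0.80, 0.83, 0.86, 0.88, 0.91]))

def sdcm(groups):
    """Sum of squared deviations from the class means."""
    return sum(((g - g.mean()) ** 2).sum() for g in groups)

n_classes = 3
best = None
for cand in combinations(range(1, scores.size), n_classes - 1):
    cost = sdcm(np.split(scores, list(cand)))
    if best is None or cost < best[0]:
        best = (cost, cand)

cost, cuts = best
classes = np.split(scores, list(cuts))
# goodness of variance fit: 1 - SDCM / SDAM (1.0 = perfect classes)
gvf = 1.0 - cost / ((scores - scores.mean()) ** 2).sum()
```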

  12. Probabilistic sizing of laminates with uncertainties

    NASA Technical Reports Server (NTRS)

    Shah, A. R.; Liaw, D. G.; Chamis, C. C.

    1993-01-01

    A reliability-based design methodology for laminate sizing and configuration for a special case of composite structures is described. The methodology combines probabilistic composite mechanics with probabilistic structural analysis. The uncertainties of the constituent materials (fiber and matrix) are simulated using probabilistic theory to predict macroscopic behavior. Uncertainties in the degradation of composite material properties are included in this design methodology. A multi-factor interaction equation is used to evaluate load- and environment-dependent degradation of the composite material properties at the micromechanics level. The methodology is integrated into the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures). The versatility of this design approach is demonstrated by performing a multi-level probabilistic analysis to size the laminates for the design structural reliability of random-type structures. The results show that laminate configurations can be selected to improve the structural reliability from three failures in 1,000 to no failures in one million. Results also show that the laminates with the highest reliability are the least sensitive to the loading conditions.
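
    A minimal Monte Carlo sketch of the probabilistic sizing idea, with a multi-factor-interaction-style degradation term; the distributions, exponent, and limit state are illustrative stand-ins for IPACS, not its actual models.

```python
import numpy as np

# Hedged sketch: Monte Carlo on a ply-level limit state, with strength
# degraded by a multi-factor interaction style temperature term. All
# numbers are assumptions for illustration only.

rng = np.random.default_rng(4)
n = 200_000

strength0 = rng.normal(1500.0, 90.0, n)      # pristine ply strength, MPa
Tgw, T0, m = 420.0, 20.0, 0.5                # degradation anchors, exponent
T = rng.uniform(20.0, 120.0, n)              # service temperature, deg C
strength = strength0 * ((Tgw - T) / (Tgw - T0)) ** m

stress = rng.normal(1050.0, 60.0, n)         # applied ply stress, MPa
pf = np.mean(stress > strength)              # probability of failure

# sizing decision: a thicker laminate lowers ply stress by ~15%,
# driving the failure probability down by orders of magnitude
pf_resized = np.mean(0.85 * stress > strength)
```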

  13. Passivity analysis for uncertain BAM neural networks with time delays and reaction-diffusions

    NASA Astrophysics Data System (ADS)

    Zhou, Jianping; Xu, Shengyuan; Shen, Hao; Zhang, Baoyong

    2013-08-01

    This article deals with the problem of passivity analysis for delayed reaction-diffusion bidirectional associative memory (BAM) neural networks with weight uncertainties. By using a new integral inequality, we first present a passivity condition for the nominal networks, and then extend the result to the case with linear fractional weight uncertainties. The proposed conditions are expressed in terms of linear matrix inequalities, and thus can be checked easily. Examples are provided to demonstrate the effectiveness of the proposed results.

  14. Optimizing integrated airport surface and terminal airspace operations under uncertainty

    NASA Astrophysics Data System (ADS)

    Bosson, Christabelle S.

    In airports and surrounding terminal airspaces, the integration of surface, arrival, and departure scheduling and routing has the potential to improve operational efficiency. Moreover, because both the airport surface and the terminal airspace are often altered by random perturbations, the consideration of uncertainty in flight schedules is crucial to improve the design of robust flight schedules. Previous research mainly focused on independently solving arrival scheduling problems, departure scheduling problems, and surface management scheduling problems, and most of the developed models are deterministic. This dissertation presents an alternate method to model the integrated operations by using a machine job-shop scheduling formulation. A multistage stochastic programming approach is chosen to formulate the problem in the presence of uncertainty, and candidate solutions are obtained by solving sample average approximation problems with finite sample size. The developed mixed-integer linear programming algorithm-based scheduler is capable of computing optimal aircraft schedules and routings that reflect the integration of air and ground operations. The assembled methodology is applied to a Los Angeles case study. To show the benefits of integrated operations over First-Come-First-Served, a preliminary proof-of-concept is conducted for a set of fourteen aircraft evolving under deterministic conditions in a model of the Los Angeles International Airport surface and surrounding terminal areas. Using historical data, a representative 30-minute traffic schedule and aircraft mix scenario is constructed. The results of the Los Angeles application show that the integration of air and ground operations and the use of a time-based separation strategy enable both significant surface and air time savings. The solution computed by the optimization provides a more efficient routing and scheduling than the First-Come-First-Served solution.
Additionally, a data driven analysis is performed for the Los Angeles environment and probabilistic distributions of pertinent uncertainty sources are obtained. A sensitivity analysis is then carried out to assess the methodology performance and find optimal sampling parameters. Finally, simulations of increasing traffic density in the presence of uncertainty are conducted first for integrated arrivals and departures, then for integrated surface and air operations. To compare the optimization results and show the benefits of integrated operations, two aircraft separation methods are implemented that offer different routing options. The simulations of integrated air operations and the simulations of integrated air and surface operations demonstrate that significant traveling time savings, both total and individual surface and air times, can be obtained when more direct routes are allowed to be traveled even in the presence of uncertainty. The resulting routings induce however extra take off delay for departing flights. As a consequence, some flights cannot meet their initial assigned runway slot which engenders runway position shifting when comparing resulting runway sequences computed under both deterministic and stochastic conditions. The optimization is able to compute an optimal runway schedule that represents an optimal balance between total schedule delays and total travel times.
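
    The sample average approximation step can be sketched with a toy cost model: a single runway-buffer decision and exponentially distributed pushback delays stand in for the dissertation's scheduler and uncertainty distributions.

```python
import numpy as np

# Hedged sketch of sample average approximation (SAA): pick the
# candidate decision minimizing average cost over a finite sample of
# uncertainty scenarios. The cost model is a synthetic stand-in.

rng = np.random.default_rng(5)

def cost(buffer_s, delays):
    """Schedule cost: planned buffer delay plus conflict penalties."""
    conflicts = np.maximum(0.0, delays - buffer_s)
    return buffer_s + 5.0 * conflicts.mean()

candidates = np.arange(0, 181, 15)            # buffer sizes, seconds
scenarios = rng.exponential(40.0, size=300)   # sampled pushback delays

# common random numbers: every candidate is scored on the same sample,
# which makes the pairwise comparisons far less noisy
avg_costs = np.array([cost(b, scenarios) for b in candidates])
best_buffer = candidates[np.argmin(avg_costs)]
```

With the exponential delay model the true optimum is near 40·ln 5 ≈ 64 s, so the SAA minimizer lands on one of the nearby candidates.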

  15. Uncertainty analysis of integrated gasification combined cycle systems based on Frame 7H versus 7F gas turbines.

    PubMed

    Zhu, Yunhua; Frey, H Christopher

    2006-12-01

    Integrated gasification combined cycle (IGCC) technology is a promising alternative for clean generation of power and coproduction of chemicals from coal and other feedstocks. Advanced concepts for IGCC systems that incorporate state-of-the-art gas turbine systems, however, are not commercially demonstrated. Therefore, there is uncertainty regarding the future commercial-scale performance, emissions, and cost of such technologies. The Frame 7F gas turbine represents current state-of-practice, whereas the Frame 7H is the most recently introduced advanced commercial gas turbine. The objective of this study was to evaluate the risks and potential payoffs of IGCC technology based on different gas turbine combined cycle designs. Models of entrained-flow gasifier-based IGCC systems with Frame 7F (IGCC-7F) and 7H gas turbine combined cycles (IGCC-7H) were developed in ASPEN Plus. An uncertainty analysis was conducted. Gasifier carbon conversion and project cost uncertainty are identified as the most important uncertain inputs with respect to system performance and cost. The uncertainties in the difference of the efficiencies and costs for the two systems are characterized. Despite uncertainty, the IGCC-7H system is robustly preferred to the IGCC-7F system. Advances in gas turbine design will improve the performance, emissions, and cost of IGCC systems. The implications of this study for decision-making regarding technology selection, research planning, and plant operation are discussed.

  16. Integrating Solar PV in Utility System Operations: Analytical Framework and Arizona Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Jing; Botterud, Audun; Mills, Andrew

    2015-06-01

    A systematic framework is proposed to estimate the impact on operating costs due to uncertainty and variability in renewable resources. The framework quantifies the integration costs associated with subhourly variability and uncertainty as well as day-ahead forecasting errors in solar PV (photovoltaics) power. A case study illustrates how changes in system operations may affect these costs for a utility in the southwestern United States (Arizona Public Service Company). We conduct an extensive sensitivity analysis under different assumptions about balancing reserves, system flexibility, fuel prices, and forecasting errors. We find that high solar PV penetrations may lead to operational challenges, particularly during low-load and high solar periods. Increased system flexibility is essential for minimizing integration costs and maintaining reliability. In a set of sensitivity cases where such flexibility is provided, in part, by flexible operations of nuclear power plants, the estimated integration costs vary between $1.0 and $4.4/MWh-PV for a PV penetration level of 17%. The integration costs are primarily due to higher needs for hour-ahead balancing reserves to address the increased sub-hourly variability and uncertainty in the PV resource. (C) 2015 Elsevier Ltd. All rights reserved.

  17. Analysis of algal bloom risk with uncertainties in lakes by integrating self-organizing map and fuzzy information theory.

    PubMed

    Chen, Qiuwen; Rui, Han; Li, Weifeng; Zhang, Yanhui

    2014-06-01

    Algal blooms are a serious problem in many water bodies; they damage aquatic ecosystems and threaten drinking water safety. However, the outbreak mechanism of algal blooms is very complex and highly uncertain, especially for large water bodies where environmental conditions vary markedly in both space and time. This study developed an innovative method which integrates a self-organizing map (SOM) and fuzzy information diffusion theory to comprehensively analyze algal bloom risks with uncertainties. Lake Taihu was taken as the study case, using long-term (2004-2010) on-site monitoring data. The results showed that algal blooms in Lake Taihu fell into four categories and exhibited obvious spatial-temporal patterns. The lake was mainly characterized by moderate blooms with high uncertainty, whereas severe blooms with low uncertainty were observed in the northwest part of the lake. The study gives insight into the spatial-temporal dynamics of algal blooms, and should help government and decision-makers outline policies and practices on bloom monitoring and prevention. The developed method provides a promising approach to estimate algal bloom risks under uncertainties. Copyright © 2014 Elsevier B.V. All rights reserved.
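
    A minimal 1-D self-organizing map on synthetic observations illustrates the clustering step; the four nodes mirror the four bloom categories reported above, but the data and parameters are invented, not the Taihu monitoring record.

```python
import numpy as np

# Hedged sketch: a tiny 1-D SOM clusters synthetic (chlorophyll-a,
# nutrient) observations into four bloom-severity categories. Data,
# learning schedule, and node count are illustrative choices.

rng = np.random.default_rng(6)

centers = np.array([[0.1, 0.2], [0.3, 0.4], [0.6, 0.6], [0.9, 0.8]])
data = np.vstack([c + rng.normal(0.0, 0.03, (80, 2)) for c in centers])
rng.shuffle(data)

nodes = np.linspace((0.0, 0.0), (1.0, 1.0), 4)  # ordered codebook init
for epoch in range(30):
    lr = 0.5 * (1.0 - epoch / 30.0)             # decaying learning rate
    radius = max(1.0 - epoch / 30.0, 0.01)      # shrinking neighborhood
    for x in data:
        bmu = np.argmin(((nodes - x) ** 2).sum(axis=1))  # best match
        for j in range(4):                      # 1-D chain neighborhood
            h = np.exp(-((j - bmu) ** 2) / (2.0 * radius ** 2))
            nodes[j] += lr * h * (x - nodes[j])

# map each observation to its best-matching node (bloom category)
labels = np.array([np.argmin(((nodes - x) ** 2).sum(axis=1)) for x in data])
```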

  18. Receiving water quality assessment: comparison between simplified and detailed integrated urban modelling approaches.

    PubMed

    Mannina, Giorgio; Viviani, Gaspare

    2010-01-01

    Urban water quality management often requires use of numerical models allowing the evaluation of the cause-effect relationship between the input(s) (i.e. rainfall, pollutant concentrations on catchment surface and in sewer system) and the resulting water quality response. The conventional approach to the system (i.e. sewer system, wastewater treatment plant and receiving water body), considering each component separately, does not enable optimisation of the whole system. However, recent gains in understanding and modelling make it possible to represent the system as a whole and optimise its overall performance. Indeed, integrated urban drainage modelling is of growing interest for tools to cope with Water Framework Directive requirements. Two different approaches can be employed for modelling the whole urban drainage system: detailed and simplified. Each has its advantages and disadvantages. Specifically, detailed approaches can offer a higher level of reliability in the model results, but can be very time consuming from the computational point of view. Simplified approaches are faster but may lead to greater model uncertainty due to an over-simplification. To gain insight into the above problem, two different modelling approaches have been compared with respect to their uncertainty. The first urban drainage integrated model approach uses the Saint-Venant equations and the 1D advection-dispersion equations, for the quantity and for the quality aspects, respectively. The second model approach consists of the simplified reservoir model. The analysis used a parsimonious bespoke model developed in previous studies. For the uncertainty analysis, the Generalised Likelihood Uncertainty Estimation (GLUE) procedure was used. Model reliability was evaluated on the basis of capacity of globally limiting the uncertainty. Both models have a good capability to fit the experimental data, suggesting that all adopted approaches are equivalent both for quantity and quality. 
The detailed model approach is more robust and yields narrower uncertainty bands. On the other hand, the simplified river water quality model shows higher uncertainty and may be unsuitable for receiving water body quality assessment.
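The GLUE procedure used in this study can be sketched in a few lines: sample parameters from a prior range, score each run with a likelihood measure, keep the "behavioral" runs, and form weighted quantile bands. The one-parameter recession model, prior range, and threshold below are illustrative assumptions, not the models from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, t):
    # Hypothetical one-parameter linear-reservoir recession model.
    return 10.0 * np.exp(-theta * t)

t = np.linspace(0, 5, 50)
observed = simulate(0.8, t) + rng.normal(0, 0.3, t.size)

# GLUE: sample parameters from a prior range, score each run with a
# Nash-Sutcliffe-style likelihood, keep "behavioral" runs above a threshold.
thetas = rng.uniform(0.1, 2.0, 5000)
sims = np.array([simulate(th, t) for th in thetas])
nse = 1 - np.sum((sims - observed) ** 2, axis=1) / np.sum((observed - observed.mean()) ** 2)
behavioral = nse > 0.5
weights = nse[behavioral] / nse[behavioral].sum()

def weighted_quantile(values, q, w):
    # Quantile of `values` under likelihood weights `w`.
    order = np.argsort(values)
    cw = np.cumsum(w[order])
    return np.interp(q, cw / cw[-1], values[order])

# Weighted 5-95% uncertainty bands at each time step.
bands = np.array([[weighted_quantile(sims[behavioral, i], q, weights)
                   for q in (0.05, 0.95)] for i in range(t.size)])
print(f"{behavioral.sum()} behavioral runs; "
      f"mean band width = {np.mean(bands[:, 1] - bands[:, 0]):.2f}")
```

The width of the resulting bands is the "capability to globally limit uncertainty" against which the two model structures are compared.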

  19. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. The approach incorporates experimental, computational, and analytical flight dynamics modeling methods for aerodynamics, structures, and propulsion. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model composed of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. 
The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  20. Peak fitting and integration uncertainties for the Aerodyne Aerosol Mass Spectrometer

    NASA Astrophysics Data System (ADS)

    Corbin, J. C.; Othman, A.; Haskins, J. D.; Allan, J. D.; Sierau, B.; Worsnop, D. R.; Lohmann, U.; Mensah, A. A.

    2015-04-01

    The errors inherent in the fitting and integration of the pseudo-Gaussian ion peaks in Aerodyne High-Resolution Aerosol Mass Spectrometers (HR-AMS's) have not been previously addressed as a source of imprecision for these instruments. This manuscript evaluates the significance of these uncertainties and proposes a method for their estimation in routine data analysis. Peak-fitting uncertainties, the most complex source of integration uncertainties, are found to be dominated by errors in m/z calibration. These calibration errors comprise significant amounts of both imprecision and bias, and vary in magnitude from ion to ion. The magnitude of these m/z calibration errors is estimated for an exemplary data set, and used to construct a Monte Carlo model which reproduced well the observed trends in fits to the real data. The empirically-constrained model is used to show that the imprecision in the fitted height of isolated peaks scales linearly with the peak height (i.e., as n^1), thus contributing a constant-relative-imprecision term to the overall uncertainty. This constant relative imprecision term dominates the Poisson counting imprecision term (which scales as n^0.5) at high signals. The previous HR-AMS uncertainty model therefore underestimates the overall fitting imprecision. The constant relative imprecision in fitted peak height for isolated peaks in the exemplary data set was estimated as ~4% and the overall peak-integration imprecision was approximately 5%. We illustrate the importance of this constant relative imprecision term by performing Positive Matrix Factorization (PMF) on a synthetic HR-AMS data set with and without its inclusion. Finally, the ability of an empirically-constrained Monte Carlo approach to estimate the fitting imprecision for an arbitrary number of known overlapping peaks is demonstrated. Software is available upon request to estimate these error terms in new data sets.
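The scaling argument above can be checked numerically. The quadrature combination below is a sketch of the uncertainty model, not the AMS software; the 4% constant relative fitting imprecision is the paper's exemplary estimate.

```python
import numpy as np

def relative_imprecision(n, fit_frac=0.04):
    """Combined relative peak-integration imprecision for signal n (ions).

    Poisson counting term scales as n**0.5 (absolute), while the
    empirically estimated peak-fitting term scales as n**1 with a
    ~4% relative magnitude; the two are combined in quadrature.
    """
    poisson = np.sqrt(n)      # counting imprecision, absolute
    fitting = fit_frac * n    # m/z-calibration-driven fitting imprecision
    return np.sqrt(poisson**2 + fitting**2) / n

for n in (1e2, 1e4, 1e6):
    print(f"n = {n:>9.0f}: relative imprecision = {100 * relative_imprecision(n):.2f}%")
```

At low signal the Poisson term dominates; at high signal the combined imprecision asymptotes to the constant ~4% fitting term, which a Poisson-only model would miss entirely.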

  1. CASL L1 Milestone report : CASL.P4.01, sensitivity and uncertainty analysis for CIPS with VIPRE-W and BOA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sung, Yixing; Adams, Brian M.; Secker, Jeffrey R.

    2011-12-01

    The CASL Level 1 Milestone CASL.P4.01, successfully completed in December 2011, aimed to 'conduct, using methodologies integrated into VERA, a detailed sensitivity analysis and uncertainty quantification of a crud-relevant problem with baseline VERA capabilities (ANC/VIPRE-W/BOA).' The VUQ focus area led this effort, in partnership with AMA, and with support from VRI. DAKOTA was coupled to existing VIPRE-W thermal-hydraulics and BOA crud/boron deposit simulations representing a pressurized water reactor (PWR) that previously experienced crud-induced power shift (CIPS). This work supports understanding of CIPS by exploring the sensitivity and uncertainty in BOA outputs with respect to uncertain operating and model parameters. This report summarizes work coupling the software tools, characterizing uncertainties, and analyzing the results of iterative sensitivity and uncertainty studies. These studies focused on sensitivity and uncertainty of CIPS indicators calculated by the current version of the BOA code used in the industry. Challenges with this kind of analysis are identified to inform follow-on research goals and VERA development targeting crud-related challenge problems.

  2. A comparative study of multivariable robustness analysis methods as applied to integrated flight and propulsion control

    NASA Technical Reports Server (NTRS)

    Schierman, John D.; Lovell, T. A.; Schmidt, David K.

    1993-01-01

    Three multivariable robustness analysis methods are compared and contrasted. The focus of the analysis is on system stability and performance robustness to uncertainty in the coupling dynamics between two interacting subsystems. Of particular interest is interacting airframe and engine subsystems, and an example airframe/engine vehicle configuration is utilized in the demonstration of these approaches. The singular value (SV) and structured singular value (SSV) analysis methods are compared to a method especially well suited for analysis of robustness to uncertainties in subsystem interactions. This approach is referred to here as the interacting subsystem (IS) analysis method. This method has been used previously to analyze airframe/engine systems, emphasizing the study of stability robustness. However, performance robustness is also investigated here, and a new measure of allowable uncertainty for acceptable performance robustness is introduced. The IS methodology does not require plant uncertainty models to measure the robustness of the system, and is shown to yield valuable information regarding the effects of subsystem interactions. In contrast, the SV and SSV methods allow for the evaluation of the robustness of the system to particular models of uncertainty, and do not directly indicate how the airframe (engine) subsystem interacts with the engine (airframe) subsystem.

  3. Sustainability of fisheries through marine reserves: a robust modeling analysis.

    PubMed

    Doyen, L; Béné, C

    2003-09-01

    Among the many factors that contribute to overexploitation of marine fisheries, the role played by uncertainty is important. This uncertainty includes both the scientific uncertainties related to the resource dynamics or assessments and the uncontrollability of catches. Some recent works advocate the use of marine reserves as a central element of future stock management. In the present paper, we study the influence of protected areas upon fisheries sustainability through a simple dynamic model integrating non-stochastic harvesting uncertainty and a constraint of safe minimum biomass level. Using the mathematical concept of invariance kernel in a robust and worst-case context, we examine through a formal modeling analysis how marine reserves might guarantee viable fisheries. We also show that the sustainability requirement does not necessarily conflict with optimization of catches. Numerical simulations are provided to illustrate the main findings.

  4. Information-integration category learning and the human uncertainty response.

    PubMed

    Paul, Erick J; Boomer, Joseph; Smith, J David; Ashby, F Gregory

    2011-04-01

    The human response to uncertainty has been well studied in tasks requiring attention and declarative memory systems. However, uncertainty monitoring and control have not been studied in multi-dimensional, information-integration categorization tasks that rely on non-declarative procedural memory. Three experiments are described that investigated the human uncertainty response in such tasks. Experiment 1 showed that following standard categorization training, uncertainty responding was similar in information-integration tasks and rule-based tasks requiring declarative memory. In Experiment 2, however, uncertainty responding in untrained information-integration tasks impaired the ability of many participants to master those tasks. Finally, Experiment 3 showed that the deficit observed in Experiment 2 was not because of the uncertainty response option per se, but rather because the uncertainty response provided participants a mechanism via which to eliminate stimuli that were inconsistent with a simple declarative response strategy. These results are considered in the light of recent models of category learning and metacognition.

  5. Sensitivity Analysis of the Integrated Medical Model for ISS Programs

    NASA Technical Reports Server (NTRS)

    Goodenow, D. A.; Myers, J. G.; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Young, M.

    2016-01-01

    Sensitivity analysis estimates the relative contribution of the uncertainty in input values to the uncertainty of model outputs. Partial Rank Correlation Coefficient (PRCC) and Standardized Rank Regression Coefficient (SRRC) are methods of conducting sensitivity analysis on nonlinear simulation models like the Integrated Medical Model (IMM). The PRCC method estimates the sensitivity using partial correlation of the ranks of the generated input values to each generated output value. The partial part is so named because adjustments are made for the linear effects of all the other input values in the calculation of correlation between a particular input and each output. In SRRC, standardized regression-based coefficients measure the sensitivity of each input, adjusted for all the other inputs, on each output. Because the relative ranking of each of the inputs and outputs is used, as opposed to the values themselves, both methods accommodate the nonlinear relationship of the underlying model. As part of the IMM v4.0 validation study, simulations are available that predict 33 person-missions on ISS and 111 person-missions on STS. These simulated data predictions feed the sensitivity analysis procedures. The inputs to the sensitivity procedures include the number of occurrences of each of the one hundred IMM medical conditions generated over the simulations and the associated IMM outputs: total quality time lost (QTL), number of evacuations (EVAC), and number of loss of crew lives (LOCL). The IMM team will report the results of using PRCC and SRRC on IMM v4.0 predictions of the ISS and STS missions created as part of the external validation study. Tornado plots will assist in the visualization of the condition-related input sensitivities to each of the main outcomes. The outcomes of this sensitivity analysis will drive review focus by identifying conditions where changes in uncertainty could drive changes in overall model output uncertainty. 
These efforts are an integral part of the overall verification, validation, and credibility review of IMM v4.0.
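The PRCC method described above can be sketched directly: rank-transform inputs and output, regress the other inputs out of both, and correlate the residuals. The data below are illustrative, not IMM outputs.

```python
import numpy as np

def prcc(X, y):
    """Partial Rank Correlation Coefficient of each column of X with y.

    Rank-transform everything, then for each input correlate the
    residuals after linearly regressing out the other inputs
    (the 'partial' adjustment described in the abstract).
    """
    def ranks(a):
        return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)

    Xr, yr = ranks(X), ranks(y[:, None])[:, 0]
    out = []
    for j in range(Xr.shape[1]):
        others = np.delete(Xr, j, axis=1)
        A = np.column_stack([others, np.ones(len(yr))])
        rx = Xr[:, j] - A @ np.linalg.lstsq(A, Xr[:, j], rcond=None)[0]
        ry = yr - A @ np.linalg.lstsq(A, yr, rcond=None)[0]
        out.append(np.corrcoef(rx, ry)[0, 1])
    return np.array(out)

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 3))
# Nonlinear but monotonic response: input 0 dominates, input 2 is inert.
y = np.exp(2 * X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.05, 500)
print(np.round(prcc(X, y), 2))
```

Because the transform is rank-based, the strongly nonlinear (but monotonic) dependence on the first input is still recovered as a coefficient near 1, which is exactly why PRCC suits nonlinear simulators like the IMM.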

  6. An integrated modeling approach to support management decisions of coupled groundwater-agricultural systems under multiple uncertainties

    NASA Astrophysics Data System (ADS)

    Hagos Subagadis, Yohannes; Schütze, Niels; Grundmann, Jens

    2015-04-01

    The planning and implementation of effective water resources management strategies need an assessment of multiple (physical, environmental, and socio-economic) issues, and often require new research in which knowledge of diverse disciplines is combined in a unified methodological and operational framework. Such integrative research to link different knowledge domains faces several practical challenges. These complexities are further compounded by multiple actors frequently with conflicting interests and multiple uncertainties about the consequences of potential management decisions. A fuzzy-stochastic multiple criteria decision analysis tool was developed in this study to systematically quantify both probabilistic and fuzzy uncertainties associated with complex hydrosystems management. It integrated physical process-based models, fuzzy logic, expert involvement and stochastic simulation within a general framework. Subsequently, the proposed new approach is applied to a water-scarce coastal arid region water management problem in northern Oman, where saltwater intrusion into a coastal aquifer due to excessive groundwater extraction for irrigated agriculture has affected the aquifer sustainability, endangering associated socio-economic conditions as well as traditional social structure. Results from the developed method have provided key decision alternatives which can serve as a platform for negotiation and further exploration. In addition, this approach has enabled systematic quantification of both probabilistic and fuzzy uncertainties associated with the decision problem. Sensitivity analysis applied within the developed tool has shown that the decision makers' risk aversion and risk taking attitude may yield different rankings of decision alternatives. 
The developed approach can be applied to address the complexities and uncertainties inherent in water resources systems to support management decisions, while serving as a platform for stakeholder participation.

  7. Operational Implementation of a Pc Uncertainty Construct for Conjunction Assessment Risk Analysis

    NASA Technical Reports Server (NTRS)

    Newman, Lauri K.; Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    Earlier this year the NASA Conjunction Assessment and Risk Analysis (CARA) project presented the theoretical and algorithmic aspects of a method to include the uncertainties in the calculation inputs when computing the probability of collision (Pc) between two space objects, principally uncertainties in the covariances and the hard-body radius. Rather than a single Pc value, this approach produces an entire probability density function representing the range of possible Pc values given the uncertainties in the inputs, bringing CA risk analysis methodologies more in line with modern risk management theory. The present study provides results from the exercise of this method against an extended dataset of satellite conjunctions in order to determine the effect of its use on the evaluation of conjunction assessment (CA) event risk posture. The effects are found to be considerable: a good number of events are downgraded from or upgraded to a serious risk designation on the basis of consideration of the Pc uncertainty. The findings counsel the integration of the developed methods into NASA CA operations.
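The idea of a Pc distribution rather than a single Pc value can be illustrated with a toy two-dimensional encounter-plane calculation. The lognormal covariance-scale and uniform hard-body-radius models below are invented for illustration; they are not the CARA algorithms.

```python
import numpy as np

rng = np.random.default_rng(2)

def pc_2d(miss, sigma, hbr, n=200_000):
    # Encounter-plane Monte Carlo Pc: fraction of Gaussian-distributed
    # relative positions falling inside the hard-body circle.
    pts = rng.normal([miss, 0.0], sigma, size=(n, 2))
    return np.mean(np.hypot(pts[:, 0], pts[:, 1]) < hbr)

# Treat the covariance scale and hard-body radius as uncertain inputs
# and propagate them into a distribution of Pc values.
scales = rng.lognormal(mean=0.0, sigma=0.3, size=200)
hbrs = rng.uniform(15.0, 25.0, size=200)
pcs = np.array([pc_2d(500.0, 200.0 * s, h, n=20_000) for s, h in zip(scales, hbrs)])

print(f"point estimate Pc ~ {pc_2d(500.0, 200.0, 20.0):.1e}")
print(f"Pc 5-95% range over input uncertainty: "
      f"{np.quantile(pcs, 0.05):.1e} to {np.quantile(pcs, 0.95):.1e}")
```

The spread of the `pcs` distribution is what can push an event across a risk threshold that the single point estimate alone would not reach, which is the mechanism behind the upgrades and downgrades reported above.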

  8. Automated Planning and Scheduling for Space Mission Operations

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Jonsson, Ari; Knight, Russell

    2005-01-01

    Research Trends: a) Finite-capacity scheduling under more complex constraints and increased problem dimensionality (subcontracting, overtime, lot splitting, inventory, etc.) b) Integrated planning and scheduling. c) Mixed-initiative frameworks. d) Management of uncertainty (proactive and reactive). e) Autonomous agent architectures and distributed production management. f) Integration of machine learning capabilities. g) Wider scope of applications: 1) analysis of supplier/buyer protocols & tradeoffs; 2) integration of strategic & tactical decision-making; and 3) enterprise integration.

  9. An Integrated Gate Turnaround Management Concept Leveraging Big Data/Analytics for NAS Performance Improvements

    NASA Technical Reports Server (NTRS)

    Chung, William; Chachad, Girish; Hochstetler, Ronald

    2016-01-01

    The Integrated Gate Turnaround Management (IGTM) concept was developed to improve the gate turnaround performance at the airport by leveraging relevant historical data to support optimization of airport gate operations, which include: taxi to the gate, gate services, push back, taxi to the runway, and takeoff, based on available resources, constraints, and uncertainties. By analyzing events of gate operations, primary performance dependent attributes of these events were identified for the historical data analysis such that performance models can be developed based on uncertainties to support descriptive, predictive, and prescriptive functions. A system architecture was developed to examine system requirements in support of such a concept. An IGTM prototype was developed to demonstrate the concept using a distributed network and collaborative decision tools for stakeholders to meet on-time pushback performance under uncertainty.

  10. Uncertainty Propagation for Terrestrial Mobile Laser Scanner

    NASA Astrophysics Data System (ADS)

    Mezian, c.; Vallet, Bruno; Soheilian, Bahman; Paparoditis, Nicolas

    2016-06-01

    Laser scanners are used more and more in mobile mapping systems. They provide 3D point clouds that are used for object reconstruction and registration of the system. For both of those applications, uncertainty analysis of the 3D points is of great interest but rarely investigated in the literature. In this paper we present a complete pipeline that takes into account all the sources of uncertainty and allows computation of a covariance matrix per 3D point. The sources of uncertainty are the laser scanner, the calibration of the scanner in relation to the vehicle, and the direct georeferencing system. We assume that all the uncertainties follow Gaussian distributions. The variances of the laser scanner measurements (two angles and one distance) are usually provided by the manufacturers. This is also the case for integrated direct georeferencing devices. Residuals of the calibration process were used to estimate the covariance matrix of the 6D transformation between the laser scanner and the vehicle frame. Knowing the variances of all sources of uncertainty, we applied the uncertainty propagation technique to compute the variance-covariance matrix of every obtained 3D point. Such an uncertainty analysis enables estimation of the impact of different laser scanners and georeferencing devices on the quality of the obtained 3D points. The obtained uncertainty values were illustrated using error ellipsoids on different datasets.
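The core step of such a pipeline, first-order propagation of the range and angle variances into a per-point Cartesian covariance, can be sketched as follows. The numerical values are typical orders of magnitude, not the specification of any particular scanner.

```python
import numpy as np

def spherical_to_cartesian_cov(r, theta, phi, sigma_r, sigma_t, sigma_p):
    """First-order (Jacobian) propagation of range/angle variances to a
    3x3 Cartesian covariance for one scan point. Angles in radians."""
    st, ct = np.sin(theta), np.cos(theta)
    sp, cp = np.sin(phi), np.cos(phi)
    # x = r*ct*cp, y = r*ct*sp, z = r*st; J = d(x,y,z)/d(r,theta,phi).
    J = np.array([[ct * cp, -r * st * cp, -r * ct * sp],
                  [ct * sp, -r * st * sp,  r * ct * cp],
                  [st,       r * ct,       0.0]])
    S = np.diag([sigma_r**2, sigma_t**2, sigma_p**2])
    return J @ S @ J.T   # Sigma_xyz = J Sigma_measurement J^T

# Assumed values: 8 mm range noise, 0.1 mrad angular noise, 30 m range.
C = spherical_to_cartesian_cov(r=30.0, theta=0.2, phi=1.0,
                               sigma_r=0.008, sigma_t=1e-4, sigma_p=1e-4)
print("1-sigma error ellipsoid semi-axes (m):", np.sqrt(np.linalg.eigvalsh(C)))
```

The eigenvalues of the resulting covariance give the semi-axes of the error ellipsoids used for visualization; in the full pipeline the calibration and georeferencing covariances would be composed with this one in the same Jacobian fashion.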

  11. A tool for efficient, model-independent management optimization under uncertainty

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
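The chance-constraint mechanism described above can be sketched with a two-well pumping problem: the deterministic constraint bound is tightened by a risk-dependent multiple of the FOSM-estimated constraint standard deviation. The response coefficients and uncertainty values below are invented for illustration and are unrelated to PESTPP-OPT's internals.

```python
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

# Maximize total pumping from two wells subject to a drawdown limit at a
# control point. A holds model-derived response coefficients (assumed).
A = np.array([[0.8, 0.5]])   # drawdown per unit pumping rate
b = np.array([10.0])         # allowed drawdown
sigma_g = 1.2                # FOSM std. dev. of the simulated constraint
risk = 0.95                  # required probability of satisfying it

# Chance constraint: shift the bound by z*sigma so that, under an
# approximately Gaussian constraint error, P(A x <= b) >= risk.
b_chance = b - norm.ppf(risk) * sigma_g

det = linprog(c=[-1, -1], A_ub=A, b_ub=b, bounds=[(0, 8), (0, 8)])
cc = linprog(c=[-1, -1], A_ub=A, b_ub=b_chance, bounds=[(0, 8), (0, 8)])
print(f"risk-neutral pumping: {-det.fun:.2f}, risk-averse (95%): {-cc.fun:.2f}")
```

The risk-averse optimum is smaller: the "single answer" now carries the cost of model input and observation uncertainty, which is exactly the trade the user-specified risk value controls.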

  12. Geophysical data integration and conditional uncertainty analysis on hydraulic conductivity estimation

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Carlson, D.A.; Willson, C.S.

    2007-01-01

    Integration of various geophysical data is essential to better understand aquifer heterogeneity. However, data integration is challenging because there are different levels of support between primary and secondary data needed to be correlated in various ways. This study proposes a geostatistical method to integrate the hydraulic conductivity measurements and electrical resistivity data to better estimate the hydraulic conductivity (K) distribution. The K measurements are obtained from the pumping tests and represent the primary data (hard data). The borehole electrical resistivity data from electrical logs are regarded as the secondary data (soft data). The electrical resistivity data is used to infer hydraulic conductivity values through the Archie law and Kozeny-Carman equation. A pseudo cross-semivariogram is developed to cope with the resistivity data non-collocation. Uncertainty in the auto-semivariograms and pseudo cross-semivariogram is quantified. The methodology is demonstrated by a real-world case study where the hydraulic conductivity is estimated in the Upper Chicot aquifer of Southwestern Louisiana. The groundwater responses by the cokriging and cosimulation of hydraulic conductivity are compared using analysis of variance (ANOVA). ?? 2007 ASCE.
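The soft-data transform mentioned above, resistivity to porosity via Archie's law and porosity to hydraulic conductivity via the Kozeny-Carman equation, can be sketched as follows. The constants (Archie's a and m, grain size d50) are illustrative defaults, not values calibrated to the Chicot aquifer.

```python
import numpy as np

def resistivity_to_K(R_o, R_w, m=2.0, a=1.0, d50=2e-4):
    """Map formation resistivity R_o (ohm-m) to hydraulic conductivity.

    Archie's law: R_o = a * R_w * phi**(-m)  ->  phi = (a*R_w/R_o)**(1/m)
    Kozeny-Carman: K = (g/nu) * (d50**2/180) * phi**3 / (1-phi)**2
    All constants here are illustrative, not aquifer-calibrated.
    """
    phi = (a * R_w / R_o) ** (1.0 / m)     # porosity from resistivity
    g, nu = 9.81, 1e-6                     # gravity (m/s^2), kinematic viscosity (m^2/s)
    K = (g / nu) * (d50**2 / 180.0) * phi**3 / (1 - phi) ** 2
    return phi, K

phi, K = resistivity_to_K(R_o=40.0, R_w=2.0)
print(f"porosity ~ {phi:.2f}, K ~ {K:.1e} m/s")
```

In the study's geostatistical setting, K values derived this way enter as secondary (soft) data correlated with the pumping-test K through the pseudo cross-semivariogram, rather than being used as direct estimates.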

  13. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  14. A noise thermometry investigation of the melting point of gallium at the NIM

    NASA Astrophysics Data System (ADS)

    Zhang, J. T.; Xue, S.

    2006-06-01

    This paper describes a study of the melting point of gallium with the new NIM Johnson noise thermometer (JNT). The new thermometer adopts the structure of switching correlator and commutator with the reference resistor maintained at the triple point of water. The electronic system of the new thermometer is basically the same as the current JNT, but the preamplifiers have been improved slightly. This study demonstrates that examining the characteristics of the noise signals in the frequency domain is of critical importance in constructing an improved new thermometer, where a power spectral analysis is found to be critical in establishing appropriate grounding for the new thermometer. The new JNT is tested on measurements of the thermodynamic temperature of the melting point of gallium, which give the thermodynamic temperature of 302.9160 K, with an overall integration time of 190 h and a combined standard uncertainty of 9.4 mK. The uncertainty analysis indicates that a standard combined uncertainty of 3 mK could be achieved with the new thermometer over an integration period of 1750 h.
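The projected 3 mK figure is consistent with standard noise-thermometry statistics, in which the statistical uncertainty of the measured noise-power ratio falls as the inverse square root of the integration time. A quick check from the reported numbers:

```python
import math

# Johnson noise thermometry: standard uncertainty scales as t**-0.5
# for fixed bandwidth, so project from the reported 9.4 mK at 190 h.
u1, t1 = 9.4e-3, 190.0   # K, hours (reported)
t2 = 1750.0              # hours (projected)
u2 = u1 * math.sqrt(t1 / t2)
print(f"projected uncertainty after {t2:.0f} h: {1000 * u2:.1f} mK")
```

The scaling gives about 3.1 mK at 1750 h, matching the paper's stated 3 mK to within the statistical component (systematic terms would not shrink this way).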

  15. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    In this study, a framework for estimating experimental measurement uncertainties for a Homogeneous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  16. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE PAGES

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    2017-03-28

    In this study, a framework for estimating experimental measurement uncertainties for a Homogeneous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.
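The Monte Carlo propagation step can be sketched by perturbing a pressure trace with gain (calibration) and offset (pegging) errors and observing the spread in gross IMEP. The polytropic toy cycle and error magnitudes below are illustrative assumptions, not data from the LTGC facility.

```python
import numpy as np

rng = np.random.default_rng(3)

def gross_imep(p, v):
    # Gross IMEP: cyclic integral of p dV over the closed compression +
    # expansion path, divided by displaced volume (trapezoidal rule).
    work = 0.5 * np.sum((p[1:] + p[:-1]) * np.diff(v))
    return work / (v.max() - v.min())

# Toy polytropic cycle (illustrative, not a real engine pressure trace):
# compression from BDC at 1 bar, expansion after constant-volume heat release.
v_comp = np.linspace(5e-4, 5e-5, 180)
v_exp = np.linspace(5e-5, 5e-4, 180)
v = np.concatenate([v_comp, v_exp])
p = np.concatenate([1e5 * (5e-4 / v_comp) ** 1.3,
                    5e6 * (5e-5 / v_exp) ** 1.3])

# Monte Carlo over measurement errors: 1% gain error and 2 kPa offset error
# on the pressure trace, propagated to gross IMEP.
imeps = np.array([gross_imep(p * rng.normal(1.0, 0.01) + rng.normal(0.0, 2e3), v)
                  for _ in range(2000)])
print(f"gross IMEP = {imeps.mean()/1e5:.2f} +/- {imeps.std()/1e5:.3f} bar (1-sigma)")
```

Note that the offset error largely cancels over the closed p-dV loop, so the IMEP spread here is dominated by the gain error; this kind of decomposition is exactly what the Monte Carlo propagation makes visible.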

  17. Aiding planning in air traffic control: an experimental investigation of the effects of perceptual information integration.

    PubMed

    Moertl, Peter M; Canning, John M; Gronlund, Scott D; Dougherty, Michael R P; Johansson, Joakim; Mills, Scott H

    2002-01-01

    Prior research examined how controllers plan in their traditional environment and identified various information uncertainties as detriments to planning. A planning aid was designed to reduce this uncertainty by perceptually representing important constraints. This included integrating spatial information on the radar screen with discrete information (planned sequences of air traffic). Previous research reported improved planning performance and decreased workload in the planning aid condition. The purpose of this paper was to determine the source of these performance improvements. Analysis of computer interactions using log-linear modeling showed that the planning interface led to less repetitive, but more integrated, information retrieval compared with the traditional planning environment. Ecological interface design principles helped explain how the integrated information retrieval gave rise to the performance improvements. Actual or potential applications of this research include the design and evaluation of interface automation that keeps users in active control by modification of perceptual task characteristics.

  18. Optimizing Integrated Terminal Airspace Operations Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Bosson, Christabelle; Xue, Min; Zelinski, Shannon

    2014-01-01

    In the terminal airspace, integrated departures and arrivals have the potential to increase operations efficiency. Recent research has developed genetic-algorithm-based schedulers for integrated arrival and departure operations under uncertainty. This paper presents an alternate method using a machine job-shop scheduling formulation to model the integrated airspace operations. A multistage stochastic programming approach is chosen to formulate the problem and candidate solutions are obtained by solving sample average approximation problems with finite sample size. Because approximate solutions are computed, the proposed algorithm incorporates the computation of statistical bounds to estimate the optimality of the candidate solutions. A proof-of-concept study is conducted on a baseline implementation of a simple problem considering a fleet mix of 14 aircraft evolving in a model of the Los Angeles terminal airspace. A more thorough statistical analysis is also performed to evaluate the impact of the number of scenarios considered in the sampled problem. To handle extensive sampling computations, a multithreading technique is introduced.
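The sample average approximation (SAA) idea with statistical bounds can be sketched on a miniature sequencing problem: each batch's sampled optimum averages into a (biased-low) lower-bound estimate, while evaluating the most frequently chosen candidate on a fresh large sample gives an upper bound. The three-aircraft delay model below is an invented stand-in for the job-shop formulation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Three candidate runway sequences for 3 aircraft; cost = total lateness
# against due times, given random operation times (illustrative model).
sequences = [(0, 1, 2), (1, 0, 2), (2, 1, 0)]
due = np.array([2.0, 3.0, 4.0])
mean_t = np.array([1.5, 1.0, 2.0])

def cost(seq, times, due):
    t, c = 0.0, 0.0
    for a in seq:
        t += times[a]
        c += max(0.0, t - due[a])   # positive lateness of aircraft a
    return c

def sample_times(n):
    return mean_t * rng.lognormal(0.0, 0.3, size=(n, 3))

# SAA: solve M sampled problems of N scenarios each.
M, N = 20, 100
batch_best_vals, batch_best_seq = [], []
for _ in range(M):
    times = sample_times(N)
    vals = [np.mean([cost(s, t, due) for t in times]) for s in sequences]
    batch_best_vals.append(min(vals))
    batch_best_seq.append(sequences[int(np.argmin(vals))])

lower = np.mean(batch_best_vals)  # biased-low estimate of the true optimum
candidate = max(set(batch_best_seq), key=batch_best_seq.count)
upper = np.mean([cost(candidate, t, due) for t in sample_times(10_000)])
print(f"candidate {candidate}: optimality gap estimate <= {upper - lower:.3f}")
```

The gap `upper - lower` is the statistical optimality certificate the abstract refers to; in the paper's setting the inner minimization is the multistage job-shop solve rather than an enumeration over three sequences.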

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pennock, Kenneth; Makarov, Yuri V.; Rajagopal, Sankaran

    The need for proactive closed-loop integration of uncertainty information into system operations and probability-based controls is widely recognized, but rarely implemented in system operations. Proactive integration for this project means that the information concerning expected uncertainty ranges for net load and balancing requirements, including required balancing capacity, ramping and ramp duration characteristics, will be fed back into the generation commitment and dispatch algorithms to modify their performance so that potential shortages of these characteristics can be prevented. This basic, yet important, premise is the motivating factor for this project. The achieved project goal is to demonstrate the benefit of such a system. The project quantifies future uncertainties, predicts additional system balancing needs including the prediction intervals for capacity and ramping requirements of future dispatch intervals, evaluates the impacts of uncertainties on transmission including the risk of overloads and voltage problems, and explores opportunities for intra-hour generation adjustments helping to provide more flexibility for system operators. The resulting benefits culminate in more reliable grid operation in the face of increased system uncertainty and variability caused by solar power. The project identifies that solar power does not require special separate penetration level restrictions or penalization for its intermittency. Ultimately, the collective consideration of all sources of intermittency distributed over a wide area unified with the comprehensive evaluation of various elements of balancing process, i.e. capacity, ramping, and energy requirements, help system operators more robustly and effectively balance generation against load and interchange. This project showed that doing so can facilitate more solar and other renewable resources on the grid without compromising reliability and control performance. 
Efforts during the project included developing and integrating advanced probabilistic solar forecasts, including distributed PV forecasts, into closed-loop decision-making processes. Additionally, new uncertainty quantification methods and tools for the direct integration of uncertainty and variability information into grid operations at the transmission and distribution levels were developed and tested. During Phase 1, project work focused heavily on the design, development and demonstration of a set of processes and tools that could reliably and efficiently incorporate solar power into California's grid operations. In Phase 2, connectivity between the ramping analysis tools and market applications software was completed, multiple dispatch scenarios demonstrated a successful reduction of overall uncertainty, an analysis quantified increases in system operation reliability, and the transmission and distribution system uncertainty prediction tool was introduced to system operations engineers in a live webinar. The project met its goals: the experiments showed that the advanced methods and tools, working together, benefit not only the California Independent System Operator but are also transferable to other system operators in the United States.
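The uncertainty-range idea above can be illustrated with a small sketch: derive balancing-capacity and ramping requirements for future dispatch intervals from an ensemble of net-load forecasts. The ensemble model and all numbers below are invented for illustration; an operational system would use calibrated probabilistic forecasts.

```python
import numpy as np

# Synthetic ensemble of net-load forecasts for the next 12 five-minute
# intervals (all numbers invented for illustration).
rng = np.random.default_rng(8)
n_ens, horizon = 500, 12

base = 1000.0 + 50.0 * np.sin(np.linspace(0.0, np.pi, horizon))  # MW trend
ens = base + np.cumsum(rng.normal(0.0, 5.0, (n_ens, horizon)), axis=1)

# Balancing-capacity requirement: 95% uncertainty range of net load per interval.
cap_lo = np.percentile(ens, 2.5, axis=0)
cap_hi = np.percentile(ens, 97.5, axis=0)

# Ramping requirement: tail quantiles of step-to-step net-load changes.
ramps = np.diff(ens, axis=1)
ramp_up_req = np.percentile(ramps, 97.5)   # MW per step the fleet must ramp up
ramp_dn_req = np.percentile(ramps, 2.5)    # and down
```

Feeding these ranges, rather than a single point forecast, back into commitment and dispatch is the closed-loop step the project describes.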

  20. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them separately, item by item from each source, ignoring the combined impact of multi-source uncertainties. To evaluate this combined uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution that evaluates the combined uncertainty of the geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty simulates this propagation, while Bayesian inference accomplishes the uncertainty updating at each modeling step. The maximum entropy principle is effective for estimating the prior probability distribution: it ensures that the prior is subject to the constraints supplied by the given information while introducing minimum prejudice. The resulting posterior distribution represents the combined impact of all the uncertain factors on the spatial structure of the geological model. The framework thus provides both a way to evaluate the combined impact of multi-source uncertainties on a geological model and a means to study the uncertainty propagation mechanism in geological modeling.
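The two key steps, a maximum-entropy prior constrained by given information and a Bayesian update with observed data, can be sketched on a toy one-dimensional problem. The discretized depth variable, the mean constraint, and the Gaussian data error below are all invented for illustration.

```python
import numpy as np

# Hypothetical example: uncertain depth (m) of a geological interface at one
# location, discretized over candidate values. All numbers are illustrative.
depths = np.linspace(100.0, 200.0, 201)

# Maximum-entropy prior subject to a single constraint: E[depth] = 140 m.
# On a discrete support this is p_i ∝ exp(lam * x_i); solve for lam by bisection.
target_mean = 140.0

def mean_for(lam):
    w = np.exp(lam * (depths - depths.mean()))  # shift for numerical stability
    p = w / w.sum()
    return (p * depths).sum()

lo, hi = -1.0, 1.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
w = np.exp(lam * (depths - depths.mean()))
prior = w / w.sum()

# Bayesian update with one noisy borehole observation (Gaussian data error).
obs, sigma = 150.0, 5.0
likelihood = np.exp(-0.5 * ((obs - depths) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()

post_mean = (posterior * depths).sum()
post_sd = np.sqrt((posterior * (depths - post_mean) ** 2).sum())
```

Further data sources (spatial randomness, cognitive constraints) would enter the same way, each multiplying the current distribution, which is the gradual integration the abstract describes.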

  1. Integrative evaluation for sustainable decisions of urban wastewater system management under uncertainty

    NASA Astrophysics Data System (ADS)

    Hadjimichael, A.; Corominas, L.; Comas, J.

    2017-12-01

    With sustainable development as their overarching goal, urban wastewater system (UWS) managers need to take into account multiple social, economic, technical and environmental facets related to their decisions. In this complex decision-making environment, uncertainty can be formidable. It is present both in the stochastic interpretation of the system and in its naturally ever-shifting behavior. This inherent uncertainty suggests that wiser decisions would be made under an adaptive and iterative decision-making regime. No decision-support framework has been presented in the literature that effectively addresses all these needs. The objective of this work is to describe such a conceptual framework to evaluate and compare alternative solutions for various UWS challenges within an adaptive management structure. Socio-economic aspects such as externalities are taken into account, along with other traditional criteria as necessary. Robustness, reliability and resilience analyses test the performance of the system against present and future variability. A valuation uncertainty analysis incorporates uncertain valuation assumptions into the decision-making process. The framework is demonstrated with an application to a case study presenting a typical problem often faced by managers: poor river water quality, increasing population, and more stringent water quality legislation. The application of the framework made use of: i) a cost-benefit analysis including monetized environmental benefits and damages; ii) a robustness analysis of system performance against future conditions; iii) reliability and resilience analyses of the system given contextual variability; and iv) a valuation uncertainty analysis of model parameters. The results suggest that the installation of larger storage volumes would give rise to increased benefits despite larger capital costs, as well as increased robustness and resilience.
Population numbers appear to affect the estimated benefits most, followed by electricity prices and climate change projections. The presented framework is expected to be a valuable tool for the next generation of UWS decision-making and the application demonstrates a novel and valuable integration of metrics and methods for UWS analysis.
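The reliability and resilience analyses mentioned above are commonly computed from a time series of system performance against a standard (in the spirit of Hashimoto et al.'s classic metrics). A minimal sketch with a synthetic dissolved-oxygen series and an assumed threshold:

```python
import numpy as np

# Illustrative reliability/resilience/vulnerability metrics; the threshold
# and the simulated river water-quality series are synthetic.
rng = np.random.default_rng(42)
do_mg_l = 6.0 + rng.normal(0.0, 1.5, size=365)   # simulated dissolved oxygen
threshold = 4.0                                   # water-quality standard

ok = do_mg_l >= threshold            # satisfactory state per day
reliability = ok.mean()              # fraction of time the standard is met

# Resilience: probability that a failure day is followed by a satisfactory day.
fail = ~ok
transitions = fail[:-1] & ok[1:]
resilience = transitions.sum() / fail[:-1].sum() if fail[:-1].sum() else 1.0

# Vulnerability: mean shortfall below the standard during failures.
vulnerability = (threshold - do_mg_l[fail]).mean() if fail.any() else 0.0
```

Comparing these metrics across design alternatives under present and future variability is the robustness step the framework applies.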

  2. Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes

    NASA Astrophysics Data System (ADS)

    Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris

    2017-12-01

    Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer to the question was unequivocal: Kd, a measure of how much light penetrates the lake, dominates sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.
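The GLUE procedure referred to above can be sketched compactly: sample parameter sets from priors, score each with a likelihood measure (Nash-Sutcliffe efficiency here), keep the "behavioral" sets, and derive likelihood-weighted predictive bounds. The toy exponential flux model below stands in for CSLM; the priors and behavioral threshold are illustrative.

```python
import numpy as np

# Minimal GLUE sketch with a toy flux model in place of CSLM.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 50)

def model(a, b):
    return a * np.exp(-b * t)          # stand-in for a lake flux simulation

obs = model(5.0, 0.3) + rng.normal(0.0, 0.2, t.size)   # synthetic "observations"

# 1. Sample parameter sets from uniform priors.
a_s = rng.uniform(1.0, 10.0, 5000)
b_s = rng.uniform(0.05, 1.0, 5000)

# 2. Likelihood measure: Nash-Sutcliffe efficiency for each parameter set.
sims = a_s[:, None] * np.exp(-b_s[:, None] * t[None, :])
sse = ((sims - obs) ** 2).sum(axis=1)
nse = 1.0 - sse / ((obs - obs.mean()) ** 2).sum()

# 3. Keep "behavioral" sets and weight them by rescaled likelihood.
behavioral = nse > 0.8
w = nse[behavioral] - 0.8
w /= w.sum()

# 4. Likelihood-weighted predictive bounds (5th-95th percentile) per time step.
pred = sims[behavioral]
order = np.argsort(pred, axis=0)
lower = np.empty(t.size)
upper = np.empty(t.size)
for j in range(t.size):
    srt = order[:, j]
    cw = np.cumsum(w[srt])
    vals = pred[srt, j]
    lower[j] = vals[np.searchsorted(cw, 0.05)]
    upper[j] = vals[min(np.searchsorted(cw, 0.95), vals.size - 1)]
```

The spread of the behavioral sets is also what exposes equifinality: many distinct (a, b) pairs can score nearly equally well.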

  3. How much swamp are we talking here?: Propagating uncertainty about the area of coastal wetlands into the U.S. greenhouse gas inventory

    NASA Astrophysics Data System (ADS)

    Holmquist, J. R.; Crooks, S.; Windham-Myers, L.; Megonigal, P.; Weller, D.; Lu, M.; Bernal, B.; Byrd, K. B.; Morris, J. T.; Troxler, T.; McCombs, J.; Herold, N.

    2017-12-01

    Stable coastal wetlands can store substantial amounts of carbon (C) that can be released when they are degraded or eroded. The EPA recently incorporated coastal wetland net storage and emissions within the Agriculture, Forestry and Other Land Uses category of the U.S. National Greenhouse Gas Inventory (NGGI). This was a seminal analysis, but its quantification of uncertainty needs improvement. We provide a value-added analysis by estimating that uncertainty, focusing initially on the most basic assumption, the area of coastal wetlands. We considered three sources: uncertainty in the areas of vegetation and salinity subclasses, uncertainty in the areas of changing or stable wetlands, and uncertainty in the inland extent of coastal wetlands. The areas of vegetation and salinity subtypes, as well as of stable or changing wetlands, were estimated from 2006 and 2010 maps derived from Landsat imagery by the Coastal Change Analysis Program (C-CAP). We generated unbiased area estimates and confidence intervals for C-CAP, taking into account mapped area, proportional areas of commission and omission errors, as well as the number of observations. We defined the inland extent of wetlands as all land below the current elevation of twice-monthly highest tides. We generated probabilistic inundation maps integrating wetland-specific bias and random error in light detection and ranging (lidar) elevation maps, with the spatially explicit random error in tidal surfaces generated from tide gauges. This initial uncertainty analysis will be extended to calculate total propagated uncertainty in the NGGI by including the uncertainties in the amount of C lost from eroded and degraded wetlands, stored annually in stable wetlands, and emitted in the form of methane by tidal freshwater wetlands.
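The unbiased area estimates with confidence intervals described above typically follow a stratified estimator built from the map's confusion matrix (mapped-class area proportions combined with commission/omission error rates). A sketch with an invented two-class matrix and areas:

```python
import numpy as np

# Error-adjusted area estimate from a confusion matrix; the matrix and areas
# are invented for illustration (rows = mapped class, cols = reference class).
conf = np.array([[97, 3],      # mapped wetland: 97 wetland, 3 other in reference
                 [8, 92]])     # mapped other:    8 wetland, 92 other
mapped_area = np.array([1200.0, 8800.0])   # km^2 per mapped class

W = mapped_area / mapped_area.sum()        # mapped area proportions
n_i = conf.sum(axis=1, keepdims=True)
p = conf / n_i                             # row-normalized error rates

# Error-adjusted proportion and area of each reference class.
p_ref = (W[:, None] * p).sum(axis=0)
area_ref = p_ref * mapped_area.sum()

# Standard error of the estimated proportions (stratified estimator),
# then an approximate 95% confidence interval on the areas.
se_p = np.sqrt(((W[:, None] ** 2) * p * (1 - p) / (n_i - 1)).sum(axis=0))
ci95 = 1.96 * se_p * mapped_area.sum()
```

In this invented example the omission errors pull the wetland area estimate well above its mapped 1200 km², which is exactly the kind of adjustment the C-CAP analysis quantifies.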

  4. Impact of national context and culture on curriculum change: a case study.

    PubMed

    Jippes, Mariëlle; Driessen, Erik W; Majoor, Gerard D; Gijselaers, Wim H; Muijtjens, Arno M M; van der Vleuten, Cees P M

    2013-08-01

    Earlier studies suggested national culture to be a potential barrier to curriculum reform in medical schools. In particular, Hofstede's cultural dimension 'uncertainty avoidance' had a significant negative relationship with the implementation rate of integrated curricula. However, some schools succeeded in adopting curriculum changes despite their country's strong uncertainty avoidance. This raised the question: 'How did those schools overcome the barrier of uncertainty avoidance?' Austria offered the combination of a high uncertainty avoidance score and integrated curricula in all its medical schools. Twenty-seven key change agents in four medical universities were interviewed and transcripts analysed using thematic cross-case analysis. Initially, strict national laws and limited autonomy of schools inhibited innovation and fostered an 'excuse culture': 'It's not our fault. It is the ministry's'. A new law increasing university autonomy stimulated reforms. However, this law alone would have been insufficient, as many faculty still sought to avoid change. A strong need for change, supportive and continuous leadership, and visionary change agents were also deemed essential. In societies with strong uncertainty avoidance, strict legislation may enforce resistance to curriculum change. In those countries, opposition by faculty can be overcome if national legislation encourages change, provided additional internal factors support the change process.

  5. Valuing flexibilities in the design of urban water management systems.

    PubMed

    Deng, Yinghan; Cardin, Michel-Alexandre; Babovic, Vladan; Santhanakrishnan, Deepak; Schmitter, Petra; Meshgi, Ali

    2013-12-15

    Climate change and rapid urbanization require decision-makers to develop long-term forward assessments of sustainable urban water management projects. This is further complicated by the difficulties of assessing sustainable designs and various design scenarios from an economic standpoint. A conventional valuation approach for urban water management projects, like Discounted Cash Flow (DCF) analysis, fails to incorporate uncertainties, such as the amount of rainfall, the unit cost of water, and other uncertainties associated with future changes in technological domains. Such an approach also fails to include the value of flexibility, which enables managers to adapt and reconfigure systems over time as uncertainty unfolds. This work describes an integrated framework to value investments in urban water management systems under uncertainty. It also extends the conventional DCF analysis through explicit considerations of flexibility in systems design and management. The approach incorporates flexibility as intelligent decision-making mechanisms that enable systems to avoid future downside risks and increase opportunities for upside gains over a range of possible futures. A water catchment area in Singapore was chosen to assess the value of a flexible extension of standard drainage canals and a flexible deployment of a novel water catchment technology based on green roofs and porous pavements. Results show that integrating uncertainty and flexibility explicitly into the decision-making process can reduce initial capital expenditure, improve value for investment, and enable decision-makers to learn more about system requirements during the lifetime of the project. Copyright © 2013 Elsevier Ltd. All rights reserved.
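The value-of-flexibility argument can be sketched with a toy Monte Carlo comparison: a rigid design sized up front for a high-percentile future versus a flexible design that starts small and expands only when observed conditions approach capacity. All costs, the runoff process, and the expansion rule below are invented; the point is only that deferring capital until uncertainty resolves can raise expected value.

```python
import numpy as np

rng = np.random.default_rng(7)
years, rate, n_sims = 20, 0.05, 10000
disc = (1.0 + rate) ** -np.arange(1, years + 1)

# Uncertain annual peak runoff (geometric random walk, arbitrary units).
growth = rng.normal(0.03, 0.08, size=(n_sims, years))
runoff = 100.0 * np.cumprod(1.0 + growth, axis=1)

def npv(capacity, capex_t0, expand_cost):
    """NPV of a design: fixed annual service value minus flood damage."""
    damage = np.maximum(runoff - capacity, 0.0) * 2.0  # cost per unit excess
    benefit = 30.0 - damage
    return -capex_t0 - expand_cost + (benefit * disc).sum(axis=1)

# Rigid design: size now for the 90th-percentile year-20 runoff.
big = np.quantile(runoff[:, -1], 0.9)
npv_rigid = npv(big, capex_t0=300.0, expand_cost=0.0)

# Flexible design: start small; expand to `big` the first year runoff
# exceeds 80% of the small capacity, paying a (discounted) expansion cost.
small = 160.0
cap = np.full((n_sims, years), small)
expand_cost = np.zeros(n_sims)
for s in range(n_sims):
    trigger = runoff[s] > 0.8 * small
    if trigger.any():
        hit = int(np.argmax(trigger))
        cap[s, hit:] = big
        expand_cost[s] = 220.0 * disc[hit]
npv_flex = npv(cap, capex_t0=150.0, expand_cost=expand_cost)
```

Across simulations the flexible design pays the expansion cost late (discounted) and only on paths that need it, which is the downside-avoidance/upside-capture mechanism the abstract describes.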

  6. On-orbit servicing system assessment and optimization methods based on lifecycle simulation under mixed aleatory and epistemic uncertainties

    NASA Astrophysics Data System (ADS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; van Tooren, Michel

    2013-06-01

    To assess the on-orbit servicing (OOS) paradigm and optimize its utilities by taking advantage of its inherent flexibility and responsiveness, the OOS system assessment and optimization methods based on lifecycle simulation under uncertainties are studied. The uncertainty sources considered in this paper include both the aleatory (random launch/OOS operation failure and on-orbit component failure) and the epistemic (the unknown trend of the end-user market price) types. Firstly, the lifecycle simulation under uncertainties is discussed. The chronological flowchart is presented. The cost and benefit models are established, and the uncertainties thereof are modeled. The dynamic programming method to make optimal decision in face of the uncertain events is introduced. Secondly, the method to analyze the propagation effects of the uncertainties on the OOS utilities is studied. With combined probability and evidence theory, a Monte Carlo lifecycle Simulation based Unified Uncertainty Analysis (MCS-UUA) approach is proposed, based on which the OOS utility assessment tool under mixed uncertainties is developed. Thirdly, to further optimize the OOS system under mixed uncertainties, the reliability-based optimization (RBO) method is studied. To alleviate the computational burden of the traditional RBO method which involves nested optimum search and uncertainty analysis, the framework of Sequential Optimization and Mixed Uncertainty Analysis (SOMUA) is employed to integrate MCS-UUA, and the RBO algorithm SOMUA-MCS is developed. Fourthly, a case study on the OOS system for a hypothetical GEO commercial communication satellite is investigated with the proposed assessment tool. Furthermore, the OOS system is optimized with SOMUA-MCS. Lastly, some conclusions are given and future research prospects are highlighted.
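A common way to propagate such mixed uncertainty, consistent in spirit with the combined probability/evidence treatment above, is a two-loop simulation: an outer sweep over the epistemic quantity (known only as an interval) and an inner Monte Carlo over the aleatory events. The toy utility model and all numbers below are illustrative only.

```python
import numpy as np

# Two-loop (nested) sketch of mixed uncertainty propagation: outer loop scans
# an epistemic interval (uncertain market-price drift), inner loop does Monte
# Carlo over aleatory events (random servicing failures).
rng = np.random.default_rng(1)
n_outer, n_inner, years = 21, 20000, 10

p_fail = 0.05                       # aleatory: annual operation failure prob
drift_interval = (-0.02, 0.04)      # epistemic: price drift known only as a range

mean_utils = []
for drift in np.linspace(*drift_interval, n_outer):      # outer: epistemic sweep
    price = 10.0 * (1.0 + drift) ** np.arange(years)     # deterministic given drift
    fails = rng.random((n_inner, years)) < p_fail        # inner: aleatory draws
    alive = np.cumprod(~fails, axis=1)                   # service stops at failure
    utility = (alive * price).sum(axis=1) - 50.0         # revenue minus fixed cost
    mean_utils.append(utility.mean())

# Epistemic uncertainty leaves an interval of expected utilities rather than
# a single number; decisions can be screened against its worst/best case.
lo_util, hi_util = min(mean_utils), max(mean_utils)
```

The resulting [lo_util, hi_util] interval plays the role of the belief/plausibility bounds that evidence theory attaches to an epistemic quantity.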

  7. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping.

    PubMed

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-03-04

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster-Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely the analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster-Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-1D satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA revealed better performance in the reliability assessment, and the WLC operation yielded poor results.
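Two of the MCDA building blocks named above, AHP and WLC, can be sketched in a few lines: AHP weights come from the principal eigenvector of a pairwise-comparison matrix (with a consistency check), and WLC then combines normalized criterion layers with those weights. The comparison matrix and the criterion values for three cells below are invented.

```python
import numpy as np

# Sketch of the AHP-weights-into-WLC step on three invented criteria
# (e.g., slope, lithology, land cover) for a handful of grid cells.
A = np.array([[1.0, 3.0, 5.0],       # pairwise comparisons (Saaty scale)
              [1/3., 1.0, 3.0],
              [1/5., 1/3., 1.0]])

# AHP weights = principal eigenvector of the comparison matrix.
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()

# Consistency ratio (random index RI = 0.58 for n = 3); CR < 0.1 is
# conventionally acceptable.
n = A.shape[0]
CI = (vals.real.max() - n) / (n - 1)
CR = CI / 0.58

# WLC: susceptibility score = weighted sum of normalized criterion layers.
cells = np.array([[0.9, 0.4, 0.7],   # each row: one cell's scores in [0, 1]
                  [0.2, 0.8, 0.1],
                  [0.6, 0.6, 0.6]])
susceptibility = cells @ w
```

The Monte Carlo step described in the abstract would repeat this with weights drawn from probability density functions instead of the fixed eigenvector.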

  8. Application of a Cognitive Model for Army Training: Handbook for Strategic Intelligence Analysis

    DTIC Science & Technology

    1984-10-01

    sources of uncertainty. By reducing uncertainty, information, in accordance with consumers of ITAC products are able to requirements validated by the...plan would have to be designed and integrated with these * Know the environment. goals in mind. According to the SBDP, ...Soviet...political climate, and the cultural history each of which can revise and append of the country or countries implicated in the message according to

  9. Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens

    PubMed Central

    Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.

    2012-01-01

    This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. 
An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
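The Monte Carlo input-output correlation step can be sketched generically: sample the uncertain inputs, run the model, and rank inputs by the magnitude of their rank (Spearman) correlation with the output. The toy inhalation-dose model and parameter ranges below are invented, not those of the paper.

```python
import numpy as np

# Rank inputs of a toy exposure model by rank correlation with its output.
rng = np.random.default_rng(3)
n = 5000

inputs = {
    "air_conc":    rng.lognormal(0.0, 1.0, n),   # pathogen air concentration
    "breath_rate": rng.uniform(0.5, 1.5, n),     # m^3/h
    "decay":       rng.uniform(0.1, 2.0, n),     # environmental attenuation, 1/h
}
# Toy model: inhaled dose over an 8-hour exposure with exponential decay.
dose = (inputs["air_conc"] * inputs["breath_rate"]
        * (1.0 - np.exp(-inputs["decay"] * 8.0)) / inputs["decay"])

def rank(x):
    return np.argsort(np.argsort(x))

# Spearman rank correlation of each input with the dose output.
sens = {name: np.corrcoef(rank(v), rank(dose))[0, 1]
        for name, v in inputs.items()}
ranking = sorted(sens, key=lambda name: -abs(sens[name]))
```

Inputs at the top of the ranking are the parameter uncertainties most worth reducing, which is how such correlations inform research priorities.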

  10. An Integrated Uncertainty Analysis and Ensemble-based Data Assimilation Framework for Operational Snow Predictions

    NASA Astrophysics Data System (ADS)

    He, M.; Hogue, T. S.; Franz, K.; Margulis, S. A.; Vrugt, J. A.

    2009-12-01

    The National Weather Service (NWS), the agency responsible for short- and long-term streamflow predictions across the nation, primarily applies the SNOW17 model for operational forecasting of snow accumulation and melt. The SNOW17-forecasted snowmelt serves as an input to a rainfall-runoff model for streamflow forecasts in snow-dominated areas. The accuracy of streamflow predictions in these areas largely relies on the accuracy of snowmelt. However, no direct snowmelt measurements are available to validate the SNOW17 predictions. Instead, indirect measurements such as snow water equivalent (SWE) measurements or discharge are typically used to calibrate SNOW17 parameters. In addition, the forecast practice is inherently deterministic, lacking tools to systematically address forecasting uncertainties (e.g., uncertainties in parameters, forcing, SWE and discharge observations, etc.). The current research presents an Integrated Uncertainty analysis and Ensemble-based data Assimilation (IUEA) framework to improve predictions of snowmelt and discharge while simultaneously providing meaningful estimates of the associated uncertainty. The IUEA approach uses the recently developed DiffeRential Evolution Adaptive Metropolis (DREAM) to simultaneously estimate uncertainties in model parameters, forcing, and observations. The robustness and usefulness of the IUEA-SNOW17 framework are evaluated for snow-dominated watersheds in the northern Sierra Mountains, using the coupled IUEA-SNOW17 and an operational soil moisture accounting model (SAC-SMA). Preliminary results are promising and indicate successful performance of the coupled IUEA-SNOW17 framework. Implementation of the SNOW17 with the IUEA is straightforward and requires no major modification to the SNOW17 model structure. The IUEA-SNOW17 framework is intended to be modular and transferable and should assist the NWS in advancing the current forecasting system and reinforcing current operational forecasting skill.
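While the paper's framework is built on DREAM, the flavor of an ensemble-based assimilation step can be conveyed with a generic (EnKF-style) update of a snow water equivalent state against one observation; the snow "model" and all numbers below are illustrative, not the paper's.

```python
import numpy as np

# Generic ensemble data-assimilation update as a stand-in for the ensemble
# machinery described above.
rng = np.random.default_rng(11)
n_ens = 200

# Forecast ensemble of snow water equivalent (mm); the spread represents
# uncertainty from forcing and parameters.
swe_f = rng.normal(120.0, 20.0, n_ens)

# One SWE observation with known error; perturb it per ensemble member.
obs, obs_sd = 100.0, 10.0
obs_pert = obs + rng.normal(0.0, obs_sd, n_ens)

# Kalman gain from ensemble statistics (state observed directly, H = I).
var_f = swe_f.var(ddof=1)
K = var_f / (var_f + obs_sd ** 2)
swe_a = swe_f + K * (obs_pert - swe_f)   # analysis ensemble
```

The analysis ensemble both shifts toward the observation and tightens its spread, which is exactly the dual goal (better prediction plus a meaningful uncertainty estimate) the IUEA framework pursues.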

  11. The Multi-Sensor Aerosol Products Sampling System (MAPSS) for Integrated Analysis of Satellite Retrieval Uncertainties

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Petrenko, Maksym; Leptoukh, Gregory

    2010-01-01

    Among the known atmospheric constituents, aerosols represent the greatest uncertainty in climate research. Although satellite-based aerosol retrieval has practically become routine, especially during the last decade, there is often disagreement between similar aerosol parameters retrieved from different sensors, leaving users confused as to which sensors to trust for answering important science questions about the distribution, properties, and impacts of aerosols. As long as there is no consensus and the inconsistencies are not well characterized and understood, there will be no way of developing reliable climate data records from satellite aerosol measurements. Fortunately, the most globally representative well-calibrated ground-based aerosol measurements corresponding to the satellite-retrieved products are available from the Aerosol Robotic Network (AERONET). To adequately utilize the advantages offered by this vital resource, an online Multi-sensor Aerosol Products Sampling System (MAPSS) was recently developed. The aim of MAPSS is to facilitate detailed comparative analysis of satellite aerosol measurements from different sensors (Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP) based on the collocation of these data products over AERONET stations. In this presentation, we will describe the strategy of the MAPSS system, its potential advantages for the aerosol community, and the preliminary results of an integrated comparative uncertainty analysis of aerosol products from multiple satellite sensors.

  12. Hybrid Gibbs Sampling and MCMC for CMB Analysis at Small Angular Scales

    NASA Technical Reports Server (NTRS)

    Jewell, Jeffrey B.; Eriksen, H. K.; Wandelt, B. D.; Gorski, K. M.; Huey, G.; O'Dwyer, I. J.; Dickinson, C.; Banday, A. J.; Lawrence, C. R.

    2008-01-01

    A) Gibbs Sampling has now been validated as an efficient, statistically exact, and practically useful method for "low-L" (as demonstrated on WMAP temperature polarization data). B) We are extending Gibbs sampling to directly propagate uncertainties in both foreground and instrument models to total uncertainty in cosmological parameters for the entire range of angular scales relevant for Planck. C) Made possible by inclusion of foreground model parameters in Gibbs sampling and hybrid MCMC and Gibbs sampling for the low signal to noise (high-L) regime. D) Future items to be included in the Bayesian framework include: 1) Integration with Hybrid Likelihood (or posterior) code for cosmological parameters; 2) Include other uncertainties in instrumental systematics? (I.e. beam uncertainties, noise estimation, calibration errors, other).
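The core Gibbs-sampling move, drawing each parameter in turn from its conditional distribution given the others, can be illustrated on a toy two-parameter problem with a bivariate normal posterior (correlation 0.8); the CMB application is of course far more elaborate.

```python
import numpy as np

# Minimal Gibbs sampler for a standard bivariate normal with correlation rho.
rng = np.random.default_rng(0)
rho = 0.8
n_samp, burn = 20000, 1000

x = np.zeros(n_samp)
y = np.zeros(n_samp)
for i in range(1, n_samp):
    # Alternate conditional draws: x | y ~ N(rho * y, 1 - rho^2),
    # and symmetrically for y | x.
    x[i] = rng.normal(rho * y[i - 1], np.sqrt(1 - rho ** 2))
    y[i] = rng.normal(rho * x[i], np.sqrt(1 - rho ** 2))

xs, ys = x[burn:], y[burn:]
emp_rho = np.corrcoef(xs, ys)[0, 1]   # recovers rho from the chain
```

In the CMB setting the conditionals are over sky signal, foreground parameters, and power spectrum rather than two scalars, but the alternating structure is the same.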

  13. "I'm not abusing or anything": patient-physician communication about opioid treatment in chronic pain.

    PubMed

    Matthias, Marianne S; Krebs, Erin E; Collins, Linda A; Bergman, Alicia A; Coffing, Jessica; Bair, Matthew J

    2013-11-01

    To characterize clinical communication about opioids through direct analysis of clinic visits and in-depth interviews with patients. This was a pilot study of 30 patients with chronic pain, who were audio-recorded in their primary care visits and interviewed after the visit about their pain care and relationship with their physicians. Emergent thematic analysis guided data interpretation. Uncertainties about opioid treatment for chronic pain, particularly addiction and misuse, play an important role in communicating about pain treatment. Three patterns of responding to uncertainty emerged in conversations between patients and physicians: reassurance, avoiding opioids, and gathering additional information. Results are interpreted within the framework of Problematic Integration theory. Although it is well-established that opioid treatment for chronic pain poses numerous uncertainties, little is known about how patients and their physicians navigate these uncertainties. This study illuminates ways in which patients and physicians face uncertainty communicatively and collaboratively. Acknowledging and confronting the uncertainties inherent in chronic opioid treatment are critical communication skills for patients taking opioids and their physicians. Many of the communication behaviors documented in this study may serve as a model for training patients and physicians to communicate effectively about opioids. Published by Elsevier Ireland Ltd.

  14. Integration of hydrologic and water allocation models in basin-scale water resources management considering crop pattern and climate change: Karkheh River Basin in Iran

    USDA-ARS?s Scientific Manuscript database

    The paradigm of integrated water resources management requires coupled analysis of hydrology and water resources in a river basin. Population growth and uncertainties due to climate change make historic data not a reliable source of information for future planning of water resources, hence necessit...

  15. Measuring and explaining eco-efficiencies of wastewater treatment plants in China: An uncertainty analysis perspective.

    PubMed

    Dong, Xin; Zhang, Xinyi; Zeng, Siyu

    2017-04-01

    In the context of sustainable development, there has been an increasing requirement for an eco-efficiency assessment of wastewater treatment plants (WWTPs). Data envelopment analysis (DEA), a technique that is widely applied for relative efficiency assessment, is used in combination with the tolerances approach to handle WWTPs' multiple inputs and outputs as well as their uncertainty. The economic cost, energy consumption, contaminant removal, and global warming effect during the treatment processes are integrated to interpret the eco-efficiency of WWTPs. A total of 736 sample plants from across China are assessed, and large sensitivities to variations in inputs and outputs are observed for most samples, with only three WWTPs identified as being stably efficient. Size of plant, overcapacity, climate type, and influent characteristics are proven to have a significant influence on both the mean efficiency and performance sensitivity of WWTPs, while no clear relationships were found between eco-efficiency and technology under the framework of uncertainty analysis. The incorporation of uncertainty quantification and environmental impact consideration has improved the reliability and applicability of the assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.
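The DEA building block behind such eco-efficiency scores is a small linear program per plant. Below is a sketch of the standard input-oriented CCR model (not the paper's tolerances extension) on invented data for four plants, using two inputs (cost, energy) and one output (contaminant load removed).

```python
import numpy as np
from scipy.optimize import linprog

# Invented plant data: rows are plants.
X = np.array([[100.0, 50.0],     # inputs per plant: cost, energy
              [120.0, 40.0],
              [ 90.0, 70.0],
              [200.0, 80.0]])
Y = np.array([[500.0],           # output per plant: contaminant load removed
              [480.0],
              [520.0],
              [510.0]])
n = X.shape[0]

def ccr_efficiency(o):
    # Variables: [theta, lambda_1..lambda_n]; minimize theta such that a
    # composite of peers uses at most theta * inputs of plant o while
    # producing at least its output.
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[o][:, None], X.T]          # sum_j l_j x_j <= theta * x_o
    b_in = np.zeros(X.shape[1])
    A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]   # sum_j l_j y_j >= y_o
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

scores = np.array([ccr_efficiency(o) for o in range(n)])
```

A score of 1 means the plant lies on the efficient frontier; the fourth plant here is dominated by a mix of the first two. The tolerances approach in the paper then asks how stable such scores are when inputs and outputs are perturbed.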

  16. Strategic Technology Investment Analysis: An Integrated System Approach

    NASA Technical Reports Server (NTRS)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results are satisfying the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  17. Bayesian tomography and integrated data analysis in fusion diagnostics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong, E-mail: lid@swip.ac.cn; Dong, Y. B.; Deng, Wei

    2016-11-15

    In this article, a Bayesian tomography method using a non-stationary Gaussian process as a prior has been introduced. The Bayesian formalism allows quantities which bear uncertainty to be expressed in probabilistic form, so that the uncertainty of a final solution can be fully resolved from the confidence interval of a posterior probability. Moreover, a consistency check of that solution can be performed by checking whether the misfits between predicted and measured data are reasonably within an assumed data error. In particular, the accuracy of reconstructions is significantly improved by using the non-stationary Gaussian process, which can adapt to the varying smoothness of the emission distribution. The implementation of this method on a soft X-ray diagnostic on HL-2A has been used to explore relevant physics in equilibrium and MHD instability modes. This project is carried out within a large-scale inference framework, aiming at an integrated analysis of heterogeneous diagnostics.
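The Gaussian-process idea, a prior over profiles whose posterior delivers both a reconstruction and its confidence interval, can be sketched in one dimension. For brevity the kernel below is stationary, unlike the paper's non-stationary one, and the "data" are synthetic stand-ins rather than line integrals.

```python
import numpy as np

# Toy 1-D GP regression: posterior mean and pointwise uncertainty.
rng = np.random.default_rng(5)

def kernel(a, b, ell=0.3, amp=1.0):
    # Squared-exponential covariance (stationary, for brevity).
    return amp * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

x_obs = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
y_obs = np.sin(2 * np.pi * x_obs) + rng.normal(0.0, 0.05, x_obs.size)
noise = 0.05

x_grid = np.linspace(0.0, 1.0, 50)
K = kernel(x_obs, x_obs) + noise ** 2 * np.eye(x_obs.size)
Ks = kernel(x_grid, x_obs)
alpha = np.linalg.solve(K, y_obs)

mean = Ks @ alpha                                   # posterior mean profile
cov = kernel(x_grid, x_grid) - Ks @ np.linalg.solve(K, Ks.T)
sd = np.sqrt(np.clip(np.diag(cov), 0.0, None))      # pointwise uncertainty
```

A non-stationary kernel would let the length scale `ell` vary with position, so the posterior can be smooth in quiet regions while following sharp gradients elsewhere, which is the improvement the abstract highlights.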

  18. Uncertainty and risk in wildland fire management: a review.

    PubMed

    Thompson, Matthew P; Calkin, Dave E

    2011-08-01

    Wildland fire management is subject to manifold sources of uncertainty. Beyond the unpredictability of wildfire behavior, uncertainty stems from inaccurate/missing data, limited resource value measures to guide prioritization across fires and resources at risk, and an incomplete scientific understanding of ecological response to fire, of fire behavior response to treatments, and of spatiotemporal dynamics involving disturbance regimes and climate change. This work attempts to systematically align sources of uncertainty with the most appropriate decision support methodologies, in order to facilitate cost-effective, risk-based wildfire planning efforts. We review the state of wildfire risk assessment and management, with a specific focus on uncertainties challenging implementation of integrated risk assessments that consider a suite of human and ecological values. Recent advances in wildfire simulation and geospatial mapping of highly valued resources have enabled robust risk-based analyses to inform planning across a variety of scales, although improvements are needed in fire behavior and ignition occurrence models. A key remaining challenge is a better characterization of non-market resources at risk, both in terms of their response to fire and how society values those resources. Our findings echo earlier literature identifying wildfire effects analysis and value uncertainty as the primary challenges to integrated wildfire risk assessment and wildfire management. We stress the importance of identifying and characterizing uncertainties in order to better quantify and manage them. Leveraging the most appropriate decision support tools can facilitate wildfire risk assessment and ideally improve decision-making. Published by Elsevier Ltd.

  19. Towards Improved Understanding of the Applicability of Uncertainty Forecasts in the Electric Power Industry

    DOE PAGES

    Bessa, Ricardo; Möhrlen, Corinna; Fundel, Vanessa; ...

    2017-09-14

    Around the world, wind energy is becoming a major energy provider in electricity markets, as well as participating in ancillary services markets to help maintain grid stability. The reliability of system operations and the smooth integration of wind energy into electricity markets have been strongly supported by years of improvement in weather and wind power forecasting systems. Deterministic forecasts are still predominant in utility practice, although truly optimal decisions and risk hedging are only possible with the adoption of uncertainty forecasts. One of the main barriers to the industrial adoption of uncertainty forecasts is the lack of understanding of their information content (e.g., their physical and statistical modeling) and the lack of standardization of uncertainty forecast products, which frequently leads to mistrust of uncertainty forecasts and their applicability in practice. Our paper aims at improving this understanding by establishing a common terminology and reviewing the methods to determine, estimate, and communicate the uncertainty in weather and wind power forecasts. This conceptual analysis of the state of the art highlights that: (i) end-users should start to look at the forecast's properties in order to map different uncertainty representations to specific wind energy-related user requirements; (ii) a multidisciplinary team is required to foster the integration of stochastic methods in the industry sector. Furthermore, a set of recommendations for standardization and improved training of operators is provided, along with examples of best practices.

  1. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    PubMed

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed within the FP7 EU project 4FUN in order to provide an integrated assessment tool for state-of-the-art exposure assessment for the environment, biota and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media, and of physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in the human body; MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches; the availability of such tools for uncertainty and sensitivity analysis is intended to facilitate the incorporation of these issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non-biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), wildlife biota (primary producers in rivers, invertebrates, fish) and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the modelling predictions through comparison with actual measurements. Copyright © 2016 Elsevier B.V. All rights reserved.
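The PBPK side of such a tool can be illustrated with a deliberately minimal sketch: a one-compartment model with an uncertain elimination rate, propagated by Monte Carlo sampling. The model form, parameter values, and lognormal spread below are illustrative assumptions, not MERLIN-Expo's actual models:

```python
import numpy as np

rng = np.random.default_rng(1)

def concentration(dose_rate, k_elim, volume, t_end=100.0, dt=0.1):
    """Euler integration of a one-compartment model:
    dC/dt = dose_rate / volume - k_elim * C  (illustrative only)."""
    c = 0.0
    for _ in range(int(t_end / dt)):
        c += dt * (dose_rate / volume - k_elim * c)
    return c

# Uncertainty analysis: sample the elimination rate (lognormal spread assumed)
# and propagate it to the internal concentration at the end of the simulation.
k_samples = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=2000)
c_end = np.array([concentration(1.0, k, volume=40.0) for k in k_samples])

# Steady state is dose_rate / (volume * k); t_end=100 reaches it for median k.
print(np.percentile(c_end, [5, 50, 95]))
```

The percentile summary is the kind of screening-level uncertainty output the tool is described as producing; variance-based sensitivity analysis would partition the output spread among several uncertain inputs instead of one.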

  2. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty, including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The simulations are challenged by the distributed sources in each of the Corrective Action Units, by complex mass transfer processes, and by the size and complexity of the field-scale flow models. An efficient methodology utilizing particle tracking results and convolution integrals provides in situ concentrations appropriate for Monte Carlo analysis. Uncertainty in source releases and transport parameters, including effective porosity, fracture apertures and spacing, matrix diffusion coefficients, sorption coefficients, and colloid load and mobility, is considered. With the distributions of input uncertainties and output plume volumes, global analysis methods including stepwise regression, contingency table analysis, and classification tree analysis are used to develop sensitivity rankings of parameter uncertainties for each model considered, thus assisting a variety of decisions.
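The sensitivity-ranking step described above (regression over Monte Carlo inputs and outputs) can be sketched with standardized regression coefficients, a common simplification of stepwise regression; the three "transport parameters" and the toy response below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical Monte Carlo ensemble: three uncertain transport parameters.
n = 5000
porosity  = rng.uniform(0.01, 0.30, n)
diffusion = rng.uniform(1e-11, 1e-9, n)
sorption  = rng.uniform(0.0, 5.0, n)

# Stand-in response: plume volume dominated by porosity in this toy model.
plume = 1.0 / porosity + 1e9 * diffusion + 0.1 * sorption + rng.normal(0, 0.5, n)

# Standardized regression coefficients: regress z-scored output on z-scored inputs.
X = np.column_stack([porosity, diffusion, sorption])
Xz = (X - X.mean(0)) / X.std(0)
yz = (plume - plume.mean()) / plume.std()
src, *_ = np.linalg.lstsq(Xz, yz, rcond=None)

ranking = np.argsort(-np.abs(src))    # most influential parameter first
print(ranking)
```

With the toy response above, porosity (index 0) should rank first; in the actual analysis the same ranking would be produced per alternative model, alongside contingency-table and classification-tree measures.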

  3. Can the EVIDEM Framework Tackle Issues Raised by Evaluating Treatments for Rare Diseases: Analysis of Issues and Policies, and Context-Specific Adaptation.

    PubMed

    Wagner, Monika; Khoury, Hanane; Willet, Jacob; Rindress, Donna; Goetghebeur, Mireille

    2016-03-01

    The multiplicity of issues, including uncertainty and ethical dilemmas, and of policies involved in appraising interventions for rare diseases suggests that multicriteria decision analysis (MCDA) based on a holistic definition of value is uniquely suited for this purpose. The objective of this study was to analyze and further develop a comprehensive MCDA framework (EVIDEM) to address rare disease issues and policies, while maintaining its applicability across disease areas. Specific issues and policies for rare diseases were identified through literature review. The ethical and methodological foundations of the EVIDEM framework v3.0 were systematically analyzed from the perspective of these issues and policies, and modifications of the framework were made accordingly to ensure their integration. The analysis showed that the framework integrates ethical dilemmas and issues inherent to appraising interventions for rare diseases but required further integration of specific aspects. Modifications thus included the addition of subcriteria to further differentiate disease severity, disease-specific treatment outcomes, and economic consequences of interventions for rare diseases. Scoring scales were further developed to include negative scales for all comparative criteria. A methodology was established to incorporate context-specific population priorities and policies, such as those for rare diseases, into the quantitative part of the framework. This design allows more explicit trade-offs between competing ethical positions of fairness (prioritization of those who are worst off), the goal of benefiting as many people as possible, the imperative to help, and wise use of knowledge and resources. It also allows addressing variability in institutional policies regarding prioritization of specific disease areas, in addition to the existing uncertainty analysis available in EVIDEM. The adapted framework measures value in its widest sense, while being responsive to rare disease issues and policies. It provides an operationalizable platform to integrate values, competing ethical dilemmas, and uncertainty in appraising healthcare interventions.
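The quantitative core of a weighted-sum MCDA framework of this kind is small enough to sketch directly. The criteria, weights, and the -5..+5 comparative scale below are illustrative stand-ins, not EVIDEM's actual criteria set:

```python
# Hypothetical MCDA value estimate in the spirit of weighted-sum frameworks
# such as EVIDEM (criterion names and numbers are illustrative only).
criteria = {
    # criterion: (normalized weight, score on a -5..+5 comparative scale)
    "disease_severity":          (0.25,  4),
    "comparative_effectiveness": (0.30,  3),
    "safety":                    (0.20, -1),  # negative scale: worse than comparator
    "economic_consequences":     (0.25, -2),
}

weights = [w for w, _ in criteria.values()]
assert abs(sum(weights) - 1.0) < 1e-9   # weights must be normalized

# Linear value model: sum of weight * (score / max_scale), landing in [-1, 1].
value = sum(w * s / 5.0 for w, s in criteria.values())
print(round(value, 3))
```

The negative scales added for comparative criteria show up here as negative scores that pull the aggregate value down; context-specific priorities would enter by re-weighting the criteria.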

  4. Comparison of beam position calculation methods for application in digital acquisition systems

    NASA Astrophysics Data System (ADS)

    Reiter, A.; Singh, R.

    2018-05-01

    Different approaches to the data analysis of beam position monitors in hadron accelerators are compared adopting the perspective of an analog-to-digital converter in a sampling acquisition system. Special emphasis is given to position uncertainty and robustness against bias and interference that may be encountered in an accelerator environment. In a time-domain analysis of data in the presence of statistical noise, the position calculation based on the difference-over-sum method with algorithms like signal integral or power can be interpreted as a least-squares analysis of a corresponding fit function. This link to the least-squares method is exploited in the evaluation of analysis properties and in the calculation of position uncertainty. In an analytical model and experimental evaluations the positions derived from a straight line fit or equivalently the standard deviation are found to be the most robust and to offer the least variance. The measured position uncertainty is consistent with the model prediction in our experiment, and the results of tune measurements improve significantly.
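The difference-over-sum method and its least-squares interpretation can be sketched as follows; the pulse shape, noise level, and monitor sensitivity constant k are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated pickup signals from two opposing BPM electrodes (arbitrary units).
t = np.linspace(0.0, 1.0, 512)
pulse = np.exp(-0.5 * ((t - 0.5) / 0.05) ** 2)      # bunch signal shape
true_pos = 1.2                                       # mm
k = 10.0                                             # mm, hypothetical sensitivity
a = (1 + true_pos / k) * pulse + 0.01 * rng.normal(size=t.size)
b = (1 - true_pos / k) * pulse + 0.01 * rng.normal(size=t.size)

# Difference-over-sum with the signal-integral algorithm:
pos_integral = k * (a.sum() - b.sum()) / (a.sum() + b.sum())

# Equivalent least-squares view: fit b = c * a and map the slope to a position.
c = (a @ b) / (a @ a)
pos_fit = k * (1 - c) / (1 + c)

print(pos_integral, pos_fit)
```

Both estimators recover the simulated position; the paper's point is that the fit-based (equivalently, standard-deviation) estimator is the more robust of such variants against bias and interference.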

  5. Barriers of inter-organisational integration in vocational rehabilitation.

    PubMed

    Wihlman, Ulla; Lundborg, Cecilia Stålsby; Axelsson, Runo; Holmström, Inger

    2008-06-19

    A project of vocational rehabilitation was studied in Sweden between 1999 and 2002. The project included four public organisations: the social insurance office, the local health services, the municipal social service and the office of the state employment service. The aim of this paper was to analyse perceived barriers in the development of inter-organisational integration, drawing on theories of inter-professional and inter-organisational integration and theories of organisational change. In total, 51 semi-structured interviews and 14 focus group discussions were performed with actors within the project between 1999 and 2002. A thematic approach was used for the analysis of the data. Three main themes of barriers emerged from the data: (A) uncertainty, (B) prioritising one's own organisation and (C) lack of communication. The themes are interconnected in an intricate web and hence not mutually exclusive. The barriers found are all related partly to organisational change in general and partly to the specific development of organisational integration. Prioritising one's own organisation led to flaws in communication, which in turn led to a high degree of uncertainty within the project. This can be seen as a circular relationship, since uncertainty might increase focus on one's own organisation and lack of communication. A way to overcome these barriers would be to take the needs of the clients as a point of departure in the development of joint services and to also involve them in the development of inter-organisational integration.

  6. Probabilistic Analysis of Solid Oxide Fuel Cell Based Hybrid Gas Turbine System

    NASA Technical Reports Server (NTRS)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2003-01-01

    The emergence of fuel cell systems and hybrid fuel cell systems requires the evolution of analysis strategies for evaluating thermodynamic performance. A gas turbine thermodynamic cycle integrated with a fuel cell was computationally simulated and probabilistically evaluated in view of the several uncertainties in the thermodynamic performance parameters. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the uncertainties in the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design and make it cost effective. The analysis leads to the selection of criteria for gas turbine performance.
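The probabilistic evaluation described here amounts to Monte Carlo sampling of uncertain performance parameters, followed by an empirical CDF and sensitivity factors. The sketch below uses an invented two-parameter efficiency expression, not the simulated cycle of the paper:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical hybrid SOFC/gas-turbine model: overall efficiency as a simple
# function of uncertain component parameters (illustrative only).
n = 10000
eta_fc  = rng.normal(0.50, 0.02, n)   # fuel-cell efficiency
eta_gt  = rng.normal(0.30, 0.02, n)   # gas-turbine efficiency
f_split = rng.normal(0.70, 0.03, n)   # fraction of fuel energy to the fuel cell

# Fuel cell converts its share; the turbine recovers part of the remainder.
eta_total = f_split * eta_fc + (1 - f_split) * eta_gt

# Empirical CDF and sensitivity factors (correlation with the output).
cdf_x = np.sort(eta_total)
cdf_y = np.arange(1, n + 1) / n
sens = {name: float(np.corrcoef(v, eta_total)[0, 1])
        for name, v in [("eta_fc", eta_fc), ("eta_gt", eta_gt), ("f_split", f_split)]}

print(np.mean(eta_total), max(sens, key=lambda k: abs(sens[k])))
```

The ranking of sensitivity factors is what identifies the most critical design variables; in this toy setup the fuel-cell efficiency dominates because its uncertainty is multiplied by the larger energy share.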

  7. Gradient-based model calibration with proxy-model assistance

    NASA Astrophysics Data System (ADS)

    Burrows, Wesley; Doherty, John

    2016-02-01

    Use of a proxy model in gradient-based calibration and uncertainty analysis of a complex groundwater model with long run times and problematic numerical behaviour is described. The methodology is general and can be used with models of all types. The proxy model is based on a series of analytical functions that link all model outputs used in the calibration process to all parameters requiring estimation. In enforcing history-matching constraints during the calibration and post-calibration uncertainty analysis processes, the proxy model is run for the purpose of populating the Jacobian matrix, while the original model is run when testing parameter upgrades; the latter process is readily parallelized. Use of a proxy model in this fashion dramatically reduces the computational burden of complex model calibration and uncertainty analysis. At the same time, the effect of model numerical misbehaviour on the calculation of local gradients is mitigated, thus allowing access to the benefits of gradient-based analysis where lack of integrity in finite-difference derivative calculations would otherwise have impeded such access. Construction of a proxy model, and its subsequent use in calibration of a complex model and in analysing the uncertainties of predictions made by that model, is implemented in the PEST suite.
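The division of labour between proxy and original model can be sketched in a few lines: the cheap proxy populates the Jacobian, while the expensive model supplies residuals and tests each upgrade. Both model functions below are invented stand-ins, not PEST:

```python
import numpy as np

# Stand-ins for a long-running model and its cheap analytical proxy.
def expensive_model(p):
    # mildly nonlinear "truth" model
    return np.array([2 * p[0] + 0.5 * p[1] + 0.1 * p[0] ** 2,
                     0.5 * p[0] + 2 * p[1]])

def proxy_model(p):
    # analytical surrogate: captures the dominant linear behaviour only
    return np.array([2 * p[0] + 0.5 * p[1],
                     0.5 * p[0] + 2 * p[1]])

obs = expensive_model(np.array([0.3, 0.8]))   # synthetic observations
p = np.zeros(2)

for _ in range(20):
    # Populate the Jacobian with cheap proxy runs (finite differences) ...
    J = np.zeros((2, 2))
    for j in range(2):
        dp = np.zeros(2); dp[j] = 1e-6
        J[:, j] = (proxy_model(p + dp) - proxy_model(p - dp)) / 2e-6
    # ... while the expensive model is reserved for residuals/upgrade testing.
    r = obs - expensive_model(p)
    p = p + np.linalg.solve(J, r)             # Gauss-Newton-style upgrade

print(p)   # converges to the calibrated parameters [0.3, 0.8]
```

An imperfect proxy Jacobian slows convergence but does not change the solution, because the residuals come from the original model; that is the property the methodology exploits.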

  8. Train integrity detection risk analysis based on PRISM

    NASA Astrophysics Data System (ADS)

    Wen, Yuan

    2018-04-01

    GNSS-based Train Integrity Monitoring Systems (TIMS) are an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as uncertainty of wireless communication channels, which may lead to failures of communication and positioning. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze the risk factors (in the GNSS communication process and the on-board communication process) and model them. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.
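PRISM model-checks formal probabilistic models; the kind of quantity it computes for such an analysis, e.g. the probability of reaching a failure state within k detection cycles of a discrete-time Markov chain, can be sketched directly. The states and transition probabilities below are invented, not the paper's model:

```python
import numpy as np

# Toy discrete-time Markov chain for the GNSS/on-board communication path,
# in the spirit of what PRISM model-checks (states and numbers are invented).
# States: 0 = OK, 1 = GNSS channel degraded, 2 = communication lost (absorbing).
P = np.array([
    [0.97, 0.02, 0.01],
    [0.30, 0.60, 0.10],
    [0.00, 0.00, 1.00],
])

# Probability of having reached "communication lost" within 10 detection cycles:
dist = np.array([1.0, 0.0, 0.0])   # start in OK
for _ in range(10):
    dist = dist @ P
p_loss_10 = float(dist[2])
print(round(p_loss_10, 4))
```

In PRISM this corresponds to a bounded-reachability property such as P=? [ F<=10 "lost" ]; the risk analysis then examines how this probability responds to the modelled risk factors.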

  9. Integrating telecare for chronic disease management in the community: What needs to be done?

    PubMed Central

    2011-01-01

    Background: Telecare could greatly facilitate chronic disease management in the community, but despite government promotion and positive demonstrations its implementation has been limited. This study aimed to identify factors inhibiting the implementation and integration of telecare systems for chronic disease management in the community. Methods: Large-scale comparative study employing qualitative data collection techniques: semi-structured interviews with key informants, task-groups, and workshops; framework analysis of qualitative data informed by Normalization Process Theory. Drawn from telecare services in community and domestic settings in England and Scotland, 221 participants were included, consisting of health professionals and managers; patients and carers; social care professionals and managers; and service suppliers and manufacturers. Results: Key barriers to telecare integration were uncertainties about coherent and sustainable service and business models; lack of coordination across social and primary care boundaries; lack of financial or other incentives to include telecare within primary care services; a lack of a sense of continuity with previous service provision and self-care work undertaken by patients; and general uncertainty about the adequacy of telecare systems. These problems led to poor integration of policy and practice. Conclusion: Telecare services may offer a cost-effective and safe form of care for some people living with chronic illness. Slow and uneven implementation and integration do not stem from problems of adoption. They result from incomplete understanding of the role of telecare systems and their subsequent adaptation and embedding in context, and uncertainties about the best way to develop, coordinate, and sustain services that assist with chronic disease management. Interventions are therefore needed that (i) reduce uncertainty about the ownership of implementation processes and that lock together health and social care agencies; and (ii) ensure user-centred rather than biomedical/service-centred models of care. PMID:21619596

  10. Consistency of Estimated Global Water Cycle Variations Over the Satellite Era

    NASA Technical Reports Server (NTRS)

    Robertson, F. R.; Bosilovich, M. G.; Roberts, J. B.; Reichle, R. H.; Adler, R.; Ricciardulli, L.; Berg, W.; Huffman, G. J.

    2013-01-01

    Motivated by the question of whether recent indications of decadal climate variability and a possible "climate shift" may have affected the global water balance, we examine evaporation minus precipitation (E-P) variability integrated over the global oceans and global land from three points of view: remotely sensed retrievals and objective analyses over the oceans, reanalysis vertically-integrated moisture convergence (MFC) over land, and land surface models forced with observations-based precipitation, radiation and near-surface meteorology. Because monthly variations in area-averaged atmospheric moisture storage are small and the global integral of moisture convergence must approach zero, area-integrated E-P over ocean should essentially equal precipitation minus evapotranspiration (P-ET) over land (after adjusting for ocean and land areas). Our analysis reveals considerable uncertainty in the decadal variations of ocean evaporation when integrated to global scales. This is due to differences among datasets in 10 m wind speed and near-surface atmospheric specific humidity (2 m qa) used in bulk aerodynamic retrievals. Precipitation variations, all relying substantially on passive microwave retrievals over ocean, still have uncertainties in decadal variability, but not to the degree present with ocean evaporation estimates. Reanalysis MFC and P-ET over land from several observationally forced diagnostic and land surface models agree best on interannual variations. However, upward MFC (i.e., P-ET) reanalysis trends are likely related in part to observing system changes affecting atmospheric assimilation models. While some evidence for a low-frequency E-P maximum near 2000 is found, consistent with a recent apparent pause in sea-surface temperature (SST) rise, uncertainties in the datasets used here remain significant. Prospects for further reducing uncertainties are discussed.
The results are interpreted in the context of recent climate variability (Pacific Decadal Oscillation, Atlantic Meridional Overturning), and efforts to distinguish these modes from longer-term trends.
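The balance constraint invoked above reduces to simple area-weighted bookkeeping. The sketch below uses round values for the ocean and land areas and an illustrative ocean-mean E-P, purely to show the arithmetic:

```python
# Global water-balance consistency check: because atmospheric moisture storage
# changes are small and global moisture convergence integrates to ~zero, the
# area-integrated (E - P) over ocean should match (P - ET) over land.
# All numbers are round illustrative values, not the paper's datasets.
A_ocean = 3.61e14   # m^2, global ocean area
A_land  = 1.49e14   # m^2, global land area

e_minus_p_ocean = 0.12   # m/yr, ocean-mean E - P (illustrative)

# Implied land-mean P - ET that closes the global balance:
p_minus_et_land = e_minus_p_ocean * A_ocean / A_land
print(round(p_minus_et_land, 3))   # m/yr
```

The same area ratio is the "adjusting for ocean and land areas" step mentioned in the abstract; disagreement between independently estimated sides of this identity is what exposes dataset uncertainty.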

  11. Integrating telecare for chronic disease management in the community: what needs to be done?

    PubMed

    May, Carl R; Finch, Tracy L; Cornford, James; Exley, Catherine; Gately, Claire; Kirk, Sue; Jenkings, K Neil; Osbourne, Janice; Robinson, A Louise; Rogers, Anne; Wilson, Robert; Mair, Frances S

    2011-05-27

    Telecare could greatly facilitate chronic disease management in the community, but despite government promotion and positive demonstrations its implementation has been limited. This study aimed to identify factors inhibiting the implementation and integration of telecare systems for chronic disease management in the community. Large scale comparative study employing qualitative data collection techniques: semi-structured interviews with key informants, task-groups, and workshops; framework analysis of qualitative data informed by Normalization Process Theory. Drawn from telecare services in community and domestic settings in England and Scotland, 221 participants were included, consisting of health professionals and managers; patients and carers; social care professionals and managers; and service suppliers and manufacturers. Key barriers to telecare integration were uncertainties about coherent and sustainable service and business models; lack of coordination across social and primary care boundaries, lack of financial or other incentives to include telecare within primary care services; a lack of a sense of continuity with previous service provision and self-care work undertaken by patients; and general uncertainty about the adequacy of telecare systems. These problems led to poor integration of policy and practice. Telecare services may offer a cost effective and safe form of care for some people living with chronic illness. Slow and uneven implementation and integration do not stem from problems of adoption. They result from incomplete understanding of the role of telecare systems and subsequent adaption and embeddedness to context, and uncertainties about the best way to develop, coordinate, and sustain services that assist with chronic disease management. 
Interventions are therefore needed that (i) reduce uncertainty about the ownership of implementation processes and that lock together health and social care agencies; and (ii) ensure user centred rather than biomedical/service-centred models of care.

  12. Development and application of a standardized flow measurement uncertainty analysis framework to various low-head short-converging intake types across the United States federal hydropower fleet

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Brennan T

    2015-01-01

    Turbine discharges at low-head short converging intakes are difficult to measure accurately. The proximity of the measurement section to the intake entrance admits large uncertainties related to asymmetry of the velocity profile, swirl, and turbulence. Existing turbine performance codes [10, 24] do not address this special case, and the published literature is largely silent on rigorous evaluation of uncertainties associated with this measurement context. The American Society of Mechanical Engineers (ASME) Committee investigated the use of acoustic transit time (ATT), acoustic scintillation (AS), and current meter (CM) methods in a short converging intake at the Kootenay Canal Generating Station in 2009. Based on their findings, a standardized uncertainty analysis (UA) framework for the velocity-area method (specifically for CM measurements) is presented in this paper, given that CM is still the most fundamental and common type of measurement system. Typical sources of systematic and random error associated with CM measurements are investigated, and the major sources of uncertainty associated with turbulence and velocity fluctuations, the numerical velocity integration technique (bi-cubic spline), and the number and placement of current meters are considered for evaluation. Since velocity measurements in a short converging intake are associated with complex nonlinear and time-varying uncertainties (e.g., Reynolds stress in fluid dynamics), simply applying the law of propagation of uncertainty is known to overestimate the measurement variance, while the Monte Carlo method does not. Therefore, a pseudo-Monte Carlo simulation method (the random flow generation technique [8]), initially developed for establishing upstream or initial conditions in Large-Eddy Simulation (LES) and Direct Numerical Simulation (DNS), is used to statistically determine the uncertainties associated with turbulence and velocity fluctuations. This technique is then combined with a bi-cubic spline interpolation method which converts point velocities into a continuous velocity distribution over the measurement domain. Subsequently, the number and placement of current meters are simulated to investigate the accuracy of the estimated flow rates using the numerical velocity-area integration method outlined in ISO 3354 [12]. The authors consider the statistics on generated flow rates processed with bi-cubic interpolation and sensor simulations to be combined uncertainties that already account for the effects of all three uncertainty sources. A preliminary analysis based on current meter data obtained through an upgrade acceptance test of a single unit located in a mainstem plant is presented.
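The velocity-area method with Monte Carlo perturbation of point velocities can be sketched as follows; a composite trapezoid rule stands in for the bi-cubic spline of ISO 3354, and the grid, velocity profile, and 5% turbulence noise are all invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical current-meter grid in a rectangular measurement section.
y = np.linspace(0.0, 10.0, 6)      # m, across the intake
z = np.linspace(0.0, 8.0, 5)       # m, over the depth
Y, Z = np.meshgrid(y, z, indexing="ij")
v = 2.0 * (1 - ((Y - 5.0) / 5.0) ** 2) * (Z / 8.0) ** (1.0 / 7.0)  # m/s profile

def trap(vals, x):
    # composite trapezoid rule along one axis
    return float(np.sum((vals[1:] + vals[:-1]) / 2.0 * np.diff(x)))

def discharge(vfield):
    # velocity-area method: integrate v over the section (trapezoid stands in
    # for the bi-cubic spline interpolation of ISO 3354)
    rows = np.array([trap(vfield[i], z) for i in range(y.size)])
    return trap(rows, y)

q0 = discharge(v)   # m^3/s

# Turbulence/velocity-fluctuation uncertainty: perturb each point velocity
# (5% relative noise assumed) and propagate by Monte Carlo.
qs = np.array([discharge(v * (1 + 0.05 * rng.normal(size=v.shape)))
               for _ in range(2000)])
print(q0, qs.std())
```

Repeating the experiment with rows or columns of meters removed is the "number and placement" study described in the abstract; the spread of `qs` is the combined-uncertainty statistic.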

  13. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    NASA Technical Reports Server (NTRS)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. 
A regularization strategy based on principal component analysis is proposed to suppress the multicollinearities in order to make these Jacobians robust and physically meaningful.
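The Monte Carlo treatment of Jacobian robustness can be sketched as follows; the network size, weights, and posterior spread below are invented for illustration, not taken from the article:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-input, 3-hidden-unit tanh network with hypothetical "trained" weights.
W1 = rng.normal(size=(3, 2)); b1 = rng.normal(size=3)
W2 = rng.normal(size=(1, 3)); b2 = rng.normal(size=1)

def jacobian(W1, b1, W2, x):
    """Analytic d(output)/d(input) of y = W2 tanh(W1 x + b1) + b2."""
    h = np.tanh(W1 @ x + b1)
    return (W2 * (1.0 - h**2)) @ W1        # shape (1, 2); b2 drops out

x = np.array([0.5, -1.0])
sigma = 0.05   # assumed posterior standard deviation of each weight

# Monte Carlo over weight uncertainty: perturb the weights, collect Jacobians.
draws = np.array([
    jacobian(W1 + sigma * rng.normal(size=W1.shape),
             b1 + sigma * rng.normal(size=b1.shape),
             W2 + sigma * rng.normal(size=W2.shape), x).ravel()
    for _ in range(2000)
])

J_mean, J_std = draws.mean(axis=0), draws.std(axis=0)  # error bars on Jacobian
```

A Jacobian entry whose spread `J_std` is large relative to `J_mean` signals a sensitivity that is not robust to weight uncertainty.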

  14. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation

    USGS Publications Warehouse

    Zajac, Zuzanna; Stith, Bradley M.; Bowling, Andrea C.; Langtimm, Catherine A.; Swain, Eric D.

    2015-01-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. 
Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust decisions.
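A minimal sketch of the variance-based GSA idea, using a toy suitability index and a crude binning estimator of the first-order sensitivity Var(E[Y|X_i])/Var(Y); the model and its coefficients are hypothetical, not the paper's SAV models:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000

# Hypothetical normalized suitability inputs (not the paper's actual inputs).
inputs = {k: rng.uniform(0, 1, n) for k in ("depth", "salinity", "light")}

def hsi(v):
    """Toy habitat-suitability index, dominated by depth by construction."""
    return 0.60 * v["depth"] + 0.25 * v["salinity"] + 0.15 * v["light"]

y = hsi(inputs)

def first_order_index(x, y, bins=20):
    """Crude first-order sensitivity: Var(E[Y|X]) / Var(Y) via binning."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    which = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, bins - 1)
    cond = np.array([y[which == b].mean() for b in range(bins)])
    return cond.var() / y.var()

S = {k: first_order_index(v, y) for k, v in inputs.items()}
```

The ranking of `S` identifies which input contributes most to output variance, which is the information a GSA supplies to the UA.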

  15. Multimodel Simulation of Water Flow: Uncertainty Analysis

    USDA-ARS?s Scientific Manuscript database

    Simulations of soil water flow require measurements of soil hydraulic properties which are particularly difficult at the field scale. Laboratory measurements provide hydraulic properties at scales finer than the field scale, whereas pedotransfer functions (PTFs) integrate information on hydraulic pr...

  16. Risk Analysis for Resource Planning Optimization

    NASA Technical Reports Server (NTRS)

    Chueng, Kar-Ming

    2008-01-01

The main purpose of this paper is to introduce a risk management approach that allows planners to quantify the risk and efficiency tradeoff in the presence of uncertainties, and to make forward-looking choices in the development and execution of the plan. We demonstrate a planning and risk analysis framework that tightly integrates mathematical optimization, empirical simulation, and theoretical analysis techniques to solve complex problems.

  17. Uncertainty assessment of urban pluvial flood risk in a context of climate change adaptation decision making

    NASA Astrophysics Data System (ADS)

    Arnbjerg-Nielsen, Karsten; Zhou, Qianqian

    2014-05-01

There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled the development of economic assessments of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from basic assumptions in the economic analysis and the hydrological model, but also from projections of future societies, of local climate change impacts, and of suitable adaptation options. This presents a challenge to decision makers when trying to identify robust measures. We present an integrated uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver of risk changes over time. The overall uncertainty is then attributed to six bulk processes: climate change impact, urban rainfall-runoff processes, stage-depth functions, unit cost of repair, cost of adaptation measures, and discount rate. We apply the approach to an urban hydrological catchment in Odense, Denmark, and find that the uncertainty on the climate change impact appears to have the least influence on the net present value of the studied adaptation measures. This does not imply that the climate change impact is not important, but that its uncertainties are not dominant when deciding between action and inaction. We then consider the uncertainty related to choosing between adaptation options given that a decision of action has been taken. 
In this case the major part of the uncertainty on the estimated net present values is identical for all adaptation options and will therefore not affect a comparison between adaptation measures. This makes the choice among the options easier. Furthermore, the explicit attribution of uncertainty also enables a reduction of the overall uncertainty by identifying the processes which contribute the most. This knowledge can then be used to further reduce the uncertainty related to decision making, as a substantial part of the remaining uncertainty is epistemic.
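The uncertainty-cascade attribution can be sketched as a Monte Carlo over the six bulk processes; all distributions and the toy net-present-value model below are assumed for illustration and are not the Odense study's values:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000

# Hypothetical distributions for the six bulk uncertainty sources.
priors = {
    "climate_factor": lambda n: rng.normal(1.3, 0.15, n),  # rainfall amplification
    "runoff_factor":  lambda n: rng.normal(1.0, 0.10, n),
    "stage_depth":    lambda n: rng.normal(1.0, 0.20, n),
    "unit_cost":      lambda n: rng.normal(1.0, 0.25, n),
    "adapt_cost":     lambda n: rng.normal(5.0, 0.50, n),  # cost of the measure
    "discount_rate":  lambda n: rng.uniform(0.01, 0.05, n),
}

def npv(s):
    """Toy net present value of an adaptation measure over 50 years."""
    avoided_damage = 0.8 * s["climate_factor"] * s["runoff_factor"] \
                     * s["stage_depth"] * s["unit_cost"]   # avoided damage / year
    r = s["discount_rate"]
    annuity = (1.0 - (1.0 + r) ** -50) / r
    return avoided_damage * annuity - s["adapt_cost"]

base = {k: f(n) for k, f in priors.items()}
total_var = npv(base).var()

# Attribute variance: let one source vary while freezing the rest at their means.
shares = {}
for k in priors:
    frozen = {kk: np.full(n, base[kk].mean()) for kk in priors}
    frozen[k] = base[k]
    shares[k] = npv(frozen).var() / total_var
```

Comparing `shares` across the six processes is the attribution step; sources with small shares (here, by construction, the adaptation cost) can be set aside when ranking options.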

  18. Integrated water assessment and modelling: A bibliometric analysis of trends in the water resource sector

    NASA Astrophysics Data System (ADS)

    Zare, Fateme; Elsawah, Sondoss; Iwanaga, Takuya; Jakeman, Anthony J.; Pierce, Suzanne A.

    2017-09-01

There are substantial challenges facing humanity in the water and related sectors and purposeful integration of the disciplines, connected sectors and interest groups is now perceived as essential to address them. This article describes and uses bibliometric analysis techniques to provide quantitative insights into the general landscape of Integrated Water Resource Assessment and Modelling (IWAM) research over the last 45 years. Keywords, terms in titles, abstracts and the full texts are used to distinguish the 13,239 IWAM articles in journals and other non-grey literature. We identify the major journals publishing IWAM research, influential authors through citation counts, as well as the distribution and strength of source countries. Encouragingly, we find that the growth in numbers of such publications has continued to accelerate, and attention to both the biophysical and socioeconomic aspects has also been growing. On the other hand, our analysis strongly indicates that the former continue to dominate, partly by embracing integration with other biophysical sectors related to water - environment, groundwater, ecology, climate change and agriculture. In the social sciences the integration is occurring predominantly through economics, with the others, including law, policy and stakeholder participation, much diminished in comparison. We find there has been increasing attention to management and decision support systems, but a much weaker focus on uncertainty, a pervasive concern whose criticalities must be identified and managed for improving decision making. It would seem that interdisciplinary science still has a long way to go before crucial integration with the non-economic social sciences and uncertainty considerations are achieved more routinely.

  19. Case studies in Bayesian microbial risk assessments.

    PubMed

    Kennedy, Marc C; Clough, Helen E; Turner, Joanne

    2009-12-21

    The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). 
In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
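The forward Monte Carlo propagation in the first case study can be illustrated with a toy exposure model; the exponential dose-response form, the parameter distributions, and the event counts below are all hypothetical stand-ins, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(3)

n_mc = 5000                                  # uncertainty samples
exposures = rng.lognormal(2.0, 1.5, 1000)    # hypothetical doses per event
events_per_year = 50000                      # assumed consumption events

# Parameter uncertainty: exponential dose-response P(ill) = 1 - exp(-r d),
# with r given an illustrative lognormal "posterior".
r_samples = rng.lognormal(np.log(1e-3), 0.5, n_mc)

yearly = np.empty(n_mc)
for i, r in enumerate(r_samples):
    p = 1.0 - np.exp(-r * exposures)         # illness risk per surveyed event
    yearly[i] = events_per_year * p.mean()   # expected cases per year

lo, mid, hi = np.percentile(yearly, [2.5, 50.0, 97.5])
```

The spread between `lo` and `hi` is the propagated uncertainty interval on yearly incidence, analogous to the (0, 11.5) interval reported in the abstract.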

  20. Situational Favorability and Perceived Environmental Uncertainty: An Integrative Approach

    ERIC Educational Resources Information Center

    Nebeker, Delbert M.

    1975-01-01

    Presents the conceptual and empirical basis for a possible combining of Fiedler's contingency model of leadership effectiveness and Lawrence and Lorsch's contingency organization theory. Using perceived environmental uncertainty as the integrating concept, a measure of decision uncertainty was found to be significantly related to Fiedler's…

  1. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    NASA Technical Reports Server (NTRS)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  2. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    PubMed

    Milanović, Jovica V

    2017-08-13

Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  3. Intra-Hour Dispatch and Automatic Generator Control Demonstration with Solar Forecasting - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coimbra, Carlos F. M.

    2016-02-25

In this project we address multiple resource integration challenges associated with increasing levels of solar penetration that arise from the variability and uncertainty in solar irradiance. We will model the SMUD service region as its own balancing region, and develop an integrated, real-time operational tool that takes solar-load forecast uncertainties into consideration and commits optimal energy resources and reserves for intra-hour and intra-day decisions. The primary objectives of this effort are to reduce power system operation cost by committing an appropriate amount of energy resources and reserves, as well as to provide operators a prediction of the generation fleet's behavior in real time for realistic PV penetration scenarios. The proposed methodology includes the following steps: clustering analysis on the expected solar variability per region for the SMUD system, day-ahead (DA) and real-time (RT) load forecasts for the entire service area, 1 year of intra-hour CPR forecasts for cluster centers, 1 year of smart re-forecasting CPR forecasts in real time for determination of irreducible errors, and uncertainty quantification of integrated solar-load for both distributed and central-station (selected locations within the service region) PV generation.

  4. Insights into water managers' perception and handling of uncertainties - a study of the role of uncertainty in practitioners' planning and decision-making

    NASA Astrophysics Data System (ADS)

    Höllermann, Britta; Evers, Mariele

    2017-04-01

Planning and decision-making under uncertainty is common in water management due to climate variability, simplified models, societal developments, and planning restrictions, to name a few. Dealing with uncertainty can be approached from two sides, each affecting the process and form of communication: either improve the knowledge base by reducing uncertainties, or apply risk-based approaches that acknowledge uncertainties throughout the management process. Current understanding is that science focuses more strongly on the former approach, while policy and practice more actively apply a risk-based approach to handle incomplete and/or ambiguous information. The focus of this study is on how water managers perceive and handle uncertainties at the knowledge/decision interface in their daily planning and decision-making routines: how they evaluate the role of uncertainties for their decisions and how they integrate this information into the decision-making process. Expert interviews and questionnaires among practitioners and scientists provided insight into their perspectives on uncertainty handling, allowing a comparison of strategies between science and practice as well as between different types of practitioners. Our results confirmed the practitioners' bottom-up approach, working from potential measures upwards instead of from impact assessment downwards as is common in science-based approaches. This science-practice gap may hinder effective integration and acknowledgement of uncertainty in final decisions. Additionally, the implementation of an adaptive and flexible management approach acknowledging uncertainties is often stalled by rigid regulations favouring a predict-and-control attitude. However, the study showed that practitioners' level of uncertainty recognition varies with their employer type and business unit, hence affecting the degree of the science-practice gap with respect to uncertainty recognition. 
The level of working experience was examined as a cross-cutting property of science and practice, with increasing levels of uncertainty awareness and integration among more experienced researchers and practitioners. In conclusion, our study of water managers' perception and handling of uncertainties provides valuable insights for finding routines for uncertainty communication and integration into planning and decision-making processes by acknowledging the diverse perceptions among producers, users and receivers of uncertainty information. These results can contribute to more effective integration of hydrological forecasts and improved decisions.

  5. Uncertainty-based tradeoff analysis methodology for integrated transportation investment decision-making.

    DOT National Transportation Integrated Search

    2009-10-28

    Transportation agencies strive to maintain their systems in good condition and also to provide : acceptable levels of service to users. However, funding is often inadequate to meet the needs : of system preservation and expansion, and thus performanc...

  6. Probabilistic projections of 21st century climate change over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, E.; Sokolov, A. P.; Schlosser, C. A.; Scott, J. R.; Gao, X.

    2013-12-01

We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an earth system model of intermediate complexity, with a two-dimensional zonal-mean atmosphere, to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three dimensional atmospheric model; and a statistical downscaling, where a pattern scaling algorithm uses climate-change patterns from 17 climate models. This framework allows for key sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections; climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate); natural variability; and structural uncertainty. Results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider all sources of uncertainty when modeling climate impacts over Northern Eurasia.

  7. Probabilistic projections of 21st century climate change over Northern Eurasia

    NASA Astrophysics Data System (ADS)

    Monier, Erwan; Sokolov, Andrei; Schlosser, Adam; Scott, Jeffery; Gao, Xiang

    2013-12-01

    We present probabilistic projections of 21st century climate change over Northern Eurasia using the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity with a two-dimensional zonal-mean atmosphere to a human activity model. Regional climate change is obtained by two downscaling methods: a dynamical downscaling, where the IGSM is linked to a three-dimensional atmospheric model, and a statistical downscaling, where a pattern scaling algorithm uses climate change patterns from 17 climate models. This framework allows for four major sources of uncertainty in future projections of regional climate change to be accounted for: emissions projections, climate system parameters (climate sensitivity, strength of aerosol forcing and ocean heat uptake rate), natural variability, and structural uncertainty. The results show that the choice of climate policy and the climate parameters are the largest drivers of uncertainty. We also find that different initial conditions lead to differences in patterns of change as large as when using different climate models. Finally, this analysis reveals the wide range of possible climate change over Northern Eurasia, emphasizing the need to consider these sources of uncertainty when modeling climate impacts over Northern Eurasia.
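The statistical downscaling step (pattern scaling) amounts to multiplying a fixed regional response pattern by a global-mean warming trajectory; the grid, pattern, and trajectory below are toy values, not IGSM output:

```python
import numpy as np

rng = np.random.default_rng(4)

# Pattern scaling: regional change ≈ (local response per kelvin of global-mean
# warming, taken from a GCM) × global-mean warming from the simple model.
n_lat, n_lon = 10, 20
pattern = 1.0 + rng.normal(0.5, 0.2, (n_lat, n_lon))  # K per K global (toy)

years = np.arange(2006, 2101)
global_dT = 0.03 * (years - 2005)                     # toy global trajectory, K

regional_dT = global_dT[:, None, None] * pattern[None, :, :]
```

Swapping in patterns from each of the 17 climate models, while keeping the same global trajectories, is what lets the framework separate structural uncertainty from the other sources.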

  8. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xuesong; Liang, Faming; Yu, Beibei

    2011-11-09

Estimating uncertainty of hydrologic forecasting is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have proven to be powerful tools for quantifying uncertainty of streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework to incorporate the uncertainties associated with input, model structure, and parameter into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform the BNNs that only consider uncertainties associated with parameter and model structure. Critical evaluation of the posterior distribution of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of the characteristics of different uncertainty sources and including output error into the MCMC framework are expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting.
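A minimal sketch of the MCMC idea: a random-walk Metropolis sampler over a one-neuron network's weights plus a rainfall (input) multiplier. All data and settings are synthetic stand-ins for the paper's BNN framework. Note that the weight w and multiplier m enter only through their product, so the posterior has a ridge, one reason such posteriors need careful evaluation:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic "streamflow" data generated with an unknown input bias of 1.2
# (all names and values here are illustrative, not the paper's setup).
x_obs = rng.uniform(0.0, 1.0, 40)
y_obs = np.tanh(2.0 * (1.2 * x_obs) - 1.0) + rng.normal(0.0, 0.05, 40)

def log_post(theta):
    """Log-posterior of [w, b, m]: one-neuron net with input multiplier m."""
    w, b, m = theta
    resid = y_obs - np.tanh(w * (m * x_obs) + b)
    return -0.5 * np.sum(resid**2) / 0.05**2 - 0.5 * np.sum(theta**2) / 10.0

# Random-walk Metropolis over the weights and the input multiplier jointly.
theta = np.array([1.0, 0.0, 1.0])
lp = log_post(theta)
chain = []
for _ in range(20000):
    proposal = theta + 0.05 * rng.normal(size=3)
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance
        theta, lp = proposal, lp_prop
    chain.append(theta)
chain = np.asarray(chain)[5000:]               # discard burn-in

# w and m are identified only through their product w*m (posterior ridge).
wm = chain[:, 0] * chain[:, 2]
```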

  9. Determination of the fine structure constant based on BLOCH oscillations of ultracold atoms in a vertical optical lattice.

    PubMed

    Cladé, Pierre; de Mirandes, Estefania; Cadoret, Malo; Guellati-Khélifa, Saïda; Schwob, Catherine; Nez, François; Julien, Lucile; Biraben, François

    2006-01-27

We report an accurate measurement of the recoil velocity of 87Rb atoms based on Bloch oscillations in a vertical accelerated optical lattice. We transfer about 900 recoil momenta with an efficiency of 99.97% per recoil. A set of 72 measurements of the recoil velocity, each one with a relative uncertainty of about 33 ppb in 20 min integration time, leads to a determination of the fine structure constant with a statistical relative uncertainty of 4.4 ppb. The detailed analysis of the different systematic errors yields a relative uncertainty of 6.7 ppb. The deduced value of alpha-1 is 137.035 998 78(91).
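The statistical and total uncertainties quoted in the abstract combine in quadrature if the error sources are assumed independent; this back-of-the-envelope check recovers the implied systematic component:

```python
import math

stat = 4.4    # statistical relative uncertainty in alpha, ppb (from the abstract)
total = 6.7   # combined relative uncertainty, ppb (from the abstract)

# Assuming independent statistical and systematic contributions combined in
# quadrature, the implied systematic component is:
syst = math.sqrt(total**2 - stat**2)    # ≈ 5.1 ppb
```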

  10. Ku-band antenna acquisition and tracking performance study, volume 4

    NASA Technical Reports Server (NTRS)

    Huang, T. C.; Lindsey, W. C.

    1977-01-01

The results pertaining to the tradeoff analysis and performance of the Ku-band shuttle antenna pointing and signal acquisition system are presented. The square, hexagonal and spiral antenna trajectories were investigated assuming the TDRS postulated uncertainty region and a flexible statistical model for the location of the TDRS within the uncertainty volume. The scanning trajectories, shuttle/TDRS signal parameters and dynamics, and three signal acquisition algorithms were integrated into a hardware simulation. The hardware simulation is quite flexible in that it allows for the evaluation of signal acquisition performance for an arbitrary (programmable) antenna pattern, a large range of C/N0 values, various TDRS/shuttle a priori uncertainty distributions, and three distinct signal search algorithms.

  11. Formulation of an integrated robust design and tactics optimization process for undersea weapon systems

    NASA Astrophysics Data System (ADS)

    Frits, Andrew P.

    In the current Navy environment of undersea weapons development, the engineering aspect of design is decoupled from the development of the tactics with which the weapon is employed. Tactics are developed by intelligence experts, warfighters, and wargamers, while torpedo design is handled by engineers and contractors. This dissertation examines methods by which the conceptual design process of undersea weapon systems, including both torpedo systems and mine counter-measure systems, can be improved. It is shown that by simultaneously designing the torpedo and the tactics with which undersea weapons are used, a more effective overall weapon system can be created. In addition to integrating torpedo tactics with design, the thesis also looks at design methods to account for uncertainty. The uncertainty is attributable to multiple sources, including: lack of detailed analysis tools early in the design process, incomplete knowledge of the operational environments, and uncertainty in the performance of potential technologies. A robust design process is introduced to account for this uncertainty in the analysis and optimization of torpedo systems through the combination of Monte Carlo simulation with response surface methodology and metamodeling techniques. Additionally, various other methods that are appropriate to uncertainty analysis are discussed and analyzed. The thesis also advances a new approach towards examining robustness and risk: the treatment of probability of success (POS) as an independent variable. Examining the cost and performance tradeoffs between high and low probability of success designs, the decision-maker can make better informed decisions as to what designs are most promising and determine the optimal balance of risk, cost, and performance. Finally, the thesis examines the use of non-dimensionalization of parameters for torpedo design. 
The thesis shows that the use of non-dimensional torpedo parameters leads to increased knowledge about the scalability of torpedo systems and to more efficient designs of experiments.
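The robust-design combination of a response-surface metamodel with Monte Carlo simulation can be sketched as follows; the performance function, design points, and input uncertainties are toy assumptions, not the dissertation's models:

```python
import numpy as np

rng = np.random.default_rng(6)

def expensive_model(x1, x2):
    """Stand-in for a costly torpedo-performance simulation (toy function)."""
    return 10.0 + 3.0 * x1 - 2.0 * x2 + 0.5 * x1 * x2

# Design of experiments: a small factorial sample of the input space.
g1, g2 = np.meshgrid(np.linspace(-1, 1, 4), np.linspace(-1, 1, 4))
x1, x2 = g1.ravel(), g2.ravel()
y = expensive_model(x1, x2)

# Fit a quadratic response surface (metamodel) by least squares.
def basis(a, b):
    return np.column_stack([np.ones_like(a), a, b, a * b, a**2, b**2])

coef, *_ = np.linalg.lstsq(basis(x1, x2), y, rcond=None)

# Monte Carlo on the cheap surrogate instead of the expensive model.
u1 = rng.normal(0.0, 0.3, 10000)
u2 = rng.normal(0.0, 0.3, 10000)
perf = basis(u1, u2) @ coef
p_success = float((perf > 9.0).mean())   # probability of meeting a requirement
```

Treating `p_success` as a design variable in its own right, rather than a by-product, is the probability-of-success tradeoff the thesis advances.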

  12. An uncertainty and sensitivity analysis approach for GIS-based multicriteria landslide susceptibility mapping

    PubMed Central

    Feizizadeh, Bakhtiar; Blaschke, Thomas

    2014-01-01

    GIS-based multicriteria decision analysis (MCDA) methods are increasingly being used in landslide susceptibility mapping. However, the uncertainties that are associated with MCDA techniques may significantly impact the results. This may sometimes lead to inaccurate outcomes and undesirable consequences. This article introduces a new GIS-based MCDA approach. We illustrate the consequences of applying different MCDA methods within a decision-making process through uncertainty analysis. Three GIS-MCDA methods in conjunction with Monte Carlo simulation (MCS) and Dempster–Shafer theory are analyzed for landslide susceptibility mapping (LSM) in the Urmia lake basin in Iran, which is highly susceptible to landslide hazards. The methodology comprises three stages. First, the LSM criteria are ranked and a sensitivity analysis is implemented to simulate error propagation based on the MCS. The resulting weights are expressed through probability density functions. Accordingly, within the second stage, three MCDA methods, namely analytical hierarchy process (AHP), weighted linear combination (WLC) and ordered weighted average (OWA), are used to produce the landslide susceptibility maps. In the third stage, accuracy assessments are carried out and the uncertainties of the different results are measured. We compare the accuracies of the three MCDA methods based on (1) the Dempster–Shafer theory and (2) a validation of the results using an inventory of known landslides and their respective coverage based on object-based image analysis of IRS-ID satellite images. The results of this study reveal that through the integration of GIS and MCDA models, it is possible to identify strategies for choosing an appropriate method for LSM. Furthermore, our findings indicate that the integration of MCDA and MCS can significantly improve the accuracy of the results. In LSM, the AHP method performed best, while the OWA reveals better performance in the reliability assessment. 
The WLC operation yielded poor results. PMID:27019609
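The difference between the WLC and OWA aggregations compared in this study can be shown in a few lines; the scores and weights below are invented for illustration:

```python
import numpy as np

# Normalized susceptibility scores of four sites on three criteria (toy values).
scores = np.array([[0.9, 0.2, 0.6],
                   [0.5, 0.5, 0.5],
                   [0.1, 0.8, 0.7],
                   [0.3, 0.9, 0.2]])
weights = np.array([0.5, 0.3, 0.2])   # criterion weights (sum to 1)

# WLC: a fixed weight per criterion.
wlc = scores @ weights

# OWA: order weights applied to each site's *sorted* criterion values; putting
# most weight on the lowest value gives a pessimistic (risk-averse) aggregation.
order_w = np.array([0.6, 0.3, 0.1])   # attached to ascending sorted scores
owa = np.sort(scores, axis=1) @ order_w
```

Sampling `weights` from probability density functions, as in the MCS stage of the paper, turns each of these scores into a distribution rather than a single number.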

  13. Object-Oriented MDAO Tool with Aeroservoelastic Model Tuning Capability

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Li, Wesley; Lung, Shun-fat

    2008-01-01

An object-oriented multi-disciplinary analysis and optimization (MDAO) tool has been developed at the NASA Dryden Flight Research Center to automate the design and analysis process and leverage existing commercial as well as in-house codes to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic and hypersonic aircraft. Once the structural analysis discipline is finalized and integrated completely into the MDAO process, other disciplines such as aerodynamics and flight controls will be integrated as well. Simple and efficient model tuning capabilities, formulated as optimization problems, have been successfully integrated with the MDAO tool. Better synchronization of all phases of experimental testing (ground and flight), analytical model updating, high-fidelity simulation for model validation, and integrated design may reduce uncertainties in the aeroservoelastic model and increase flight safety.

  14. Assessment of flood susceptible areas using spatially explicit, probabilistic multi-criteria decision analysis

    NASA Astrophysics Data System (ADS)

    Tang, Zhongqian; Zhang, Hua; Yi, Shanzhen; Xiao, Yangfan

    2018-03-01

GIS-based multi-criteria decision analysis (MCDA) is increasingly used to support flood risk assessment. However, conventional GIS-MCDA methods fail to adequately represent spatial variability and carry considerable uncertainty. It is, thus, important to incorporate spatial variability and uncertainty into GIS-based decision analysis procedures. This research develops a spatially explicit, probabilistic GIS-MCDA approach for the delineation of potentially flood susceptible areas. The approach integrates the probabilistic and the local ordered weighted averaging (OWA) methods via Monte Carlo simulation, to take into account the uncertainty related to criteria weights, spatial heterogeneity of preferences and the risk attitude of the analyst. The approach is applied to a pilot study for the Gucheng County, central China, heavily affected by the hazardous 2012 flood. A GIS database of six geomorphological and hydrometeorological factors for the evaluation of susceptibility was created. Moreover, uncertainty and sensitivity analyses were performed to investigate the robustness of the model. The results indicate that the ensemble method improves the robustness of the model outcomes with respect to variation in criteria weights and identifies which criteria weights are most responsible for the variability of model outcomes. Therefore, the proposed approach is an improvement over the conventional deterministic method and can provide a more rational, objective and unbiased tool for flood susceptibility evaluation.

  15. Application of the JENDL-4.0 nuclear data set for uncertainty analysis of the prototype FBR Monju

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamagno, P.; Van Rooijen, W. F. G.; Takeda, T.

    2012-07-01

    This paper deals with uncertainty analysis of the Monju reactor using JENDL-4.0 and the ERANOS code [1]. In 2010 the Japan Atomic Energy Agency (JAEA) released the JENDL-4.0 nuclear data set. This new evaluation contains improved values of cross-sections and emphasizes accurate covariance matrices. Also in 2010, JAEA restarted the sodium-cooled fast reactor prototype Monju after about 15 years of shutdown. The long shutdown resulted in a build-up of {sup 241}Am by natural decay from the initially loaded Pu. As well as improved covariance matrices, JENDL-4.0 is reported to contain improved data for minor actinides [2]. The choice of the Monju reactor as an application of the new evaluation is therefore particularly relevant. The uncertainty analysis requires the determination of sensitivity coefficients. The well-established ERANOS code was chosen because of its integrated modules that allow users to perform sensitivity and uncertainty analysis. A JENDL-4.0 cross-section library is not available for ERANOS; therefore, a cross-section library had to be made from the original ENDF files for the ECCO cell code (part of ERANOS). To confirm the newly made library, calculations of a benchmark core were performed. These calculations used the MZA and MZB benchmarks and showed results consistent with other libraries. Calculations for the Monju reactor were performed using hexagonal 3D geometry and PN transport theory. However, the ERANOS sensitivity modules cannot use the resulting fluxes, as these modules require finite-difference-based fluxes obtained from RZ SN-transport or 3D diffusion calculations. The corresponding geometrical models have been made and the results verified against Monju restart experimental data [4]. Uncertainty analysis was performed using the RZ model.
    JENDL-4.0 uncertainty analysis showed a significant reduction of the uncertainty related to the fission cross-section of Pu, along with an increase of the uncertainty related to the capture cross-section of {sup 238}U, compared with the previous JENDL-3.3 version. Covariance data recently added in JENDL-4.0 for {sup 241}Am appear to have a non-negligible contribution. (authors)

  16. A Comparison of Probabilistic and Deterministic Campaign Analysis for Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Merrill, R. Gabe; Andraschko, Mark; Stromgren, Chel; Cirillo, Bill; Earle, Kevin; Goodliff, Kandyce

    2008-01-01

    Human space exploration is by its very nature an uncertain endeavor. Vehicle reliability, technology development risk, budgetary uncertainty, and launch uncertainty all contribute to stochasticity in an exploration scenario. However, traditional strategic analysis has been done in a deterministic manner, analyzing and optimizing the performance of a series of planned missions. History has shown that exploration scenarios rarely follow such a planned schedule. This paper describes a methodology to integrate deterministic and probabilistic analysis of scenarios in support of human space exploration. Probabilistic strategic analysis is used to simulate "possible" scenario outcomes, based upon the likelihood of occurrence of certain events and a set of pre-determined contingency rules. The results of the probabilistic analysis are compared to the nominal results from the deterministic analysis to evaluate the robustness of the scenario to adverse events and to test and optimize contingency planning.

  17. Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians

    NASA Astrophysics Data System (ADS)

    Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von

    2008-03-01

    Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments. In doing so, IDA addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions, allowing the comparison and integration of results from different diagnostics. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA's capabilities for nonlinear error propagation, the inclusion of systematic effects, and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. In order to cope with the requirements of next-step fusion devices, appropriate techniques are being explored for fast analysis applications.
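The core mechanics of combining diagnostics in a Bayesian way can be illustrated with a toy two-diagnostic example: for independent Gaussian likelihoods of a common parameter (and a flat prior), the precisions add. The numbers and the diagnostic pairing are purely illustrative, not taken from the paper.

```python
def combine_gaussian(mu1, s1, mu2, s2):
    """Joint posterior for one parameter measured by two independent
    diagnostics with Gaussian likelihoods and a flat prior."""
    prec = 1.0 / s1**2 + 1.0 / s2**2          # precisions add
    mu = (mu1 / s1**2 + mu2 / s2**2) / prec   # precision-weighted mean
    return mu, prec ** -0.5

# Illustrative: the same plasma quantity estimated by two diagnostics
mu, sigma = combine_gaussian(3.2, 0.4, 2.9, 0.2)
# The joint estimate is pulled toward the more precise diagnostic, and
# sigma is smaller than either input uncertainty.
```

A large discrepancy between the two means relative to their combined width is also the natural trigger for the outlier and faulty-measurement checks mentioned above.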

  18. Fast integration-based prediction bands for ordinary differential equation models.

    PubMed

    Hass, Helge; Kreutz, Clemens; Timmer, Jens; Kaschek, Daniel

    2016-04-15

    To gain a deeper understanding of biological processes and their relevance in disease, mathematical models are built upon experimental data. Uncertainty in the data leads to uncertainties of the model's parameters and in turn to uncertainties of predictions. Mechanistic dynamic models of biochemical networks are frequently based on nonlinear differential equation systems and feature a large number of parameters, sparse observations of the model components and lack of information in the available data. Due to the curse of dimensionality, classical and sampling approaches propagating parameter uncertainties to predictions are hardly feasible and insufficient. However, for experimental design and to discriminate between competing models, prediction and confidence bands are essential. To circumvent the hurdles of the former methods, an approach to calculate a profile likelihood on arbitrary observations for a specific time point has been introduced, which provides accurate confidence and prediction intervals for nonlinear models and is computationally feasible for high-dimensional models. In this article, reliable and smooth point-wise prediction and confidence bands to assess the model's uncertainty on the whole time-course are achieved via explicit integration with elaborate correction mechanisms. The corresponding system of ordinary differential equations is derived and tested on three established models for cellular signalling. An efficiency analysis is performed to illustrate the computational benefit compared with repeated profile likelihood calculations at multiple time points. The integration framework and the examples used in this article are provided with the software package Data2Dynamics, which is based on MATLAB and freely available at http://www.data2dynamics.org. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved.

  19. Methods for forecasting freight in uncertainty : time series analysis of multiple factors.

    DOT National Transportation Integrated Search

    2011-01-31

    The main goal of this research was to analyze and more accurately model freight movement in Alabama. Ultimately, the goal of this project was to provide an overall approach to the integration of accurate freight models into transportation plans a...

  20. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  1. A geostatistics-informed hierarchical sensitivity analysis method for complex groundwater flow and transport modeling

    NASA Astrophysics Data System (ADS)

    Dai, Heng; Chen, Xingyuan; Ye, Ming; Song, Xuehang; Zachara, John M.

    2017-05-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based methods with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multilayer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially distributed input variables.
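The group-wise variance decomposition at the heart of this method can be sketched with a pick-freeze Sobol estimator on a toy surrogate. The linear model and the group labels ("boundary", "permeability") are stand-ins invented for illustration; the actual study evaluates an expensive flow and transport code on geostatistical realizations instead.

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Toy surrogate standing in for the flow/transport code; inputs in [0, 1].
    return 4 * x[:, 0] + 2 * x[:, 1] + x[:, 2]

def group_sobol(model, groups, dim, n=100_000):
    """First-order Sobol index per input *group* via the pick-freeze
    estimator: S_g = Cov(Y, Y_g) / Var(Y), where Y_g reuses the group's
    inputs and resamples everything else."""
    a = rng.random((n, dim))
    b = rng.random((n, dim))
    y = model(a)
    var = y.var()
    out = {}
    for name, idx in groups.items():
        c = b.copy()
        c[:, idx] = a[:, idx]          # freeze the group, resample the rest
        out[name] = np.cov(y, model(c))[0, 1] / var
    return out

s = group_sobol(model, {"boundary": [0], "permeability": [1, 2]}, dim=3)
# Analytic values for this toy model: 16/21 ~ 0.76 and 5/21 ~ 0.24
```

Grouping correlated or thematically related inputs keeps the number of indices (and model runs) manageable, which is the dimensionality reduction the abstract describes.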

  2. A Geostatistics-Informed Hierarchical Sensitivity Analysis Method for Complex Groundwater Flow and Transport Modeling

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2017-12-01

    Sensitivity analysis is an important tool for development and improvement of mathematical models, especially for complex systems with a high dimension of spatially correlated parameters. Variance-based global sensitivity analysis has gained popularity because it can quantify the relative contribution of uncertainty from different sources. However, its computational cost increases dramatically with the complexity of the considered model and the dimension of model parameters. In this study, we developed a new sensitivity analysis method that integrates the concept of variance-based methods with a hierarchical uncertainty quantification framework. Different uncertain inputs are grouped and organized into a multi-layer framework based on their characteristics and dependency relationships to reduce the dimensionality of the sensitivity analysis. A set of new sensitivity indices are defined for the grouped inputs using the variance decomposition method. Using this methodology, we identified the most important uncertainty source for a dynamic groundwater flow and solute transport model at the Department of Energy (DOE) Hanford site. The results indicate that boundary conditions and permeability field contribute the most uncertainty to the simulated head field and tracer plume, respectively. The relative contribution from each source varied spatially and temporally. By using a geostatistical approach to reduce the number of realizations needed for the sensitivity analysis, the computational cost of implementing the developed method was reduced to a practically manageable level. The developed sensitivity analysis method is generally applicable to a wide range of hydrologic and environmental problems that deal with high-dimensional spatially-distributed input variables.

  3. Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling

    DOE PAGES

    Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; ...

    2014-10-12

    The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with information from the open literature. The study leads to an initial quantitative assessment of the uncertainty in fission gas behavior modeling with the parameter characterization presently available, and the relative importance of the individual parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.

  4. Wave-optics uncertainty propagation and regression-based bias model in GNSS radio occultation bending angle retrievals

    NASA Astrophysics Data System (ADS)

    Gorbunov, Michael E.; Kirchengast, Gottfried

    2018-01-01

    A new reference occultation processing system (rOPS) will include a Global Navigation Satellite System (GNSS) radio occultation (RO) retrieval chain with integrated uncertainty propagation. In this paper, we focus on wave-optics bending angle (BA) retrieval in the lower troposphere and introduce (1) an empirically estimated boundary layer bias (BLB) model, which is then employed to reduce the systematic uncertainty of excess phases and bending angles in about the lowest 2 km of the troposphere, and (2) the estimation of (residual) systematic uncertainties and their propagation, together with random uncertainties, from excess phase to bending angle profiles. Our BLB model describes the estimated bias of the excess phase transferred from the estimated bias of the bending angle, for which the model is built, informed by analyzing refractivity fluctuation statistics shown to induce such biases. The model is derived from regression analysis using a large ensemble of Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC) RO observations and concurrent European Centre for Medium-Range Weather Forecasts (ECMWF) analysis fields. It is formulated in terms of predictors and adaptive functions (powers and cross products of predictors), where we use six main predictors derived from observations: impact altitude, latitude, bending angle and its standard deviation, canonical transform (CT) amplitude, and its fluctuation index. Based on an ensemble of test days, independent of the days of data used for the regression analysis to establish the BLB model, we find the model very effective for bias reduction, capable of reducing bending angle and corresponding refractivity biases by about a factor of 5. The estimated residual systematic uncertainty, after the BLB profile subtraction, is lower bounded by the uncertainty from the (indirect) use of ECMWF analysis fields but is significantly lower than the systematic uncertainty without BLB correction.
The systematic and random uncertainties are propagated from excess phase to bending angle profiles, using a perturbation approach and the wave-optical method recently introduced by Gorbunov and Kirchengast (2015), starting with estimated excess phase uncertainties. The results are encouraging and this uncertainty propagation approach combined with BLB correction enables a robust reduction and quantification of the uncertainties of excess phases and bending angles in the lower troposphere.
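The regression setup described above, adaptive functions built from powers and cross products of predictors, can be sketched with ordinary least squares on synthetic data. Everything here (the six stand-in predictors, the assumed "true" bias surface, the noise level) is invented for illustration; only the design-matrix construction mirrors the description in the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins for the six BLB predictors (impact altitude, latitude,
# bending angle, its std. dev., CT amplitude, fluctuation index), scaled to [0, 1].
n = 5000
p = rng.random((n, 6))

# Hypothetical "true" bias depending nonlinearly on the predictors.
bias = 0.8 * p[:, 0] - 0.5 * p[:, 2] * p[:, 4] + 0.3 * p[:, 5] ** 2
obs = bias + rng.normal(0, 0.05, n)        # noisy bias estimates

# Design matrix: intercept, predictors, their squares, pairwise products.
cols = [np.ones(n)] + [p[:, i] for i in range(6)]
cols += [p[:, i] ** 2 for i in range(6)]
cols += [p[:, i] * p[:, j] for i in range(6) for j in range(i + 1, 6)]
X = np.column_stack(cols)

beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
residual = obs - X @ beta

reduction = obs.std() / residual.std()      # bias-reduction factor
```

Once fitted, subtracting `X @ beta` plays the role of the BLB profile subtraction: the remaining scatter is the (residual) systematic plus random part.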

  5. Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry

    NASA Technical Reports Server (NTRS)

    Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak

    2011-01-01

    This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are defined, and the strong influence of coupled ablation and radiation on their aerothermodynamic environments is shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and -24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat-of-formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes the C3 Swings cross-section, the photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and -52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s-era shock-tube and constricted-arc experimental cases. It is shown that the experiments contain shock-layer temperatures and radiative flux values relevant to the Mars-return cases of present interest.
    Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is defined, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T > 13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength-dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength-dependent and wavelength-integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations. Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that the existing data are not sufficient for the present uncertainty analysis.
Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
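The quoted totals (+81.3%/-52.3%) appear to follow from adding the structural (+34%/-24%) and parametric (+47.3%/-28.3%) bounds linearly, i.e., one-sided interval addition rather than a root-sum-square combination. A quick check of that arithmetic:

```python
# One-sided percentage bounds (upper, lower) quoted in the abstract.
structural = (+34.0, -24.0)
parametric = (+47.3, -28.3)

# Interval (linear) combination of the two components.
total = (structural[0] + parametric[0], structural[1] + parametric[1])
# Reproduces the quoted total of +81.3% / -52.3%.
```

A root-sum-square combination would instead give roughly +58%/-37%, so the linear sum is the conservative choice consistent with the interval-uncertainty assumption stated above.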

  6. Integrated flight/propulsion control - Subsystem specifications for performance

    NASA Technical Reports Server (NTRS)

    Neighbors, W. K.; Rock, Stephen M.

    1993-01-01

    A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.

  7. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high-fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratory on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  8. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food.

    PubMed

    Jacobs, Rianne; van der Voet, Hilko; Ter Braak, Cajo J F

    Insight into the risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems facing the risk assessment of nanoparticles is a lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5-200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Given the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
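The separation of variability (differences between individuals) from uncertainty (limited data) in integrated probabilistic risk assessment is typically realized as a two-dimensional Monte Carlo with bootstrap resampling in the outer loop. The sketch below uses an invented lognormal intake dataset and reports an uncertainty band on a variability statistic; it is a schematic of the method, not the study's actual exposure model.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented intake observations (e.g., mg nanosilica per kg bw per day).
intake_data = rng.lognormal(mean=-1.0, sigma=0.8, size=200)

n_unc, n_var = 500, 1000
p99 = np.empty(n_unc)
for i in range(n_unc):
    # Outer loop = uncertainty: bootstrap the limited dataset.
    boot = rng.choice(intake_data, size=intake_data.size, replace=True)
    # Inner loop = variability: intakes of individuals in the population,
    # here simulated by resampling the bootstrap sample.
    population = rng.choice(boot, size=n_var, replace=True)
    # Variability statistic of interest: the high-intake (99th pct) individual.
    p99[i] = np.quantile(population, 0.99)

# Uncertainty band (due to limited data) around the variability statistic.
lo, hi = np.quantile(p99, [0.025, 0.975])
```

Reporting the `lo`-`hi` band rather than a single percentile is what makes the data-driven uncertainty visible without inflating the variability estimate itself.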

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Hirt, Evelyn H.; Veeramany, Arun

    This research report summarizes the development and evaluation of a prototypic enhanced risk monitor (ERM) methodology (framework) that includes alternative risk metrics and uncertainty analysis. This updated ERM methodology accounts for uncertainty in the equipment condition assessment (ECA), the prognostic result, and the probabilistic risk assessment (PRA) model. It is anticipated that the ability to characterize uncertainty in the estimated risk and to update the risk estimates in real time based on ECA will provide a mechanism for optimizing plant performance while staying within specified safety margins. These results (based on impacting active component O&M using real-time equipment condition information) are a step towards ERMs that, if integrated with advanced reactor supervisory plant control systems, can help control O&M costs and improve the affordability of advanced reactors.

  10. Comparison of Aircraft Models and Integration Schemes for Interval Management in the TRACON

    NASA Technical Reports Server (NTRS)

    Neogi, Natasha; Hagen, George E.; Herencia-Zapana, Heber

    2012-01-01

    Reusable models of common elements for communication, computation, decision, and control in air traffic management are necessary in order to enable simulation, analysis, and assurance of emergent properties, such as safety and stability, for a given operational concept. Uncertainties due to faults, such as dropped messages, along with nonlinearities and sensor noise, are an integral part of these models and impact emergent system behavior. Flight control algorithms designed using a linearized version of the flight mechanics will exhibit error due to model uncertainty and may not be stable outside a neighborhood of the given point of linearization. Moreover, the communication mechanism by which the sensed state of an aircraft is fed back to a flight control system (such as an ADS-B message) impacts the overall system behavior, both through sensor noise and through dropped messages (vacant samples). Additionally, simulation of the flight control system can exhibit further numerical instability due to the selection of the integration scheme and the approximations made in the flight dynamics. We examine the theoretical and numerical stability of a speed controller under the Euler and Runge-Kutta schemes of integration, for the Maintain phase of a Mid-Term (2035-2045) Interval Management (IM) Operational Concept for descent and landing operations. We model uncertainties in communication due to missed ADS-B messages by vacant samples in the integration schemes, and compare the emergent behavior of the system, in terms of stability, via the boundedness of the final system state. Any bound on the errors incurred by these uncertainties will play an essential part in a composable assurance argument required for real-time, flight-deck guidance and control systems.
    Thus, we believe that the creation of reusable models that possess property guarantees, such as safety and stability, is an innovative and essential requirement for assessing the emergent properties of novel airspace concepts of operation.
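A minimal version of the experiment described above can be simulated directly, assuming a first-order speed-hold loop and a Bernoulli message-drop model, neither of which is specified in the abstract. Dropped samples hold the last received measurement (the "vacant sample" effect), and the same loop is stepped with explicit Euler and with classical RK4.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(method, p_drop=0.3, dt=0.1, t_end=30.0,
             tau=2.0, k=1.5, v_ref=70.0):
    """Speed-hold loop v' = (u - v)/tau with proportional control acting on
    the last *received* measurement; dropped samples hold the old value."""
    f = lambda v, u: (u - v) / tau
    v, v_meas = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        if rng.random() > p_drop:               # message received this step
            v_meas = v
        u = v_ref + k * (v_ref - v_meas)        # proportional speed command
        if method == "euler":
            v += dt * f(v, u)
        else:                                    # classical RK4, u held constant
            k1 = f(v, u)
            k2 = f(v + dt / 2 * k1, u)
            k3 = f(v + dt / 2 * k2, u)
            k4 = f(v + dt * k3, u)
            v += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return v

v_euler = simulate("euler")
v_rk4 = simulate("rk4")
```

For this step size both schemes keep the state bounded near the 70-unit reference; the interesting regime is larger `dt` or `p_drop`, where the boundedness of the final state becomes exactly the stability metric described above.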

  11. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that the model averaging methods quantify both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.

  12. A multiple hypotheses uncertainty analysis in hydrological modelling: about model structure, landscape parameterization, and numerical integration

    NASA Astrophysics Data System (ADS)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2016-04-01

    To date, a large number of competing computer models has been developed to understand hydrological processes and to simulate and predict the streamflow dynamics of rivers. This is primarily the result of the lack of a unified theory in catchment hydrology, due to insufficient process understanding and uncertainties related to model development and application. The goal of this study is therefore to analyze the uncertainty structure of a process-based hydrological catchment model employing a multiple hypotheses approach. The study focuses on three major problems that have received only little attention in previous investigations: first, estimating the impact of model structural uncertainty by employing several alternative representations for each simulated process; second, exploring the influence of landscape discretization and parameterization from multiple datasets and user decisions; and third, employing several numerical solvers for the integration of the governing ordinary differential equations to study the effect on simulation results. The generated ensemble of model hypotheses is then analyzed and the three sources of uncertainty are compared against each other. To ensure consistency and comparability, all model structures and numerical solvers are implemented within a single simulation environment. First results suggest that the selection of a sophisticated numerical solver for the differential equations positively affects simulation outcomes. However, some simple and easy-to-implement explicit methods already perform surprisingly well and need less computational effort than more advanced but time-consuming implicit techniques. There is general evidence that ambiguous and subjective user decisions form a major source of uncertainty and can greatly influence model development and application at all stages.
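The explicit-versus-implicit trade-off noted in the results can be reproduced on a one-equation stiff test problem (invented here for illustration): explicit Euler is cheap per step but blows up once the step size exceeds its stability limit, while implicit (backward) Euler stays stable at the same step size.

```python
import math

# Stiff test problem y' = -lam*(y - cos(t)) - sin(t); solution tends to cos(t).
lam = 50.0
g = math.cos
gp = lambda t: -math.sin(t)

def explicit_euler(y, dt, n):
    t = 0.0
    for _ in range(n):
        y += dt * (-lam * (y - g(t)) + gp(t))
        t += dt
    return y

def implicit_euler(y, dt, n):
    # For this linear ODE the backward-Euler update solves in closed form:
    # y_{n+1} = (y_n + dt*(lam*g(t_{n+1}) + g'(t_{n+1}))) / (1 + dt*lam)
    t = 0.0
    for _ in range(n):
        t += dt
        y = (y + dt * (lam * g(t) + gp(t))) / (1 + dt * lam)
    return y

dt, T = 0.05, 2.0
n = int(T / dt)
ye = explicit_euler(0.0, dt, n)   # dt*lam = 2.5 > 2: explicit Euler diverges
yi = implicit_euler(0.0, dt, n)   # unconditionally stable, lands near cos(T)
```

In a full hydrological model the implicit update requires a nonlinear solve per step, which is exactly the extra cost the abstract weighs against the robustness gain of implicit techniques.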

  13. Robust control of nonlinear MAGLEV suspension system with mismatched uncertainties via DOBC approach.

    PubMed

    Yang, Jun; Zolotas, Argyrios; Chen, Wen-Hua; Michail, Konstantinos; Li, Shihua

    2011-07-01

    Robust control of a class of uncertain systems that have disturbances and uncertainties not satisfying "matching" condition is investigated in this paper via a disturbance observer based control (DOBC) approach. In the context of this paper, "matched" disturbances/uncertainties stand for the disturbances/uncertainties entering the system through the same channels as control inputs. By properly designing a disturbance compensation gain, a novel composite controller is proposed to counteract the "mismatched" lumped disturbances from the output channels. The proposed method significantly extends the applicability of the DOBC methods. Rigorous stability analysis of the closed-loop system with the proposed method is established under mild assumptions. The proposed method is applied to a nonlinear MAGnetic LEViation (MAGLEV) suspension system. Simulation shows that compared to the widely used integral control method, the proposed method provides significantly improved disturbance rejection and robustness against load variation. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.

  14. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    NASA Astrophysics Data System (ADS)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, a key scenario for vehicle component design. Automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, thereby facilitating automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system-level failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy finite element (FE) analysis applied for uncertainty treatment in crash simulations. This process is the counterpart of reliability-based design optimisation, which is used in a probabilistic context with statistically defined parameters (variabilities).

  15. An integrated bi-level optimization model for air quality management of Beijing's energy system under uncertainty.

    PubMed

    Jin, S W; Li, Y P; Nie, S

    2018-05-15

    In this study, an interval chance-constrained bi-level programming (ICBP) method is developed for air quality management of a municipal energy system under uncertainty. ICBP can deal with uncertainties presented as interval values and probability distributions, as well as examine the risk of violating constraints. In addition, a leader-follower decision strategy is incorporated into the optimization process, where two decision makers with different goals and preferences are involved. To solve the proposed model, a bi-level interactive algorithm based on satisfactory degree is introduced into the decision-making processes. Then, an ICBP-based energy and environmental systems (ICBP-EES) model is formulated for Beijing, in which the air quality index (AQI) is used for evaluating the integrated air quality of multiple pollutants. Result analysis can help different stakeholders adjust their tolerances to achieve overall satisfaction of EES planning for the study city. Results reveal that natural gas is the main source for electricity generation and heating, which could lead to a potential increase in imported energy for Beijing in the future. Results also disclose that PM10 is the major contributor to AQI. These findings can help decision makers identify desired alternatives for EES planning and provide useful information for regional air quality management under uncertainty. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Regional crop yield forecasting: a probabilistic approach

    NASA Astrophysics Data System (ADS)

    de Wit, A.; van Diepen, K.; Boogaard, H.

    2009-04-01

    Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
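
    The ensemble-based idea in this record, deriving uncertainty bounds from the dispersion of ensemble members, can be sketched with hypothetical yield values (t/ha) and a simple interpolated-percentile helper; the values and percentile choices below are illustrative, not the authors':

```python
def percentile(sorted_vals, q):
    """Linear-interpolation percentile of a pre-sorted list, q in [0, 100]."""
    idx = q / 100 * (len(sorted_vals) - 1)
    lo = int(idx)
    hi = min(lo + 1, len(sorted_vals) - 1)
    frac = idx - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

# hypothetical ensemble of regional yield simulations, t/ha
ensemble = sorted([6.1, 5.8, 6.4, 5.5, 6.0, 6.2, 5.9, 6.3, 5.7, 6.6])
median = percentile(ensemble, 50)
lower, upper = percentile(ensemble, 10), percentile(ensemble, 90)
```

    The (lower, upper) pair is the kind of uncertainty bound a probabilistic forecasting system can report alongside the deterministic (median) forecast.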

  17. A typology of uncertainty derived from an analysis of critical incidents in medical residents: A mixed methods study.

    PubMed

    Hamui-Sutton, Alicia; Vives-Varela, Tania; Gutiérrez-Barreto, Samuel; Leenen, Iwin; Sánchez-Mendiola, Melchor

    2015-11-04

    Medical uncertainty is inherently related to the practice of the physician and generally affects his or her patient care, job satisfaction, continuing education, as well as the overall goals of the health care system. In this paper, some new types of uncertainty, which extend existing typologies, are identified and the contexts and strategies to deal with them are studied. We carried out a mixed-methods study, consisting of a qualitative and a quantitative phase. For the qualitative study, 128 residents reported critical incidents in their clinical practice and described how they coped with the uncertainty in the situation. Each critical incident was analyzed and the most salient situations, 45 in total, were retained. In the quantitative phase, a distinct group of 120 medical residents indicated for each of these situations whether they have been involved in the described situations and, if so, which coping strategy they applied. The analysis examines the relation between characteristics of the situation and the coping strategies. From the qualitative study, a new typology of uncertainty was derived which distinguishes between technical, conceptual, communicational, systemic, and ethical uncertainty. The quantitative analysis showed that, independently of the type of uncertainty, critical incidents are most frequently resolved by consulting senior physicians (49 % overall), which underscores the importance of the hierarchical relationships in the hospital. The insights gained by this study are combined into an integrative model of uncertainty in medical residencies, which combines the type and perceived level of uncertainty, the strategies employed to deal with it, and context elements such as the actors present in the situation. The model considers the final resolution at each of three levels: the patient, the health system, and the physician's personal level. 
This study gives insight into how medical residents make decisions under different types of uncertainty, taking account of the context in which the interactions take place and of the strategies used to resolve the incidents. These insights may guide the development of organizational policies that reduce uncertainty and stress in residents during their clinical training.

  18. Model reference tracking control of an aircraft: a robust adaptive approach

    NASA Astrophysics Data System (ADS)

    Tanyer, Ilker; Tatlicioglu, Enver; Zergeroglu, Erkan

    2017-05-01

    This work presents the design and the corresponding analysis of a nonlinear robust adaptive controller for model reference tracking of an aircraft that has parametric uncertainties in its system matrices and additive state- and/or time-dependent nonlinear disturbance-like terms in its dynamics. Specifically, a robust integral of the sign of the error feedback term and an adaptive term are fused with a proportional-integral controller. Lyapunov-based stability analysis techniques are utilised to prove global asymptotic convergence of the output tracking error. Extensive numerical simulations are presented to illustrate the performance of the proposed robust adaptive controller.

  19. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2016-02-01

    The effects of soil property uncertainties on permafrost thaw projections are studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The null-space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of predictive uncertainty (due to soil property (parametric) uncertainty) and of the inter-annual climate variability due to year-to-year differences in CESM climate forcings. After calibrating to measured borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant predictive uncertainties in projected active layer thickness (ALT) and annual thaw depth-duration even with a specified future climate. Inter-annual climate variability in projected soil moisture content and Stefan number is small. A volume- and time-integrated Stefan number decreases significantly, indicating a shift in subsurface energy utilization in the future climate (latent heat of phase change becomes more important than heat conduction). Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. 
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we quantify the relative magnitude of soil property uncertainty to another source of permafrost uncertainty, structural climate model uncertainty. We show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  20. Wildfire Decision Making Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Thompson, M.

    2013-12-01

    Decisions relating to wildfire management are subject to multiple sources of uncertainty, and are made by a broad range of individuals, across a multitude of environmental and socioeconomic contexts. In this presentation I will review progress towards identification and characterization of uncertainties and how this information can support wildfire decision-making. First, I will review a typology of uncertainties common to wildfire management, highlighting some of the more salient sources of uncertainty and how they present challenges to assessing wildfire risk. This discussion will cover the expanding role of burn probability modeling, approaches for characterizing fire effects, and the role of multi-criteria decision analysis, and will provide illustrative examples of integrated wildfire risk assessment across a variety of planning scales. Second, I will describe a related uncertainty typology that focuses on the human dimensions of wildfire management, specifically addressing how social, psychological, and institutional factors may impair cost-effective risk mitigation. This discussion will encompass decision processes before, during, and after fire events, with a specific focus on active management of complex wildfire incidents. An improved ability to characterize uncertainties faced in wildfire management could lead to improved delivery of decision support, targeted communication strategies, and ultimately to improved wildfire management outcomes.

  1. Some Good Practices for Integration and Outreach and their Implementation in the Community Integrated Assessment System (CIAS) and its associated web portal CLIMASCOPE

    NASA Astrophysics Data System (ADS)

    Warren, R. F.; Price, J. T.; Goswami, S.

    2010-12-01

    Successful communication of knowledge to climate change policy makers requires the careful integration of scientific knowledge in an integrated assessment that can be clearly communicated to stakeholders, and which encapsulates the uncertainties in the analysis and conveys the need for a risk assessment approach. It is important that (i) the system is co-designed with the users, (ii) relevant disciplines are included, (iii) the assumptions made are clear, (iv) the robustness of outputs to uncertainties is demonstrated, (v) the system is flexible so that it can keep up with changing stakeholder needs, and (vi) the results are communicated clearly and are readily accessible. The “Community Integrated Assessment System” (CIAS) is a unique multi-institutional, modular, and flexible integrated assessment system for modelling climate change which fulfils the above six criteria. It differs from other integrated models in being a flexible system allowing various combinations of component modules to be connected together into alternative integrated assessment models. These modules may be written at different institutions in different computer languages and/or based on different operating systems. Scientists are able to determine which particular CIAS coupled model they wish to use through a web portal. This includes the facility to implement Latin hypercube experimental design, facilitating formal uncertainty analysis. Further exploration of robustness is possible through the ability to select, for example, alternative hydrological or climate models to address the same questions. The system has been applied to study future scenarios of climate change mitigation, for example through the AVOIDing dangerous climate change project for DEFRA, in which the avoided impacts (benefits) of alternative climate policies were compared to no-policy baselines. 
    These studies highlight the potential for mitigation to remove a substantial fraction of the climate change impacts that would otherwise occur, but also show that it is not possible to avoid all the impacts, and hence that adaptation will still be required. This has been shown, for example, for projections of future European drought. CIAS has also been used for analyses in the IPCC 4AR and the Stern review. Recent applications include a study of the role of avoided deforestation in climate mitigation, and a study of the impacts of climate change on biodiversity. A second web portal, CLIMASCOPE, is being developed for use by stakeholders, currently focusing on the needs of adaptation planners. This will benefit communication by allowing a wide range of users free access to regional climate change projections in a simple manner, yet one which encourages risk assessment through encapsulation of the uncertainties in climate change projection. Examples of CLIMASCOPE output that are being made available to stakeholders will be shown.
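
    Latin hypercube experimental design, the facility this record mentions for formal uncertainty analysis, stratifies each parameter's range into n equal intervals and samples each interval exactly once. A minimal pure-Python sketch on the unit hypercube (sample counts and seed are illustrative):

```python
import random

def latin_hypercube(n_samples, n_dims, seed=42):
    """Return n_samples points in [0,1)^n_dims; for every dimension,
    each of the n_samples equal-width strata contains exactly one point."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one random point inside each stratum, then shuffled to decorrelate dims
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))

points = latin_hypercube(10, 3)
```

    Compared with plain random sampling, this guarantees coverage of each parameter's full range even with a small ensemble, which is why it is popular for expensive coupled-model runs.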

  2. IMPLICATIONS OF USING ROBUST BAYESIAN ANALYSIS TO REPRESENT DIVERSE SOURCES OF UNCERTAINTY IN INTEGRATED ASSESSMENT

    EPA Science Inventory

    In our previous research, we showed that robust Bayesian methods can be used in environmental modeling to define a set of probability distributions for key parameters that captures the effects of expert disagreement, ambiguity, or ignorance. This entire set can then be update...

  3. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    NASA Astrophysics Data System (ADS)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of the existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading, which causes them to deform. Uncertainty associated with the deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD enables one to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model, which accounts for dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. 
Real gas effects play a minor role in the flight conditions considered. These studies demonstrate the advantages of accounting for uncertainty at an early stage of the analysis. They emphasize the important relation between heat flux modeling, thermal stresses and stability margins of hypersonic vehicles.

  4. Distribution of uncertainties at the municipality level for flood risk modelling along the river Meuse: implications for policy-making

    NASA Astrophysics Data System (ADS)

    Pirotton, Michel; Stilmant, Frédéric; Erpicum, Sébastien; Dewals, Benjamin; Archambeau, Pierre

    2016-04-01

    Flood risk modelling has been conducted for the whole course of the river Meuse in Belgium. Major cities, such as Liege (200,000 inh.) and Namur (110,000 inh.), are located in the floodplains of the river Meuse. Particular attention has been paid to uncertainty analysis and its implications for decision-making. The modelling chain contains flood frequency analysis, detailed 2D hydraulic computations, damage modelling and risk calculation. The relative importance of each source of uncertainty to the overall uncertainty of the results has been estimated by considering several alternative options for each step of the analysis: different distributions were considered in the flood frequency analysis; the influence of modelling assumptions and boundary conditions (e.g., steady vs. unsteady) was taken into account for the hydraulic computation; two different land-use classifications and two sets of damage functions were used; and the number of exceedance probabilities involved in the risk calculation (by integration of the risk curves) was varied. In addition, the sensitivity of the results with respect to increases in flood discharges was assessed. The considered increases are consistent with a "wet" climate change scenario for the time horizons 2021-2050 and 2071-2100 (Detrembleur et al., 2015). The results of the hazard computation differ significantly between the upper and lower parts of the course of the river Meuse in Belgium. In the former, inundation extents grow gradually as the considered flood discharge is increased (i.e. the exceedance probability is reduced), while in the downstream part, protection structures (mainly concrete walls) prevent inundation for flood discharges corresponding to exceedance probabilities of 0.01 and above (in the present climate). For higher discharges, large inundation extents are obtained in the floodplains. 
The highest values of risk (mean annual damage) are obtained in the municipalities which undergo relatively frequent flooding (upper part of the river), as well as in those of the downstream part of the Meuse in which flow depths in the urbanized floodplains are particularly high when inundation occurs. This is the case of the city of Liege, as a result of a subsidence process following former mining activities. For a given climate scenario, the uncertainty ranges affecting flood risk estimates are significant; but not so much that the results for the different municipalities would overlap substantially. Therefore, these uncertainties do not hamper prioritization in terms of allocation of risk reduction measures at the municipality level. In the present climate, the uncertainties arising from flood frequency analysis have a negligible influence in the upper part of the river, while they have a considerable impact on risk modelling in the lower part, where a threshold effect was observed due to the flood protection structures (sudden transition from no inundation to massive flooding when a threshold discharge is exceeded). Varying the number of exceedance probabilities in the integration of the risk curve has different effects for different municipalities; but it does not change the ranking of the municipalities in terms of flood risk. For the other scenarios, damage estimation contributes most to the overall uncertainties. As shown by this study, the magnitude of the uncertainty and its main origin vary in space and in time. This emphasizes the paramount importance of conducting distributed uncertainty analyses. In the considered study area, prioritization of risk reduction means can be reliably performed despite the modelling uncertainties. Reference Detrembleur, S., Stilmant, F., Dewals, B., Erpicum, S., Archambeau, P., & Pirotton, M. (2015). Impacts of climate change on future flood damage on the river Meuse, with a distributed uncertainty analysis. 
Natural Hazards, 77(3), 1533-1549. Acknowledgement Part of this research was funded through the ARC grant for Concerted Research Actions, financed by the Wallonia-Brussels Federation. It was also supported by the NWE Interreg IVB Program.
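
    The risk calculation this record varies, integrating a risk curve over exceedance probabilities to obtain mean annual damage, can be sketched with a trapezoidal rule over a few hypothetical (probability, damage) points; the numbers below are purely illustrative, not the study's results:

```python
def mean_annual_damage(exceedance_probs, damages):
    """Trapezoidal integration of the damage-vs-exceedance-probability
    risk curve; the integral is the expected (mean) annual damage."""
    pairs = sorted(zip(exceedance_probs, damages))  # ascending probability
    total = 0.0
    for (p0, d0), (p1, d1) in zip(pairs, pairs[1:]):
        total += 0.5 * (d0 + d1) * (p1 - p0)
    return total

# hypothetical curve: rarer floods (lower probability) cause more damage
probs = [0.001, 0.01, 0.02, 0.1]
dams = [400.0, 150.0, 80.0, 5.0]   # damage in arbitrary monetary units
ead = mean_annual_damage(probs, dams)
```

    Adding or removing points on the curve changes the trapezoidal estimate, which is exactly why the study tests how the number of exceedance probabilities affects the municipality ranking.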

  5. An overview of methods to identify and manage uncertainty for modelling problems in the water-environment-agriculture cross-sector

    DOE PAGES

    Jakeman, Anthony J.; Jakeman, John Davis

    2018-03-14

    Uncertainty pervades the representation of systems in the water–environment–agriculture cross-sector. Successful methods to address uncertainties have largely focused on standard mathematical formulations of biophysical processes in a single sector, such as partial or ordinary differential equations. More attention to integrated models of such systems is warranted. Model components representing the different sectors of an integrated model can have less standard, and different, formulations to one another, as well as different levels of epistemic knowledge and data informativeness. Thus, uncertainty is not only pervasive but also crosses boundaries and propagates between system components. Uncertainty assessment (UA) cries out for more eclectic treatment in these circumstances, some of it being more qualitative and empirical. In this paper, we discuss the various sources of uncertainty in such a cross-sectoral setting and ways to assess and manage them. We outline a fast-growing set of methodologies, particularly in the computational mathematics literature on uncertainty quantification (UQ), that seem highly pertinent for uncertainty assessment. There appears to be considerable scope for advancing UA by integrating relevant UQ techniques into cross-sectoral problem applications. Of course, this will entail considerable collaboration between domain specialists, who often take first ownership of the problem, and computational methods experts.

  7. Modeling Nitrogen Dynamics in a Waste Stabilization Pond System Using Flexible Modeling Environment with MCMC.

    PubMed

    Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V; Petway, Joy R

    2017-07-12

    This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model for the WSP by using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and the concentrations of ON-N, NH₃-N and NO₃-N. Results indicate that the integrated FME-GLUE-based model, with good Nash-Sutcliffe coefficients (0.53-0.69) and correlation coefficients (0.76-0.83), successfully simulates the concentrations of ON-N, NH₃-N and NO₃-N. Moreover, the Arrhenius constant was the only parameter to which the simulated ON-N and NH₃-N were sensitive, whereas the ON-N and NO₃-N simulations were sensitive to the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C, as measured by global sensitivity analysis.
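
    The Nash-Sutcliffe coefficient quoted above (0.53-0.69) is a standard goodness-of-fit measure: 1 indicates a perfect match and 0 means the model is no better than the mean of the observations. A minimal sketch with hypothetical observed/simulated concentration values:

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe model efficiency: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# hypothetical observed vs simulated concentrations (mg/L)
obs = [2.0, 3.0, 4.0, 5.0]
sim = [2.2, 2.9, 4.1, 4.8]
nse = nash_sutcliffe(obs, sim)
```

    Negative values are possible and indicate that simply predicting the observed mean would outperform the model, so values in the 0.5-0.7 range, as reported here, represent a usable but imperfect fit.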

  8. Advanced Small Modular Reactor Economics Status Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    This report describes the data collection work performed for an advanced small modular reactor (AdvSMR) economics analysis activity at the Oak Ridge National Laboratory. The methodology development and analytical results are described in separate, stand-alone documents as listed in the references. The economics analysis effort for the AdvSMR program combines the technical and fuel cycle aspects of advanced (non-light water reactor [LWR]) reactors with the market and production aspects of SMRs. This requires the collection, analysis, and synthesis of multiple unrelated and potentially high-uncertainty data sets from a wide range of data sources. Further, the nature of both economic and nuclear technology analysis requires at least a minor attempt at prediction and prognostication, and the far-term horizon for deployment of advanced nuclear systems introduces more uncertainty. Energy market uncertainty, especially in the electricity market, is the result of the integration of commodity prices, demand fluctuation, and generation competition, as easily seen in deregulated markets. Depending on current or projected values for any of these factors, the economic attractiveness of any power plant construction project can change yearly or quarterly. For long-lead construction projects such as nuclear power plants, this uncertainty generates an implied and inherent risk for potential nuclear power plant owners and operators. The uncertainty in nuclear reactor and fuel cycle costs is in some respects better understood and quantified than the energy market uncertainty. The LWR-based fuel cycle has a long commercial history to use as its basis for cost estimation, and the current activities in LWR construction provide a reliable baseline for estimates for similar efforts. 
However, for advanced systems, the estimates and their associated uncertainties are based on forward-looking assumptions for performance after the system has been built and has achieved commercial operation. Advanced fuel materials and fabrication costs have large uncertainties based on complexities of operation, such as contact-handled fuel fabrication versus remote handling, or commodity availability. Thus, this analytical work makes a good-faith effort to quantify uncertainties and provide qualifiers, caveats, and explanations for the sources of these uncertainties. The overall result is that this work assembles the necessary information and establishes the foundation for future analyses using more precise data as nuclear technology advances.

  9. Production of biofuels and biochemicals: in need of an ORACLE.

    PubMed

    Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2010-08-01

    The engineering of cells for the production of fuels and chemicals involves simultaneous optimization of multiple objectives, such as specific productivity, extended substrate range and improved tolerance - all under a great degree of uncertainty. The achievement of these objectives under physiological and process constraints will be impossible without the use of mathematical modeling. However, the limited information and the uncertainty in the available information require new methods for modeling and simulation that will characterize the uncertainty and will quantify, in a statistical sense, the expectations of success of alternative metabolic engineering strategies. We discuss these considerations toward developing a framework for the Optimization and Risk Analysis of Complex Living Entities (ORACLE) - a computational method that integrates available information into a mathematical structure to calculate control coefficients. Copyright 2010 Elsevier Ltd. All rights reserved.

  10. Uncertainty Quantification of Turbulence Model Closure Coefficients for Transonic Wall-Bounded Flows

    NASA Technical Reports Server (NTRS)

    Schaefer, John; West, Thomas; Hosder, Serhat; Rumsey, Christopher; Carlson, Jan-Renee; Kleb, William

    2015-01-01

    The goal of this work was to quantify the uncertainty and sensitivity of commonly used turbulence models in Reynolds-Averaged Navier-Stokes codes due to uncertainty in the values of closure coefficients for transonic, wall-bounded flows, and to rank the contribution of each coefficient to the uncertainty in various output flow quantities of interest. Specifically, uncertainty quantification of turbulence model closure coefficients was performed for transonic flow over an axisymmetric bump at zero degrees angle of attack and the RAE 2822 transonic airfoil at a lift coefficient of 0.744. Three turbulence models were considered: the Spalart-Allmaras Model, the Wilcox (2006) k-ω Model, and the Menter Shear-Stress Transport Model. The FUN3D code developed by NASA Langley Research Center was used as the flow solver. The uncertainty quantification analysis employed stochastic expansions based on non-intrusive polynomial chaos as an efficient means of uncertainty propagation. Several integrated and point quantities are considered as uncertain outputs for both CFD problems. All closure coefficients were treated as epistemic uncertain variables represented with intervals. Sobol indices were used to rank the relative contributions of each closure coefficient to the total uncertainty in the output quantities of interest. This study identified a number of closure coefficients for each turbulence model for which more information will significantly reduce the amount of uncertainty in the output for transonic, wall-bounded flows.
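
    The Sobol indices used above to rank coefficient contributions can be illustrated with a pick-freeze (Saltelli-type) Monte Carlo estimator on a hypothetical toy model y = x1 + 2*x2 with independent uniform inputs (analytic first-order indices S1 = 0.2, S2 = 0.8). This is a plain-Monte-Carlo sketch, not the study's polynomial chaos machinery:

```python
import random

def sobol_first_order(model, n_dims, n_samples, rng):
    """First-order Sobol indices via the Saltelli pick-freeze estimator:
    V_i ~ mean( f(B) * (f(A_B^i) - f(A)) ), with A_B^i = A but column i from B."""
    a = [[rng.random() for _ in range(n_dims)] for _ in range(n_samples)]
    b = [[rng.random() for _ in range(n_dims)] for _ in range(n_samples)]
    f_a = [model(x) for x in a]
    f_b = [model(x) for x in b]
    mean = sum(f_a) / n_samples
    var = sum((y - mean) ** 2 for y in f_a) / n_samples
    indices = []
    for i in range(n_dims):
        f_ab = [model(ra[:i] + [rb[i]] + ra[i + 1:]) for ra, rb in zip(a, b)]
        v_i = sum(fb * (fab - fa)
                  for fb, fab, fa in zip(f_b, f_ab, f_a)) / n_samples
        indices.append(v_i / var)
    return indices

model = lambda x: x[0] + 2.0 * x[1]   # toy model; Var = 5/12, V1 = 1/12, V2 = 4/12
s1, s2 = sobol_first_order(model, 2, 50000, random.Random(0))
```

    The estimated indices recover the analytic ranking (x2 dominates), which is the same kind of conclusion the study draws about individual closure coefficients.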

  11. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  12. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
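    Step (4), the ensemble quantification, can be sketched as a weighted combination of hazard curves from alternative event-tree branches. The thresholds, curves and weights below are illustrative, not values from the Ionian Sea case study:

```python
import numpy as np

# each event-tree branch (an alternative epistemic model) yields one hazard
# curve: annual probability of exceeding each tsunami-intensity threshold
thresholds = np.array([0.5, 1.0, 2.0, 4.0])       # metres, illustrative
branch_curves = np.array([
    [1e-2, 4e-3, 8e-4, 1e-4],
    [2e-2, 6e-3, 1e-3, 2e-4],
    [8e-3, 3e-3, 5e-4, 5e-5],
])
branch_weights = np.array([0.5, 0.3, 0.2])         # must sum to 1

# ensemble modelling: the weighted mean tracks the central hazard estimate,
# while weighted percentiles express the epistemic spread
mean_curve = branch_weights @ branch_curves

def weighted_percentile(values, weights, q):
    # smallest branch value whose cumulative weight reaches q
    order = np.argsort(values)
    cum = np.cumsum(weights[order])
    idx = min(np.searchsorted(cum, q), len(values) - 1)
    return values[order][idx]

p84 = np.array([weighted_percentile(branch_curves[:, j], branch_weights, 0.84)
                for j in range(len(thresholds))])
print(mean_curve, p84)
```

    With many branches, the same weighted statistics simultaneously carry the aleatory content (each curve) and the epistemic content (the spread across curves).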

  13. Final report of coordination and cooperation with the European Union on embankment failure analysis

    USDA-ARS?s Scientific Manuscript database

    There has been an emphasis in the European Union (EU) community on the investigation of extreme flood processes and the uncertainties related to these processes. Over a 3-year period, the EU and the U.S. dam safety community (1) coordinated their efforts and collected information needed to integrate...

  14. The quandaries and promise of risk management: a scientist's perspective on integration of science and management.

    Treesearch

    B.G. Marcot

    2007-01-01

    This paper briefly lists constraints and problems of traditional approaches to natural resource risk analysis and risk management. Such problems include disparate definitions of risk, multiple and conflicting objectives and decisions, conflicting interpretations of uncertainty, and failure to articulate decision criteria, risk attitudes, modeling assumptions, and...

  15. Integrated mixed methods policy analysis for sustainable food systems: trends, challenges and future research.

    PubMed

    Cuevas, Soledad

    Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.

  16. Assessing climate change and socio-economic uncertainties in long term management of water resources

    NASA Astrophysics Data System (ADS)

    Jahanshahi, Golnaz; Dawson, Richard; Walsh, Claire; Birkinshaw, Stephen; Glenis, Vassilis

    2015-04-01

    Long term management of water resources is challenging for decision makers given the range of uncertainties that exist. Such uncertainties are a function of long term drivers of change, such as climate, environmental loadings, demography, land use and other socio-economic drivers. Impacts of climate change on the frequency of extreme events such as drought make it a serious threat to water resources and water security. The release of probabilistic climate information, such as the UKCP09 scenarios, provides improved understanding of some uncertainties in climate models. This has motivated a more rigorous approach to dealing with other uncertainties, in order to understand the sensitivity of investment decisions to future uncertainty and to identify adaptation options that are as robust as possible. We have developed and coupled a system of models that includes a weather generator, simulations of catchment hydrology, demand for water and the water resource system. This integrated model has been applied in the Thames catchment, which supplies the city of London, UK. This region is one of the driest in the UK and hence sensitive to water availability. In addition, it is one of the fastest growing parts of the UK and plays an important economic role. Key uncertainties in long term water resources in the Thames catchment, many of which result from earth system processes, are identified and quantified. The implications of these uncertainties are explored using a combination of uncertainty analysis and sensitivity testing. The analysis shows considerable uncertainty in future rainfall, river flow and consequently water resources. For example, results indicate that by the 2050s, low flow (Q95) in the Thames catchment will range from -44 to +9% compared with the control scenario (1970s). Consequently, by the 2050s the average number of drought days is expected to increase 4-6 times relative to the 1970s. Uncertainties associated with urban growth increase these risks further.
Adaptation measures, such as new reservoirs, can manage these risks to a certain extent, but our sensitivity testing demonstrates that they are less robust to future uncertainties than measures taken to reduce water demand. Keywords: Climate change, Uncertainty, Decision making, Drought, Risk, Water resources management.
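    The Q95 low-flow statistic quoted above is simply the flow exceeded 95% of the time. A minimal sketch, with synthetic flows standing in for the Thames simulations:

```python
import numpy as np

rng = np.random.default_rng(1)

def q95(flows):
    # Q95 low flow: the flow exceeded 95% of the time, i.e. the 5th percentile
    return np.percentile(flows, 5)

def drought_days(flows, threshold):
    # count days on which flow falls below a fixed drought threshold
    return int(np.sum(flows < threshold))

# illustrative daily flows (m3/s) for a control and a perturbed scenario
control = rng.gamma(shape=4.0, scale=20.0, size=3650)
scenario = 0.7 * control          # a stylised 30% flow reduction
change = 100 * (q95(scenario) - q95(control)) / q95(control)
print(round(change, 1))           # percent change in Q95 → -30.0
```

    Comparing such statistics across an ensemble of climate and demand scenarios yields the percentage ranges and drought-day multipliers reported in the abstract.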

  17. Wind Energy Management System Integration Project Incorporating Wind Generation and Load Forecast Uncertainties into Power Grid Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makarov, Yuri V.; Huang, Zhenyu; Etingov, Pavel V.

    2010-09-01

    The power system balancing process, which includes the scheduling, real time dispatch (load following) and regulation processes, is traditionally based on deterministic models. Since the conventional generation needs time to be committed and dispatched to a desired megawatt level, the scheduling and load following processes use load and wind power production forecasts to achieve future balance between the conventional generation and energy storage on the one side, and system load, intermittent resources (such as wind and solar generation) and scheduled interchange on the other side. Although in real life the forecasting procedures imply some uncertainty around the load and wind forecasts (caused by forecast errors), only their mean values are actually used in the generation dispatch and commitment procedures. Since the actual load and intermittent generation can deviate from their forecasts, it becomes increasingly unclear (especially with the increasing penetration of renewable resources) whether the system would actually be able to meet the conventional generation requirements within the look-ahead horizon, what additional balancing efforts would be needed closer to real time, and what additional costs would be incurred by those needs. In order to improve the system control performance characteristics, maintain system reliability, and minimize expenses related to the system balancing functions, it becomes necessary to incorporate the predicted uncertainty ranges into the scheduling, load following, and, to some extent, the regulation processes. It is also important to address the uncertainty problem comprehensively, by taking all sources of uncertainty (load, intermittent generation, generators’ forced outages, etc.) into consideration. All aspects of uncertainty, such as the imbalance size (which is the same as the capacity needed to mitigate the imbalance) and the generation ramping requirement, must be taken into account.
These features make this work a significant step forward toward the objective of incorporating wind, solar, load, and other uncertainties into power system operations. In this report, a new methodology to predict the uncertainty ranges for the required balancing capacity, ramping capability and ramp duration is presented. Uncertainties created by system load forecast errors, wind and solar forecast errors, and generation forced outages are taken into account. The uncertainty ranges are evaluated for different confidence levels of having the actual generation requirements within the corresponding limits. The methodology helps to identify system balancing reserve requirements based on desired system performance levels, identify system “breaking points”, where the generation system becomes unable to follow the generation requirement curve with the user-specified probability level, and determine the time remaining to these potential events. The approach includes three stages: statistical and actual data acquisition, statistical analysis of retrospective information, and prediction of future grid balancing requirements for specified time horizons and confidence intervals. Assessment of the capacity and ramping requirements is performed using a specially developed probabilistic algorithm based on a histogram analysis incorporating all sources of uncertainty and parameters of a continuous (wind forecast and load forecast errors) and discrete (forced generator outages and failures to start up) nature. Preliminary simulations using California Independent System Operator (California ISO) real-life data have shown the effectiveness of the proposed approach. A tool developed based on the new methodology described in this report will be integrated with the California ISO systems. Contractual work is currently in place to integrate the tool with the AREVA EMS system.
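    The histogram-based uncertainty evaluation can be sketched by Monte Carlo combination of continuous forecast errors with a discrete forced-outage term; the megawatt figures below are illustrative, not California ISO data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# continuous uncertainty sources (MW): forecast errors
load_err = rng.normal(0.0, 300.0, n)     # load forecast error
wind_err = rng.normal(0.0, 500.0, n)     # wind forecast error
# discrete source: forced outage of a 400 MW unit with 2% probability
outage = 400.0 * (rng.random(n) < 0.02)

imbalance = load_err + wind_err + outage  # capacity needed to cover it

def balancing_range(imbalance, confidence=0.95):
    # symmetric percentile band at the requested confidence level,
    # read off the empirical histogram of the combined imbalance
    tail = (1.0 - confidence) / 2.0
    return np.percentile(imbalance, [100 * tail, 100 * (1 - tail)])

lo, hi = balancing_range(imbalance)
print(lo, hi)
```

    Repeating this for each look-ahead step gives the time-varying uncertainty ranges for balancing capacity; ramping requirements follow by differencing consecutive steps.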

  18. Addressing uncertainty in modelling cumulative impacts within maritime spatial planning in the Adriatic and Ionian region.

    PubMed

    Gissi, Elena; Menegon, Stefano; Sarretta, Alessandro; Appiotti, Federica; Maragno, Denis; Vianello, Andrea; Depellegrin, Daniel; Venier, Chiara; Barbanti, Andrea

    2017-01-01

    Maritime spatial planning (MSP) is envisaged as a tool to apply an ecosystem-based approach to the marine and coastal realms, aiming at ensuring that the collective pressure of human activities is kept within acceptable limits. Cumulative impacts (CI) assessment can support science-based MSP, in order to understand the existing and potential impacts of human uses on the marine environment. A CI assessment includes several sources of uncertainty that can hinder the correct interpretation of its results if not explicitly incorporated in the decision-making process. This study proposes a three-level methodology to perform a general uncertainty analysis integrated with the CI assessment for MSP, applied to the Adriatic and Ionian Region (AIR). We describe the nature and level of uncertainty with the help of expert judgement and elicitation to include all of the possible sources of uncertainty related to the CI model with assumptions and gaps related to the case-based MSP process in the AIR. Next, we use the results to tailor the global uncertainty analysis to spatially describe the uncertainty distribution and variations of the CI scores dependent on the CI model factors. The results show the variability of the uncertainty in the AIR, with only limited portions robustly identified as the most or the least impacted areas under multiple model factors hypothesis. The results are discussed for the level and type of reliable information and insights they provide to decision-making. The most significant uncertainty factors are identified to facilitate the adaptive MSP process and to establish research priorities to fill knowledge gaps for subsequent planning cycles. The method aims to depict the potential CI effects, as well as the extent and spatial variation of the data and scientific uncertainty; therefore, this method constitutes a suitable tool to inform the potential establishment of the precautionary principle in MSP.

  19. Effects of Uncertainty in TRMM Precipitation Radar Path Integrated Attenuation on Interannual Variations of Tropical Oceanic Rainfall

    NASA Technical Reports Server (NTRS)

    Robertson, Franklin R.; Fitzjarrald, Dan E.; Kummerow, Christian D.; Arnold, James E. (Technical Monitor)

    2002-01-01

    Considerable uncertainty surrounds the issue of whether precipitation over the tropical oceans (30 deg N/S) systematically changes with interannual sea-surface temperature (SST) anomalies that accompany El Nino (warm) and La Nina (cold) events. Time series of rainfall estimates from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Radar (PR) over the tropical oceans show marked differences with estimates from two TRMM Microwave Imager (TMI) passive microwave algorithms. We show that path-integrated attenuation derived from the effects of precipitation on the radar return from the ocean surface exhibits interannual variability that agrees closely with the TMI time series. Further analysis of the frequency distribution of PR (2A25 product) rain rates suggests that the algorithm incorporates the attenuation measurement in a very conservative fashion so as to optimize the instantaneous rain rates. Such an optimization appears to come at the expense of monitoring interannual climate variability.

  20. Investigating Uncertainty and Sensitivity in Integrated, Multimedia Environmental Models: Tools for FRAMES-3MRA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babendreier, Justin E.; Castleton, Karl J.

    2005-08-01

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRA modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMUs).

  1. A novel sliding mode guidance law without line-of-sight angular rate information accounting for autopilot lag

    NASA Astrophysics Data System (ADS)

    He, Shaoming; Wang, Jiang; Wang, Wei

    2017-12-01

    This paper proposes a new composite guidance law to intercept manoeuvring targets without line-of-sight (LOS) angular rate information in the presence of autopilot lag. The presented formulation is obtained via a combination of homogeneous theory and sliding mode control approach. Different from some existing observers, the proposed homogeneous observer can estimate the lumped uncertainty and the LOS angular rate in an integrated manner. To reject the mismatched lumped uncertainty in the integrated guidance and autopilot system, a sliding surface, which consists of the system states and the estimated states, is proposed and a robust guidance law is then synthesised. Stability analysis shows that the LOS angular rate can be stabilised in a small region around zero asymptotically and the upper bound can be lowered by appropriate parameter choice. Numerical simulations with some comparisons are carried out to demonstrate the superiority of the proposed method.

  2. Precision measurement of the photon detection efficiency of silicon photomultipliers using two integrating spheres.

    PubMed

    Yang, Seul Ki; Lee, J; Kim, Sug-Whan; Lee, Hye-Young; Jeon, Jin-A; Park, I H; Yoon, Jae-Ryong; Baek, Yang-Sik

    2014-01-13

    We report a new and improved photon counting method for the precision photon detection efficiency (PDE) measurement of SiPM detectors, utilizing two integrating spheres connected serially and calibrated reference detectors. First, using a ray tracing simulation and irradiance measurement results with a reference photodiode, we investigated the irradiance characteristics of the measurement instrument and analyzed the dominant systematic uncertainties in PDE measurement. Two SiPM detectors were then used for PDE measurements at wavelengths between 368 and 850 nm and for bias voltages near 70 V. The resulting PDEs of the SiPMs show good agreement with those from other studies, yet with an improved accuracy of 1.57% (1σ). This was achieved by simultaneous measurement with the NIST-calibrated reference detectors, which suppressed the time-dependent variation of the source light. The technical details of the instrumentation, measurement results and uncertainty analysis are reported together with their implications.
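    At its core such a PDE measurement is a ratio of detected to incident photons, with the incident number inferred from a calibrated reference detector and independent uncertainty terms combined in quadrature. The numbers below are illustrative, not the paper's values:

```python
import math

H = 6.62607015e-34   # Planck constant (J s)
C = 2.99792458e8     # speed of light (m/s)

def incident_photons(power_w, wavelength_m, gate_s):
    # photon number inferred from the calibrated reference detector:
    # N = P * t / (h * c / lambda)
    return power_w * gate_s * wavelength_m / (H * C)

def pde(counts, dark_counts, power_w, wavelength_m, gate_s):
    # photon detection efficiency: dark-corrected detections over incident photons
    return (counts - dark_counts) / incident_photons(power_w, wavelength_m, gate_s)

def combined_rel_uncertainty(*rel_terms):
    # uncorrelated relative uncertainties add in quadrature (1-sigma)
    return math.sqrt(sum(u * u for u in rel_terms))

# illustrative numbers: 1 pW at 450 nm over a 1 ms gate
n_inc = incident_photons(1e-12, 450e-9, 1e-3)
eff = pde(900, 50, 1e-12, 450e-9, 1e-3)
u = combined_rel_uncertainty(0.010, 0.008, 0.007)  # source, reference, counting
print(round(eff, 3), round(u, 4))
```

    The simultaneous reference measurement in the paper suppresses the source-stability term, which is what drives the combined uncertainty down to the quoted 1.57%.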

  3. Bringing social standards into project evaluation under dynamic uncertainty.

    PubMed

    Knudsen, Odin K; Scandizzo, Pasquale L

    2005-04-01

    Society often sets social standards that define thresholds of damage to society or the environment above which compensation must be paid to the state or other parties. In this article, we analyze the interdependence between the use of social standards and investment evaluation under dynamic uncertainty, where a negative externality above a threshold established by society requires an assessment and payment of damages. Under uncertainty, the party considering implementing a project or new technology must assess not only when the project is economically efficient to implement but also when to abandon a project that could potentially exceed the social standard. Using real-option theory and simple models, we demonstrate how such a social standard can be integrated into cost-benefit analysis through the use of a development option and a liability option coupled with a damage function. Uncertainty, in fact, implies that both parties interpret the social standard as a target for safety rather than an inflexible barrier that cannot be overcome. The larger the uncertainty, the greater the tolerance for damages in excess of the social standard from both parties.

  4. Addressing global uncertainty and sensitivity in first-principles based microkinetic models by an adaptive sparse grid approach

    NASA Astrophysics Data System (ADS)

    Döpking, Sandra; Plaisance, Craig P.; Strobusch, Daniel; Reuter, Karsten; Scheurer, Christoph; Matera, Sebastian

    2018-01-01

    In the last decade, first-principles-based microkinetic modeling has been developed into an important tool for a mechanistic understanding of heterogeneous catalysis. A commonly known, but hitherto barely analyzed issue in this kind of modeling is the presence of sizable errors from the use of approximate Density Functional Theory (DFT). We here address the propagation of these errors to the catalytic turnover frequency (TOF) by global sensitivity and uncertainty analysis. Both analyses require the numerical quadrature of high-dimensional integrals. To achieve this efficiently, we utilize and extend an adaptive sparse grid approach and exploit the confinement of the strongly non-linear behavior of the TOF to local regions of the parameter space. We demonstrate the methodology on a model of the oxygen evolution reaction at the Co3O4 (110)-A surface, using a maximum entropy error model that imposes nothing but reasonable bounds on the errors. For this setting, the DFT errors lead to an absolute uncertainty of several orders of magnitude in the TOF. We nevertheless find that it is still possible to draw conclusions from such uncertain models about the atomistic aspects controlling the reactivity. A comparison with derivative-based local sensitivity analysis instead reveals that this more established approach provides incomplete information. Since the adaptive sparse grids allow for the evaluation of the integrals with only a modest number of function evaluations, this approach opens the way for a global sensitivity analysis of more complex models, for instance, models based on kinetic Monte Carlo simulations.

  5. Challenges in Incorporating Climate Change Adaptation into Integrated Water Resources Management

    NASA Astrophysics Data System (ADS)

    Kirshen, P. H.; Cardwell, H.; Kartez, J.; Merrill, S.

    2011-12-01

    Over the last few decades, integrated water resources management (IWRM), under various names, has become the accepted philosophy for water management in the USA. While much is still to be learned about how to actually carry it out, implementation is slowly moving forward - spurred by both legislation and the demands of stakeholders. New challenges to IWRM have arisen because of climate change. Climate change has placed increased demands on the creativities of planners and engineers because they now must design systems that will function over decades of hydrologic uncertainties that dwarf any previous hydrologic or other uncertainties. Climate and socio-economic monitoring systems must also now be established to determine when the future climate has changed sufficiently to warrant undertaking adaptation. The requirements for taking some actions now and preserving options for future actions as well as the increased risk of social inequities in climate change impacts and adaptation are challenging experts in stakeholder participation. To meet these challenges, an integrated methodology is essential that builds upon scenario analysis, risk assessment, statistical decision theory, participatory planning, and consensus building. This integration will create cross-disciplinary boundaries for these disciplines to overcome.

  6. Bayesian integration of position and orientation cues in perception of biological and non-biological forms.

    PubMed

    Thurman, Steven M; Lu, Hongjing

    2014-01-01

    Visual form analysis is fundamental to shape perception and likely plays a central role in perception of more complex dynamic shapes, such as moving objects or biological motion. Two primary form-based cues serve to represent the overall shape of an object: the spatial position and the orientation of locations along the boundary of the object. However, it is unclear how the visual system integrates these two sources of information in dynamic form analysis, and in particular how the brain resolves ambiguities due to sensory uncertainty and/or cue conflict. In the current study, we created animations of sparsely-sampled dynamic objects (human walkers or rotating squares) comprised of oriented Gabor patches in which orientation could either coincide or conflict with information provided by position cues. When the cues were incongruent, we found a characteristic trade-off between position and orientation information whereby position cues increasingly dominated perception as the relative uncertainty of orientation increased and vice versa. Furthermore, we found no evidence for differences in the visual processing of biological and non-biological objects, casting doubt on the claim that biological motion may be specialized in the human brain, at least in specific terms of form analysis. To explain these behavioral results quantitatively, we adopt a probabilistic template-matching model that uses Bayesian inference within local modules to estimate object shape separately from either spatial position or orientation signals. The outputs of the two modules are integrated with weights that reflect individual estimates of subjective cue reliability, and integrated over time to produce a decision about the perceived dynamics of the input data. Results of this model provided a close fit to the behavioral data, suggesting a mechanism in the human visual system that approximates rational Bayesian inference to integrate position and orientation signals in dynamic form analysis.
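    The reliability-weighted integration at the heart of the model is the standard inverse-variance rule for independent Gaussian cues, sketched here with hypothetical one-dimensional estimates:

```python
def fuse_cues(mu_pos, var_pos, mu_ori, var_ori):
    # reliability-weighted (inverse-variance) combination of two cues,
    # the Bayes-optimal rule for independent Gaussian estimates
    w_pos, w_ori = 1.0 / var_pos, 1.0 / var_ori
    mu = (w_pos * mu_pos + w_ori * mu_ori) / (w_pos + w_ori)
    var = 1.0 / (w_pos + w_ori)
    return mu, var

# as orientation uncertainty grows, the fused estimate shifts toward position
print(fuse_cues(0.0, 1.0, 1.0, 1.0))   # equal reliability: midpoint, (0.5, 0.5)
print(fuse_cues(0.0, 1.0, 1.0, 9.0))   # noisy orientation: near position cue
```

    This reproduces the trade-off reported in the abstract: whichever cue is relatively more certain dominates the fused percept, and the fused variance is always below either input variance.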

  7. Integrating info-gap decision theory with robust population management: a case study using the Mountain Plover.

    PubMed

    van der Burg, Max Post; Tyre, Andrew J

    2011-01-01

    Wildlife managers often make decisions under considerable uncertainty. In the most extreme case, a complete lack of data leads to uncertainty that is unquantifiable. Information-gap decision theory deals with assessing management decisions under extreme uncertainty, but it is not widely used in wildlife management. Similarly, robust population management methods were developed to deal with uncertainties in multiple model parameters. However, the two methods have not, as yet, been used in tandem to assess population management decisions. We provide a novel combination of the robust population management approach for matrix models with the information-gap decision theory framework for making conservation decisions under extreme uncertainty. We applied our model to the problem of nest survival management in an endangered bird species, the Mountain Plover (Charadrius montanus). Our results showed that matrix sensitivities suggest that nest management is unlikely to have a strong effect on population growth rate, confirming previous analyses. However, given the amount of uncertainty about adult and juvenile survival, our analysis suggested that maximizing nest marking effort was a more robust decision to maintain a stable population. Focusing on the twin concepts of opportunity and robustness in an information-gap model provides a useful method of assessing conservation decisions under extreme uncertainty.
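    The info-gap logic can be sketched with a deliberately simple scalar population model (growth = adult survival + juvenile survival × fecundity, a stand-in for the paper's matrix model): the robustness of a decision is the largest fractional error in the survival estimates under which the worst case still meets the growth requirement:

```python
def robustness(adult_s, juv_s, fecundity, requirement=1.0):
    # info-gap robustness: the largest fractional shortfall alpha in the
    # survival estimates for which worst-case growth, (1 - alpha) * nominal,
    # still meets the requirement; zero if even the nominal model fails
    nominal = adult_s + juv_s * fecundity   # scalar birth-pulse growth rate
    if nominal <= requirement:
        return 0.0
    return 1.0 - requirement / nominal

# nest marking (hypothetical effect sizes) raises realised fecundity
low_effort = robustness(0.7, 0.3, 1.2)
max_effort = robustness(0.7, 0.3, 1.6)
print(round(low_effort, 3), round(max_effort, 3))
```

    The decision with the larger robustness tolerates more estimation error before the population dips below replacement, which is the sense in which maximizing nest marking effort was judged more robust despite low matrix sensitivity.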

  8. New features and improved uncertainty analysis in the NEA nuclear data sensitivity tool (NDaST)

    NASA Astrophysics Data System (ADS)

    Dyrda, J.; Soppera, N.; Hill, I.; Bossant, M.; Gulliford, J.

    2017-09-01

    Following the release and initial testing period of the NEA's Nuclear Data Sensitivity Tool [1], new features have been designed and implemented in order to expand its uncertainty analysis capabilities. The aim is to provide a free online tool for integral benchmark testing that is both efficient and comprehensive, meeting the needs of the nuclear data and benchmark testing communities. New features include access to P1 sensitivities for the neutron scattering angular distribution [2] and constrained Chi sensitivities for the prompt fission neutron energy sampling. Both of these are compatible with covariance data accessed via the JANIS nuclear data software, enabling propagation of the resultant uncertainties in keff to a large series of integral experiment benchmarks. These capabilities are available using a number of different covariance libraries, e.g., ENDF/B, JEFF, JENDL and TENDL, allowing comparison of the broad range of results it is possible to obtain. The IRPhE database of reactor physics measurements is now also accessible within the tool, in addition to the criticality benchmarks from ICSBEP. Other improvements include the ability to determine and visualise the energy dependence of a given calculated result in order to better identify specific regions of importance or high uncertainty contribution. Sorting and statistical analysis of the selected benchmark suite are now also provided. Examples of the plots generated by the software are included to illustrate such capabilities. Finally, a number of analytical expressions, for example Maxwellian and Watt fission spectra, will be included. This will allow the analyst to determine the impact of varying such distributions within the data evaluation, either through adjustment of parameters within the expressions, or by comparison to a more general probability distribution fitted to measured data.
The impact of such changes is verified through calculations which are compared to a 'direct' measurement found by adjustment of the original ENDF format file.
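    The propagation of covariance data to keff performed by such tools follows the first-order "sandwich" rule; the sensitivity vector and relative covariance matrix below are illustrative, not values from any evaluated library:

```python
import numpy as np

# sensitivity coefficients S_i = (dk/k) / (dsigma_i/sigma_i) for a few
# nuclear data parameters, and their relative covariance matrix C
S = np.array([0.30, -0.15, 0.05])           # illustrative sensitivities
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    1.0e-3]])    # illustrative covariance

# sandwich rule: relative variance of keff is S C S^T
rel_var = S @ C @ S
print(np.sqrt(rel_var))   # 1-sigma relative uncertainty in keff
```

    The off-diagonal covariance terms matter: correlated data uncertainties can either inflate or cancel depending on the signs of the sensitivities, which is why interval arithmetic on the diagonal alone is not sufficient.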

  9. Uncertainty prediction for PUB

    NASA Astrophysics Data System (ADS)

    Mendiondo, E. M.; Tucci, C. M.; Clarke, R. T.; Castro, N. M.; Goldenfum, J. A.; Chevallier, P.

    2003-04-01

    The IAHS initiative on Prediction in Ungauged Basins (PUB) attempts to integrate monitoring needs and uncertainty prediction for river basins. This paper outlines alternative ways of uncertainty prediction which could be linked with new blueprints for PUB, thereby showing how equifinality-based models can be handled using practical gauging strategies such as the Nested Catchment Experiment (NCE). Uncertainty prediction is discussed from observations of the Potiribu Project, an NCE layout at representative basins of a subtropical biome of 300,000 km2 in South America. Uncertainty prediction is assessed at the microscale (1 m2 plots), at the hillslope (0.125 km2) and at the mesoscale (0.125-560 km2). At the microscale, uncertainty-based models are constrained by temporal variations of state variables, with changing likelihood surfaces of experiments using the Green-Ampt model. Two new blueprints emerged from this NCE for PUB: (1) the Scale Transferability Scheme (STS) at the hillslope scale and (2) the Integrating Process Hypothesis (IPH) at the mesoscale. The STS integrates multi-dimensional scaling with similarity thresholds, as a generalization of the Representative Elementary Area (REA), using spatial correlation from point (distributed) to area (lumped) processes. In this way, STS addresses uncertainty bounds of model parameters in an upscaling process at the hillslope. On the other hand, the IPH approach regionalizes synthetic hydrographs, thereby interpreting the uncertainty bounds of streamflow variables. Multiscale evidence from the Potiribu NCE layout shows novel pathways of uncertainty prediction under a PUB perspective in representative basins of world biomes.
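    The Green-Ampt model mentioned above gives cumulative infiltration only implicitly, so it is solved iteratively; a fixed-point iteration converges because the update is a contraction. The soil parameters below are illustrative textbook-style values, not those of the Potiribu plots:

```python
import math

def green_ampt_F(t_h, K=1.0, psi=16.7, dtheta=0.34, tol=1e-8):
    # cumulative infiltration F (cm) at time t (h) under ponded conditions,
    # from the implicit Green-Ampt equation F - M ln(1 + F/M) = K t,
    # with M = psi * dtheta (wetting-front suction times moisture deficit)
    M = psi * dtheta
    F = K * t_h if t_h > 0 else 0.0
    while True:
        F_new = K * t_h + M * math.log(1.0 + F / M)
        if abs(F_new - F) < tol:
            return F_new
        F = F_new

def infiltration_rate(F, K=1.0, psi=16.7, dtheta=0.34):
    # f = K (1 + M / F): the rate decays toward K as the wetting front deepens
    return K * (1.0 + psi * dtheta / F)

F1, F2 = green_ampt_F(1.0), green_ampt_F(2.0)
print(round(F1, 2), round(F2, 2))
```

    In the likelihood-surface analysis described above, runs of such a model with sampled parameters are compared against observed plot-scale infiltration, and the changing shape of the likelihood surface over time expresses the microscale uncertainty.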

  10. Integrated Fatigue Damage Diagnosis and Prognosis Under Uncertainties

    DTIC Science & Technology

    2012-09-01

    Integrated fatigue damage diagnosis and prognosis under uncertainties Tishun Peng 1 , Jingjing He 1 , Yongming Liu 1 , Abhinav Saxena 2 , Jose...Ames Research Center, Moffett Field, CA, 94035, USA kai.goebel@nasa.gov ABSTRACT An integrated fatigue damage diagnosis and prognosis framework is...remaining useful life (RUL) prediction. First, a piezoelectric sensor network is used to detect the fatigue crack size near the rivet holes in fuselage

  11. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    NASA Astrophysics Data System (ADS)

    O'Neill, B. C.

    2015-12-01

Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed.
Options under consideration for improving the approach include the use of perturbed physics ensembles of CESM, employing results from multiple climate models, and combining the results from single impact models with statistical representations of uncertainty across multiple models. A key consideration is the relationship between the question being addressed and the uncertainty approach.

  12. Position uncertainty distribution for articulated arm coordinate measuring machine based on simplified definite integration

    NASA Astrophysics Data System (ADS)

    You, Xu; Zhi-jian, Zong; Qun, Gao

    2018-07-01

This paper describes a methodology for the position uncertainty distribution of an articulated arm coordinate measuring machine (AACMM). First, a model of the structural parameter uncertainties was established by a statistical method. Second, the position uncertainty space volume of the AACMM in a certain configuration was expressed using a simplified definite integration method based on the structural parameter uncertainties; it was then used to evaluate the position accuracy of the AACMM in a certain configuration. Third, the configurations of a certain working point were calculated by an inverse solution, and the position uncertainty distribution of a certain working point was determined; working point uncertainty can be evaluated by the weighting method. Lastly, the position uncertainty distribution in the workspace of the AACMM was described by a map. A single-point contrast test of a 6-joint AACMM was carried out to verify the effectiveness of the proposed method. The results show that the method can describe the position uncertainty of the AACMM, and it can be used to guide the calibration of the AACMM and the choice of the AACMM’s accuracy area.
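
    The idea of propagating structural-parameter uncertainties through the arm kinematics to a position uncertainty for one configuration can be illustrated with a Monte Carlo stand-in (the paper itself uses simplified definite integration). The 2-link planar geometry and all uncertainty magnitudes below are hypothetical, not the 6-joint AACMM of the study.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20000

# Nominal structural parameters of a hypothetical 2-link planar analogue
L1, L2 = 0.6, 0.5            # link lengths, m
q1, q2 = 0.8, -0.4           # joint angles, rad

# Structural-parameter uncertainties as small Gaussian perturbations
l1 = rng.normal(L1, 2e-5, N)     # ~20 um link-length uncertainty
l2 = rng.normal(L2, 2e-5, N)
a1 = rng.normal(q1, 5e-5, N)     # ~50 urad encoder uncertainty
a2 = rng.normal(q2, 5e-5, N)

# Forward kinematics of the probe tip for every Monte Carlo sample
x = l1 * np.cos(a1) + l2 * np.cos(a1 + a2)
y = l1 * np.sin(a1) + l2 * np.sin(a1 + a2)

# 1-sigma position uncertainty of this single configuration
cov = np.cov(np.vstack([x, y]))
sx, sy = np.sqrt(np.diag(cov))
print(sx, sy)   # metres
```

    Repeating this over the inverse-solution configurations of one working point, and then over the workspace, gives the kind of uncertainty map the abstract describes.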

  13. Reliability of a new biokinetic model of zirconium in internal dosimetry: part I, parameter uncertainty analysis.

    PubMed

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurement performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence interval and distribution of model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. From the biokinetic model computations, the mean, standard uncertainty, and confidence interval of the model prediction calculated based on the model parameter uncertainty were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model.
It was also shown that the distribution type of the model parameter strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a certain extent depending on the strength of the correlation. In the case of model prediction, the qualitative comparison of the model predictions with the measured plasma and urinary data showed the HMGU model to be more reliable than the ICRP model; quantitatively, the uncertainty model prediction by the HMGU systemic biokinetic model is smaller than that of the ICRP model. The uncertainty information on the model parameters analyzed in this study was used in the second part of the paper regarding a sensitivity analysis of the Zr biokinetic models.
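
    The kind of parameter-uncertainty propagation described above can be sketched with a deliberately reduced stand-in: a single-compartment retention function with a lognormally distributed transfer rate, sampled by Monte Carlo to give a mean prediction and confidence band. The rate value and its geometric standard deviation are illustrative assumptions, not parameters of the ICRP or HMGU Zr models.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10000

# Hypothetical single-compartment analogue: plasma retention R(t) = exp(-k t),
# with the transfer rate k given a lognormal parameter uncertainty
# (geometric mean 0.5 /d, geometric standard deviation 1.3).
k = rng.lognormal(mean=np.log(0.5), sigma=np.log(1.3), size=N)

t = np.linspace(0.0, 10.0, 101)          # days after intravenous injection
R = np.exp(-np.outer(k, t))              # N Monte Carlo retention curves

mean = R.mean(axis=0)
lo, hi = np.percentile(R, [2.5, 97.5], axis=0)   # 95% confidence band
print(mean[50], lo[50], hi[50])          # prediction at t = 5 d
```

    In the actual study the same construction is applied to a full multi-compartment system, with distribution types and parameter correlations shown to matter for the width of the band.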

  14. A Monte Carlo Uncertainty Analysis of Ozone Trend Predictions in a Two Dimensional Model. Revision

    NASA Technical Reports Server (NTRS)

    Considine, D. B.; Stolarski, R. S.; Hollandsworth, S. M.; Jackman, C. H.; Fleming, E. L.

    1998-01-01

We use Monte Carlo analysis to estimate the uncertainty in predictions of total O3 trends between 1979 and 1995 made by the Goddard Space Flight Center (GSFC) two-dimensional (2D) model of stratospheric photochemistry and dynamics. The uncertainty is caused by gas-phase chemical reaction rates, photolysis coefficients, and heterogeneous reaction parameters which are model inputs. The uncertainty represents a lower bound to the total model uncertainty, assuming the input parameter uncertainties are characterized correctly. Each of the Monte Carlo runs was initialized in 1970 and integrated for 26 model years through the end of 1995. This was repeated 419 times using input parameter sets generated by Latin Hypercube Sampling. The standard deviation (σ) of the Monte Carlo ensemble of total O3 trend predictions is used to quantify the model uncertainty. The 34% difference between the model trend in globally and annually averaged total O3 using nominal inputs and atmospheric trends calculated from Nimbus 7 and Meteor 3 total ozone mapping spectrometer (TOMS) version 7 data is less than the calculated 1σ model uncertainty of 46%, so there is no significant difference between the modeled and observed trends. In the northern hemisphere midlatitude spring the modeled and observed total O3 trends differ by more than 1σ but less than 2σ, which we refer to as marginal significance. We perform a multiple linear regression analysis of the runs which suggests that only a few of the model reactions contribute significantly to the variance in the model predictions. The lack of significance in these comparisons suggests that they are of questionable use as guides for continuing model development. Large model/measurement differences which are many multiples of the input parameter uncertainty are seen in the meridional gradients of the trend and the peak-to-peak variations in the trends over an annual cycle.
These discrepancies unambiguously indicate model formulation problems and provide a measure of model performance which can be used in attempts to improve such models.
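
    The sampling design used above, one stratified sample per equal-probability bin in each input dimension, is what distinguishes Latin Hypercube Sampling from plain Monte Carlo. A minimal sketch, with a toy linear "model" and made-up 1-sigma uncertainty factors standing in for the real reaction-rate uncertainties:

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_dims, rng):
    """One stratified uniform sample per equal-probability bin in each
    dimension, with the bin ordering permuted independently per dimension."""
    u = (np.arange(n_samples)[:, None]
         + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])
    return u

rng = np.random.default_rng(3)
n_runs, n_params = 419, 4   # e.g. four uncertain reaction-rate multipliers

# Map uniform LHS samples to lognormal rate multipliers via the inverse
# normal CDF (the 1-sigma uncertainty factors below are purely illustrative)
factors = np.array([1.2, 1.3, 1.15, 1.5])
u = latin_hypercube(n_runs, n_params, rng)
z = norm.ppf(np.clip(u, 1e-12, 1 - 1e-12))
multipliers = np.exp(z * np.log(factors))

# Stand-in "model": a toy trend response to the perturbed rates
trend = (-1.0 * multipliers[:, 0] + 0.4 * multipliers[:, 1]
         - 0.2 * multipliers[:, 2] + 0.1 * multipliers[:, 3])
print(trend.mean(), trend.std(ddof=1))  # ensemble mean and 1-sigma spread
```

    The ensemble standard deviation printed at the end plays the role of the 1σ model uncertainty quoted in the abstract, with the real 2D model replaced by a one-line surrogate.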

  15. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    NASA Astrophysics Data System (ADS)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large-scale climate signals have proven effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation, and future runoff prediction. In the first step, data-driven models are developed and used to forecast rainfall using large-scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the outputs of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of uncertainty analysis of rainfall-runoff modeling reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters.
It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the watershed is improved by up to 50% in comparison to the simulations by the individual models. Results indicate that the developed methodology not only provides reliable tools for rainfall and runoff modeling, but also provides adequate lead time for incorporating required mitigation measures in dealing with potentially extreme runoff events and flood hazard. Results of this study can be used in identification of the main factors affecting flood hazard analysis.
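
    One plausible reading of "a weighting method based on K-means clustering" is sketched below: cluster the flow record into regimes, then weight each model by its inverse mean-square error within each regime. The clustering-plus-weighting details, the three toy model error structures, and the synthetic data are all assumptions for illustration, not the scheme actually calibrated for the Bronx River watershed.

```python
import numpy as np

rng = np.random.default_rng(4)
T = 500
obs = rng.gamma(2.0, 1.0, T)          # synthetic "observed" runoff series

# Outputs of three hypothetical rainfall-runoff models with different errors
sims = np.vstack([obs * 1.2 + rng.normal(0, 0.3, T),
                  obs * 0.9 + rng.normal(0, 0.2, T),
                  obs + rng.normal(0, 0.5, T)])

# 1-D k-means on the observed flow: low / medium / high regimes
centers = np.percentile(obs, [20, 60, 95])
for _ in range(20):
    labels = np.argmin(np.abs(obs[None, :] - centers[:, None]), axis=0)
    centers = np.array([obs[labels == c].mean() if np.any(labels == c)
                        else centers[c] for c in range(3)])

# Per-regime inverse-MSE weights, then a cluster-wise combined simulation
combined = np.empty(T)
for c in range(3):
    m = labels == c
    w = 1.0 / np.mean((sims[:, m] - obs[m]) ** 2, axis=1)
    w /= w.sum()
    combined[m] = w @ sims[:, m]

rmse = lambda s: np.sqrt(np.mean((s - obs) ** 2))
print([rmse(s) for s in sims], rmse(combined))
```

    Letting the weights vary by flow regime is what allows a model that is good at peaks but poor at baseflow to still contribute where it is strong.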

  16. Investigation of uncertainties associated with the production of n-butanol through ethanol catalysis in sugarcane biorefineries.

    PubMed

    Pereira, Lucas G; Dias, Marina O S; MacLean, Heather L; Bonomi, Antonio

    2015-08-01

This study evaluated the viability of n-butanol production integrated within a first and second generation sugarcane biorefinery. The evaluation included a deterministic analysis as well as a stochastic approach, the latter using Monte Carlo simulation. Results were promising for n-butanol production in terms of revenues per tonne of processed sugarcane, but discouraging with respect to internal rate of return (IRR). The uncertainty analysis determined there was high risk involved in producing n-butanol and co-products from ethanol catalysis. It is unlikely that these products and the associated production route will be financially attractive in the short term without lower investment costs, supportive public policies and tax incentives coupled with biofuels' production strategies. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. A new retrieval algorithm for tropospheric temperature, humidity and pressure profiling based on GNSS radio occultation data

    NASA Astrophysics Data System (ADS)

    Kirchengast, Gottfried; Li, Ying; Scherllin-Pirscher, Barbara; Schwärz, Marc; Schwarz, Jakob; Nielsen, Johannes K.

    2017-04-01

The GNSS radio occultation (RO) technique is an important remote sensing technique for obtaining thermodynamic profiles of temperature, humidity, and pressure in the Earth's troposphere. However, due to refraction effects of both dry ambient air and water vapor in the troposphere, retrieval of accurate thermodynamic profiles at these lower altitudes is challenging and requires suitable background information in addition to the RO refractivity information. Here we introduce a new moist air retrieval algorithm aiming to improve the quality and robustness of retrieving temperature, humidity and pressure profiles in moist air tropospheric conditions. The new algorithm consists of four steps: (1) use of prescribed specific humidity and its uncertainty to retrieve temperature and its associated uncertainty; (2) use of prescribed temperature and its uncertainty to retrieve specific humidity and its associated uncertainty; (3) use of the previous results to estimate final temperature and specific humidity profiles through optimal estimation; (4) determination of air pressure and density profiles from the results obtained before. The new algorithm does not require elaborate matrix inversions which are otherwise widely used in 1D-Var retrieval algorithms, and it allows a transparent uncertainty propagation, whereby the uncertainties of prescribed variables are dynamically estimated, accounting for their spatial and temporal variations. Estimated random uncertainties are calculated by constructing error covariance matrices from co-located ECMWF short-range forecast and corresponding analysis profiles. Systematic uncertainties are estimated by empirical modeling. The influence of regarding or disregarding vertical error correlations is quantified. The new scheme is implemented with static input uncertainty profiles in WEGC's current OPSv5.6 processing system and with full scope in WEGC's next-generation system, the Reference Occultation Processing System (rOPS).
Results from both WEGC systems, the current OPSv5.6 and the next-generation rOPS, are shown and discussed, based on both insights from individual profiles and statistical ensembles, and compared to moist air retrieval results from the UCAR Boulder and ROM-SAF Copenhagen centers. The results show that the new algorithmic scheme improves temperature, humidity and pressure retrieval performance over the previous algorithms, in particular the robustness of integrated uncertainty estimation for large-scale applications. The new rOPS-implemented algorithm will therefore be used in the first large-scale reprocessing towards a tropospheric climate data record 2001-2016 by the rOPS, including its integrated uncertainty propagation.
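
    Step (3) above, when vertical error correlations are disregarded, reduces to level-by-level inverse-variance weighting of the two prior estimates, which is why no elaborate matrix inversion is needed. A three-level sketch with made-up temperatures and uncertainties:

```python
import numpy as np

# Two prior estimates of the same temperature profile (hypothetical values):
# one RO-retrieved as in step (1), one prescribed background, each with its
# own 1-sigma uncertainty profile.
T_ro, s_ro = np.array([220.0, 250.0, 280.0]), np.array([0.5, 1.0, 2.0])
T_bg, s_bg = np.array([221.0, 248.0, 279.0]), np.array([1.5, 1.5, 1.0])

# Diagonal optimal estimation = inverse-variance weighting, level by level
w_ro = 1.0 / s_ro**2
w_bg = 1.0 / s_bg**2
T_oe = (w_ro * T_ro + w_bg * T_bg) / (w_ro + w_bg)
s_oe = np.sqrt(1.0 / (w_ro + w_bg))

print(T_oe, s_oe)  # combined profile and its propagated uncertainty
```

    The combined uncertainty s_oe is smaller than either input uncertainty at every level, which is the transparent uncertainty propagation the abstract refers to; regarding vertical correlations would replace the per-level weights with full covariance matrices.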

  18. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR 3MRA

    EPA Science Inventory

    Sufficiently elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The ensuing challenge of examining ever more complex, integrated, higher-ord...

  19. INVESTIGATING UNCERTAINTY AND SENSITIVITY IN INTEGRATED, MULTIMEDIA ENVIRONMENTAL MODELS: TOOLS FOR FRAMES-3MRA

    EPA Science Inventory

    Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...

  20. Energy prices will play an important role in determining global land use in the twenty first century

    NASA Astrophysics Data System (ADS)

    Steinbuks, Jevgenijs; Hertel, Thomas W.

    2013-03-01

    Global land use research to date has focused on quantifying uncertainty effects of three major drivers affecting competition for land: the uncertainty in energy and climate policies affecting competition between food and biofuels, the uncertainty of climate impacts on agriculture and forestry, and the uncertainty in the underlying technological progress driving efficiency of food, bioenergy and timber production. The market uncertainty in fossil fuel prices has received relatively less attention in the global land use literature. Petroleum and natural gas prices affect both the competitiveness of biofuels and the cost of nitrogen fertilizers. High prices put significant pressure on global land supply and greenhouse gas emissions from terrestrial systems, while low prices can moderate demands for cropland. The goal of this letter is to assess and compare the effects of these core uncertainties on the optimal profile for global land use and land-based GHG emissions over the coming century. The model that we develop integrates distinct strands of agronomic, biophysical and economic literature into a single, intertemporally consistent, analytical framework, at global scale. Our analysis accounts for the value of land-based services in the production of food, first- and second-generation biofuels, timber, forest carbon and biodiversity. We find that long-term uncertainty in energy prices dominates the climate impacts and climate policy uncertainties emphasized in prior research on global land use.

  1. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE PAGES

    Dai, Heng; Ye, Ming; Walker, Anthony P.; ...

    2017-03-28

A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.
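
    The structure of such an index can be sketched with the law of total variance: the sensitivity of a process is the variance of the conditional mean output, taken over that process's competing models and their parameters, divided by the total output variance. The two recharge models, two conductivity models, and the toy output below are invented stand-ins for the synthetic groundwater study, with equal prior model probabilities assumed.

```python
import numpy as np

rng = np.random.default_rng(6)
Nu = Nv = 300   # parameter samples per process

# Two competing recharge models and two geology (conductivity) models,
# each with one random parameter (all toy stand-ins)
rech = [lambda u: 300.0 * u + 10.0,
        lambda u: 200.0 * np.sqrt(u) + 10.0]          # mm/yr
cond = [lambda v: np.exp(1.0 + 0.5 * v),
        lambda v: np.exp(1.5 + 0.2 * v)]              # m/d

U = rng.random(Nu)                 # recharge parameters ~ U(0,1)
V = rng.standard_normal(Nv)        # geology parameters ~ N(0,1)

# Toy output (a travel-time-like quantity ~ K / recharge) for every
# model combination and parameter pair: shape (2, 2, Nu, Nv)
Y = np.array([[cond[G](V)[None, :] / rech[R](U)[:, None]
               for G in range(2)] for R in range(2)])

# With equal model probabilities, the conditional mean given the recharge
# process (its model choice AND its parameter) is a plain average over the
# geology models and their parameters:
cond_mean = Y.mean(axis=(1, 3))                        # shape (2, Nu)

# Process sensitivity index of recharge: variance of that conditional mean
# divided by the total output variance
PS_recharge = cond_mean.var() / Y.var()
print(PS_recharge)
```

    By the law of total variance the index lies in [0, 1]; unequal prior model probabilities would turn the plain averages into probability-weighted ones.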

  2. Quantifying Risks and Uncertainties Associated with Induced Seismicity due to CO2 Injection into Geologic Formations with Faults

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nguyen, B. N.; Bacon, D. H.; White, M. D.; Murray, C. J.

    2016-12-01

A multiphase flow and reactive transport simulator named STOMP-CO2-R has been developed and coupled to the ABAQUS® finite element package for geomechanical analysis, enabling comprehensive thermo-hydro-geochemical-mechanical (THMC) analyses. The coupled THMC simulator has been applied to analyze faulted CO2 reservoir responses (e.g., stress and strain distributions, pressure buildup, slip tendency factor, pressure margin to fracture) with various complexities in fault and reservoir structures and mineralogy. Depending on the geological and reaction network settings, long-term injection of CO2 can have a significant effect on the elastic stiffness and permeability of formation rocks. In parallel, an uncertainty quantification framework (UQ-CO2), which consists of entropy-based prior uncertainty representation, efficient sampling, geostatistical reservoir modeling, and effective response surface analysis, has been developed for quantifying risks and uncertainties associated with CO2 sequestration. It has been demonstrated for evaluating risks in CO2 leakage through natural pathways and wellbores, and for developing predictive reduced order models. Recently, a parallel STOMP-CO2-R has been developed, and the updated STOMP/ABAQUS model has demonstrated excellent scalability, which makes it possible to integrate the model with the UQ framework to effectively and efficiently explore multidimensional parameter space (e.g., permeability, elastic modulus, crack orientation, fault friction coefficient) for a more systematic analysis of induced seismicity risks.

  3. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dai, Heng; Ye, Ming; Walker, Anthony P.

A hydrological model consists of multiple process level submodels, and each submodel represents a process key to the operation of the simulated system. Global sensitivity analysis methods have been widely used to identify important processes for system model development and improvement. The existing methods of global sensitivity analysis only consider parametric uncertainty, and are not capable of handling model uncertainty caused by multiple process models that arise from competing hypotheses about one or more processes. To address this problem, this study develops a new method to probe model output sensitivity to competing process models by integrating model averaging methods with variance-based global sensitivity analysis. A process sensitivity index is derived as a single summary measure of relative process importance, and the index includes variance in model outputs caused by uncertainty in both process models and their parameters. Here, for demonstration, the new index is used to assign importance to the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models of hydraulic conductivity. Each process model has its own random parameters. Finally, the new process sensitivity index is mathematically general, and can be applied to a wide range of problems in hydrology and beyond.

  4. Back-stepping active disturbance rejection control design for integrated missile guidance and control system via reduced-order ESO.

    PubMed

    Xingling, Shao; Honglun, Wang

    2015-07-01

This paper proposes a novel composite integrated guidance and control (IGC) law for a missile intercepting an unknown maneuvering target under multiple uncertainties and control constraints. First, by using the back-stepping technique, the proposed IGC law design is separated into a guidance loop and a control loop. The unknown target maneuvers and variations of aerodynamic parameters in the guidance and control loops are viewed as uncertainties, which are estimated and compensated by a designed model-assisted reduced-order extended state observer (ESO). Second, based on the principle of active disturbance rejection control (ADRC), an enhanced feedback linearization (FL) based control law is implemented for the IGC model using the estimates generated by the reduced-order ESO. In addition, performance analysis and comparisons between the ESO and the reduced-order ESO are examined. A nonlinear tracking differentiator is employed to construct the derivative of the virtual control command in the control loop. Third, the closed-loop stability of the considered system is established. Finally, simulations demonstrate the effectiveness of the proposed IGC law in delivering enhanced interception performance: a smooth interception course, improved robustness against multiple uncertainties, and reduced control consumption during the initial phase. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
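
    The core reduced-order ESO idea, estimating a lumped "total disturbance" from the measured state without differentiating it, can be sketched for a scalar plant. The plant, gain, and disturbance below are toy choices, not the missile IGC model of the paper; the observer state p = f_hat - beta*x is the standard reduced-order construction.

```python
import numpy as np

# Reduced-order ESO sketch for a scalar plant  x' = f(t) + b*u,
# with x measured and f(t) the unknown lumped disturbance to estimate.
# Using p = f_hat - beta*x, the estimate obeys f_hat' = -beta*(f_hat - f),
# i.e. f_hat tracks f through a first-order lag with bandwidth beta.
dt, T = 1e-3, 3.0
beta, b = 50.0, 1.0
n = int(T / dt)

t = np.arange(n) * dt
f = np.sin(5 * t)            # "unknown" disturbance (for simulation only)
u = np.zeros(n)              # open loop here; a real IGC loop would close u

x, p = 0.0, 0.0
f_hat = np.zeros(n)
for k in range(n):
    f_hat[k] = p + beta * x
    x += dt * (f[k] + b * u[k])                            # plant (Euler)
    p += dt * (-beta * p - beta**2 * x - beta * b * u[k])  # observer
    
err = np.abs(f_hat[t > 1.0] - f[t > 1.0])
print(err.max())   # steady-state estimation error
```

    With beta = 50 and a disturbance at 5 rad/s the lag leaves a residual error of roughly omega/beta = 10%, which is the usual bandwidth/noise trade-off in ESO tuning.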

  5. Analysis of the uncertainty in the monetary valuation of ecosystem services--A case study at the river basin scale.

    PubMed

    Boithias, Laurie; Terrado, Marta; Corominas, Lluís; Ziv, Guy; Kumar, Vikas; Marqués, Montse; Schuhmacher, Marta; Acuña, Vicenç

    2016-02-01

Ecosystem services provide multiple benefits to human wellbeing and are increasingly considered by policy-makers in environmental management. However, the uncertainty related to the monetary valuation of these benefits is not yet adequately defined or integrated by policy-makers. Given this background, our aim was to quantify different sources of uncertainty when performing monetary valuation of ecosystem services, in order to provide a series of guidelines to reduce them. With an example of four ecosystem services (i.e., water provisioning, waste treatment, erosion protection, and habitat for species) provided at the river basin scale, we quantified the uncertainty associated with the following sources: (1) the number of services considered, (2) the number of benefits considered for each service, (3) the valuation metrics (i.e., valuation methods) used to value benefits, and (4) the uncertainty of the parameters included in the valuation metrics. Results indicate that the highest uncertainty was caused by the number of services considered, as well as by the number of benefits considered for each service, whereas the parametric uncertainty was similar to that related to the selection of the valuation metric. This suggests that the parametric uncertainty, which is the only uncertainty type commonly considered, was less critical than the structural uncertainty, which is in turn mainly dependent on the decision-making context. Given the uncertainty associated with the valuation structure, special attention should be given to the selection of services, benefits and metrics according to a given context. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. A modified ATI technique for nowcasting convective rain volumes over areas. [area-time integrals

    NASA Technical Reports Server (NTRS)

    Makarau, Amos; Johnson, L. Ronald; Doneaud, Andre A.

    1988-01-01

This paper explores the applicability of the area-time-integral (ATI) technique for the estimation of the growth portion only of a convective storm (while the rain volume is computed using the entire life history of the event) and for nowcasting the total rain volume of a convective system at the stage of its maximum development. For these purposes, the ATIs were computed from the digital radar data (for 1981-1982) from the North Dakota Cloud Modification Project, using the maximum echo area (ATIA) no less than 25 dBZ, the maximum reflectivity, and the maximum echo height as the end of the growth portion of the convective event. Linear regression analysis demonstrated that the correlations of total rain volume and maximum rain volume with ATIA were the strongest. The uncertainties obtained were comparable to the uncertainties which typically occur in rain volume estimates obtained from radar data employing Z-R conversion followed by space and time integration. This demonstrates that the total rain volume of a storm can be nowcasted at its maximum stage of development.
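
    The regression at the heart of the technique can be sketched with synthetic storms: accumulate echo area over the growth portion into an ATI (units km2 h), then regress total rain volume on it. The storm generator, scan interval, and the factor-2 "true" conversion below are all invented for illustration, not North Dakota values.

```python
import numpy as np

rng = np.random.default_rng(7)
n_storms = 40

# Synthetic storms: echo area sampled every radar scan up to the time of
# maximum echo area (the growth portion only)
ati_growth, rain_vol = [], []
for _ in range(n_storms):
    peak = rng.uniform(50, 800)                     # peak echo area, km^2
    scans = rng.integers(6, 20)                     # scans in growth phase
    area = peak * np.linspace(0.1, 1.0, scans)      # growing echo area
    dt_scan = 0.25                                  # h between scans
    ati_growth.append(np.sum(area * dt_scan))       # growth-portion ATI
    rain_vol.append(2.0 * ati_growth[-1] * rng.lognormal(0.0, 0.15))

ati_growth, rain_vol = np.array(ati_growth), np.array(rain_vol)

# Linear regression of total rain volume on growth-portion ATI
slope, intercept = np.polyfit(ati_growth, rain_vol, 1)
r = np.corrcoef(ati_growth, rain_vol)[0, 1]
print(slope, r)   # conversion factor and correlation
```

    The fitted slope is the ATI-to-rain-volume conversion factor, and the scatter about the fit plays the role of the uncertainty that the paper compares against Z-R-based estimates.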

  7. Integrating quantitative PCR and Bayesian statistics in quantifying human adenoviruses in small volumes of source water.

    PubMed

    Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D

    2014-02-01

Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, small-volume samples (e.g., 1 L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
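
    The logic behind "not detected, but might have existed" can be sketched with a grid-based Bayesian update: under an idealised detection model (the assay fires whenever at least one genome copy lands in the analysed volume), a non-detect is a Poisson zero-count observation, which still leaves posterior mass at nonzero concentrations. The effective volume, grid, and prior below are assumptions, not the paper's calibrated model.

```python
import numpy as np

V_eff = 0.02                 # effective volume assayed per 1 L sample, L
c = np.logspace(-1, 4, 500)  # candidate concentrations, copies/L (log grid)

prior = np.full(c.size, 1.0 / c.size)   # log-uniform prior over the grid
like = np.exp(-c * V_eff)               # P(zero copies | c): Poisson at k = 0

post = prior * like
post /= post.sum()

mass_above_50 = post[c > 50].sum()
print(mass_above_50)   # P(concentration > 50 copies/L | a single non-detect)
```

    The posterior decays only where c * V_eff becomes appreciable, so a single small-volume non-detect cannot rule out concentrations well below the assay's effective detection limit.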

  8. From cutting-edge pointwise cross-section to groupwise reaction rate: A primer

    NASA Astrophysics Data System (ADS)

    Sublet, Jean-Christophe; Fleming, Michael; Gilbert, Mark R.

    2017-09-01

The nuclear research and development community has a history of using both integral and differential experiments to support accurate lattice-reactor, nuclear reactor criticality and shielding simulations, as well as verification and validation efforts for cross sections and emitted particle spectra. An important aspect of this type of analysis is the proper consideration of the contribution of the neutron spectrum in its entirety, with correct propagation of uncertainties and standard deviations derived from Monte Carlo simulations, to the local and total uncertainty in the simulated reaction rates (RRs), which usually only apply to one application at a time. This paper identifies deficiencies in the traditional treatment, and discusses correct handling of the RR uncertainty quantification and propagation, including details of the cross section components in the RR uncertainty estimates, which are verified for relevant applications. The methodology that rigorously captures the spectral shift and cross section contributions to the uncertainty in the RR is discussed with quantified examples that demonstrate the importance of the proper treatment of the spectrum profile and cross section contributions to the uncertainty in the RR and subsequent response functions. The recently developed inventory code FISPACT-II, when connected to the processed nuclear data libraries TENDL-2015, ENDF/B-VII.1, JENDL-4.0u or JEFF-3.2, forms an enhanced multi-physics platform providing a wide variety of advanced simulation methods for modelling activation, transmutation and burnup protocols, and for simulating radiation damage source terms. The system has extended cutting-edge nuclear data forms, uncertainty quantification and propagation methods, which have been the subject of recent integral and differential, fission, fusion and accelerator validation efforts.
The simulation system is used to accurately and predictively probe, understand and underpin a modern and sustainable understanding of the nuclear physics that is so important for many areas of science and technology; advanced fission and fuel systems, magnetic and inertial confinement fusion, high energy, accelerator physics, medical application, isotope production, earth exploration, astrophysics and homeland security.
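    The groupwise collapse described above can be illustrated with a minimal sketch: a reaction rate is the flux-weighted sum of group cross sections, and a first-order "sandwich" rule propagates the cross-section uncertainties to the RR. All numbers are invented for illustration, and group uncertainties are assumed uncorrelated (a real covariance matrix would include off-diagonal terms).

```python
import numpy as np

# Hypothetical 3-group structure: normalized group fluxes phi_g, group
# cross sections sigma_g, and their relative 1-sigma uncertainties.
phi = np.array([0.2, 0.5, 0.3])               # normalized group fluxes
sigma = np.array([1.2, 0.8, 15.0])            # group cross sections (barns)
rel_unc_sigma = np.array([0.05, 0.03, 0.10])  # relative uncertainties

# Groupwise reaction rate: RR = sum_g sigma_g * phi_g
rr = np.sum(sigma * phi)

# First-order propagation with a diagonal covariance (uncorrelated groups):
# var(RR) = sum_g (phi_g * sigma_g * rel_g)^2
var_rr = np.sum((phi * sigma * rel_unc_sigma) ** 2)
rel_unc_rr = np.sqrt(var_rr) / rr
print(rr, rel_unc_rr)
```

    Note how the high-cross-section group dominates both the rate and its uncertainty, which is why the spectrum profile matters so much in the RR uncertainty estimate.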

  9. Use of Linear Prediction Uncertainty Analysis to Guide Conditioning of Models Simulating Surface-Water/Groundwater Interactions

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; White, J.; Doherty, J.

    2011-12-01

    Linear prediction uncertainty analysis in a Bayesian framework was applied to guide the conditioning of an integrated surface water/groundwater model that will be used to predict the effects of groundwater withdrawals on surface-water and groundwater flows. Linear prediction uncertainty analysis is an effective approach for identifying (1) raw and processed data most effective for model conditioning prior to inversion, (2) specific observations and periods of time critically sensitive to specific predictions, and (3) additional observation data that would reduce model uncertainty relative to specific predictions. We present results for a two-dimensional groundwater model of a 2,186 km2 area of the Biscayne aquifer in south Florida implicitly coupled to a surface-water routing model of the actively managed canal system. The model domain includes 5 municipal well fields withdrawing more than 1 Mm3/day and 17 operable surface-water control structures that control freshwater releases from the Everglades and freshwater discharges to Biscayne Bay. More than 10 years of daily observation data from 35 groundwater wells and 24 surface water gages are available to condition model parameters. A dense parameterization was used to fully characterize the contribution of the inversion null space to predictive uncertainty and included bias-correction parameters. This approach allows better resolution of the boundary between the inversion null space and solution space. Bias-correction parameters (e.g., rainfall, potential evapotranspiration, and structure flow multipliers) absorb information that is present in structural noise that may otherwise contaminate the estimation of more physically-based model parameters. This allows greater precision in predictions that are entirely solution-space dependent, and reduces the propensity for bias in predictions that are not. 
Results show that application of this analysis is an effective means of identifying those surface-water and groundwater data, both raw and processed, that minimize predictive uncertainty, while simultaneously identifying the maximum solution-space dimensionality of the inverse problem supported by the data.
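    The linear (first-order, Bayesian) prediction uncertainty analysis described above can be sketched in a few lines: given a Jacobian of observation sensitivities, a prediction sensitivity vector, and prior parameter and noise covariances, the conditioned predictive variance follows from a Schur complement. The matrices below are random stand-ins, not the Biscayne model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linearized problem: J maps parameters to observations,
# y maps parameters to the prediction of interest.
n_par, n_obs = 8, 5
J = rng.normal(size=(n_obs, n_par))       # observation sensitivities
y = rng.normal(size=n_par)                # prediction sensitivities
C_p = np.eye(n_par) * 0.5                 # prior parameter covariance
C_e = np.eye(n_obs) * 0.1                 # observation noise covariance

# Prior predictive variance (no conditioning on data)
var_prior = y @ C_p @ y

# Conditioned predictive variance via the Schur complement:
# var_post = y'C_p y - y'C_p J' (J C_p J' + C_e)^-1 J C_p y
G = J @ C_p @ J.T + C_e
var_post = var_prior - y @ C_p @ J.T @ np.linalg.solve(G, J @ C_p @ y)
print(var_prior, var_post)
```

    The reduction from `var_prior` to `var_post` quantifies the worth of the observation data for that prediction, which is the basis for ranking candidate observations as described above.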

  10. CHARACTERIZATION OF DATA VARIABILITY AND UNCERTAINTY: HEALTH EFFECTS ASSESSMENTS IN THE INTEGRATED RISK INFORMATION SYSTEM (IRIS)

    EPA Science Inventory

    In response to a Congressional directive contained in HR 106-379 regarding EPA's appropriations for FY2000, EPA has undertaken an evaluation of the characterization of data variability and uncertainty in its Integrated Risk Information System (IRIS) health effects information dat...

  11. Global sensitivity and uncertainty analysis of an atmospheric chemistry transport model: the FRAME model (version 9.15.0) as a case study

    NASA Astrophysics Data System (ADS)

    Aleksankina, Ksenia; Heal, Mathew R.; Dore, Anthony J.; Van Oijen, Marcel; Reis, Stefan

    2018-04-01

    Atmospheric chemistry transport models (ACTMs) are widely used to underpin policy decisions associated with the impact of potential changes in emissions on future pollutant concentrations and deposition. It is therefore essential to have a quantitative understanding of the uncertainty in model output arising from uncertainties in the input pollutant emissions. ACTMs incorporate complex and non-linear descriptions of chemical and physical processes which means that interactions and non-linearities in input-output relationships may not be revealed through the local one-at-a-time sensitivity analysis typically used. The aim of this work is to demonstrate a global sensitivity and uncertainty analysis approach for an ACTM, using as an example the FRAME model, which is extensively employed in the UK to generate source-receptor matrices for the UK Integrated Assessment Model and to estimate critical load exceedances. An optimised Latin hypercube sampling design was used to construct model runs within ±40 % variation range for the UK emissions of SO2, NOx, and NH3, from which regression coefficients for each input-output combination and each model grid ( > 10 000 across the UK) were calculated. Surface concentrations of SO2, NOx, and NH3 (and of deposition of S and N) were found to be predominantly sensitive to the emissions of the respective pollutant, while sensitivities of secondary species such as HNO3 and particulate SO42-, NO3-, and NH4+ to pollutant emissions were more complex and geographically variable. The uncertainties in model output variables were propagated from the uncertainty ranges reported by the UK National Atmospheric Emissions Inventory for the emissions of SO2, NOx, and NH3 (±4, ±10, and ±20 % respectively). 
The uncertainties in the surface concentrations of NH3 and NOx and the depositions of NHx and NOy were dominated by the uncertainties in emissions of NH3, and NOx respectively, whilst concentrations of SO2 and deposition of SOy were affected by the uncertainties in both SO2 and NH3 emissions. Likewise, the relative uncertainties in the modelled surface concentrations of each of the secondary pollutant variables (NH4+, NO3-, SO42-, and HNO3) were due to uncertainties in at least two input variables. In all cases the spatial distribution of relative uncertainty was found to be geographically heterogeneous. The global methods used here can be applied to conduct sensitivity and uncertainty analyses of other ACTMs.
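    A minimal sketch of the sampling-plus-regression approach used above, with an invented stand-in model rather than FRAME: draw a plain (non-optimised) Latin hypercube over ±40 % input multipliers, run the "model", and use regression coefficients as global sensitivity measures.

```python
import numpy as np

rng = np.random.default_rng(1)

def latin_hypercube(n_samples, n_dims, rng):
    """One random point per stratum in each dimension, strata order shuffled."""
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u

# +/-40 % perturbations of three hypothetical emission inputs
n = 200
X = 0.6 + 0.8 * latin_hypercube(n, 3, rng)   # multipliers in [0.6, 1.4]

# Stand-in "model": output mostly driven by input 0, weak interaction term
y_out = 2.0 * X[:, 0] + 0.3 * X[:, 1] * X[:, 2]

# Linear regression coefficients per input as global sensitivity measures
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y_out, rcond=None)
print(coef[1:])
```

    In the full analysis this regression is repeated for every output species and every grid cell, which is what produces the geographically variable sensitivity maps.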

  12. THE RELATIVE IMPORTANCE OF THE VADOSE ZONE IN MULTIMEDIA RISK ASSESSMENT MODELING APPLIED AT A NATIONAL SCALE: AN ANALYSIS OF BENZENE USING 3MRA

    EPA Science Inventory

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidab...

  13. Expected frontiers: Incorporating weather uncertainty into a policy analysis using an integrated bi-level multi-objective optimization framework

    EPA Science Inventory

    Weather is the main driver in both plant use of nutrients and fate and transport of nutrients in the environment. In previous work, we evaluated a green tax for control of agricultural nutrients in a bi-level optimization framework that linked deterministic models. In this study,...

  14. A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis

    DTIC Science & Technology

    2012-01-01

    probability distribution for the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre for uniformly...parameters and windfields will drive our simulations. We will use uncertainty quantification methodology - polynomial chaos quadrature in combination with data integration to complete the DDDAS loop.
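    A toy sketch of the non-intrusive polynomial chaos quadrature mentioned above: for a normally distributed input, output moments are estimated with probabilists' Gauss-Hermite quadrature. The model function here is an invented stand-in, not a volcanic ash simulation.

```python
import numpy as np

def f(x):
    """Stand-in 'simulation' output as a function of one uncertain input."""
    return x ** 2 + 1.0

# Input x ~ N(mu, sig^2); use probabilists' Hermite nodes/weights,
# which match the Gaussian probability weight exp(-x^2/2).
mu, sig = 0.0, 1.0
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
x = mu + sig * nodes
w = weights / np.sqrt(2 * np.pi)   # normalize weights to sum to 1

mean = np.sum(w * f(x))            # quadrature estimate of E[f(x)]
var = np.sum(w * (f(x) - mean) ** 2)
print(mean, var)
```

    With 5 nodes the quadrature is exact for polynomial outputs up to degree 9, so here it recovers the exact mean (2) and variance (2); for an expensive simulator, each node is one model run.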

  15. SPREADSHEET METHOD FOR EVALUATION OF BIOCHEMICAL REACTION RATE COEFFICIENTS AND THEIR UNCERTAINTIES BY WEIGHTED NONLINEAR LEAST-SQUARES ANALYSIS OF THE INTEGRATED MONOD EQUATION. (R825689C036)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  16. Analysis of Coupled Model Uncertainties in Source to Dose Modeling of Human Exposures to Ambient Air Pollution: a PM2.5 Case-Study

    EPA Science Inventory

    Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...

  17. Adding uncertainty to forest inventory plot locations: effects on analyses using geospatial data

    Treesearch

    Alexia A. Sabor; Volker C. Radeloff; Ronald E. McRoberts; Murray Clayton; Susan I. Stewart

    2007-01-01

    The Forest Inventory and Analysis (FIA) program of the USDA Forest Service alters plot locations before releasing data to the public to ensure landowner confidentiality and sample integrity, but using data with altered plot locations in conjunction with other spatially explicit data layers produces analytical results with unknown amounts of error. We calculated the...

  18. Trajectory-Based Loads for the Ares I-X Test Flight Vehicle

    NASA Technical Reports Server (NTRS)

    Vause, Roland F.; Starr, Brett R.

    2011-01-01

    In trajectory-based loads, the structural engineer treats each point on the trajectory as a load case. Distributed aero, inertial, and propulsion forces are developed for the structural model which are equivalent to the integrated values of the trajectory model. Free-body diagrams are then used to solve for the internal forces, or loads, that keep the applied aero, inertial, and propulsion forces in dynamic equilibrium. There are several advantages to using trajectory-based loads. First, consistency is maintained between the integrated equilibrium equations of the trajectory analysis and the distributed equilibrium equations of the structural analysis. Second, the structural loads equations are tied to the uncertainty model for the trajectory systems analysis model. Atmosphere, aero, propulsion, mass property, and controls uncertainty models all feed into the dispersions that are generated for the trajectory systems analysis model. Changes in any of these input models will affect structural loads response. The trajectory systems model manages these inputs as well as the output from the structural model over thousands of dispersed cases. Large structural models with hundreds of thousands of degrees of freedom would execute too slowly to be an efficient part of several thousand system analyses. Trajectory-based loads provide a means for the structures discipline to be included in the integrated systems analysis. Successful applications of trajectory-based loads methods for the Ares I-X vehicle are covered in this paper. Preliminary design loads were based on 2000 trajectories using Monte Carlo dispersions. Range safety loads were tied to 8423 malfunction turn trajectories. In addition, active control system loads were based on 2000 preflight trajectories using Monte Carlo dispersions.

  19. Method to manage integration error in the Green-Kubo method.

    PubMed

    Oliveira, Laura de Sousa; Greaney, P Alex

    2017-02-01

    The Green-Kubo method is a commonly used approach for predicting transport properties in a system from equilibrium molecular dynamics simulations. The approach is founded on the fluctuation dissipation theorem and relates the property of interest to the lifetime of fluctuations in its thermodynamic driving potential. For heat transport, the lattice thermal conductivity is related to the integral of the autocorrelation of the instantaneous heat flux. A principal source of error in these calculations is that the autocorrelation function requires a long averaging time to reduce remnant noise. Integrating the noise in the tail of the autocorrelation function becomes conflated with physically important slow relaxation processes. In this paper we present a method to quantify the uncertainty on transport properties computed using the Green-Kubo formulation based on recognizing that the integrated noise is a random walk, with a growing envelope of uncertainty. By characterizing the noise we can choose integration conditions to best trade off systematic truncation error with unbiased integration noise, to minimize uncertainty for a given allocation of computational resources.
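    The random-walk picture of the integrated noise can be demonstrated numerically with synthetic white noise standing in for the tail of the autocorrelation function (no actual MD data involved): the uncertainty envelope of the running integral grows like the square root of the lag.

```python
import numpy as np

rng = np.random.default_rng(2)

# Past the correlation time, the autocorrelation estimate is dominated by
# zero-mean noise; its running integral is therefore a random walk whose
# spread grows like sqrt(lag). Noise scale is hypothetical.
n_real, n_lag = 2000, 400
noise_tail = rng.normal(scale=0.1, size=(n_real, n_lag))
running_integral = np.cumsum(noise_tail, axis=1)

envelope = running_integral.std(axis=0)   # spread across realizations
ratio = envelope[399] / envelope[99]      # expect ~ sqrt(400/100) = 2
print(ratio)
```

    Characterizing this growing envelope is what lets one choose a truncation point that trades systematic truncation error against unbiased integration noise, as the abstract describes.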

  20. Method to manage integration error in the Green-Kubo method

    NASA Astrophysics Data System (ADS)

    Oliveira, Laura de Sousa; Greaney, P. Alex

    2017-02-01

    The Green-Kubo method is a commonly used approach for predicting transport properties in a system from equilibrium molecular dynamics simulations. The approach is founded on the fluctuation dissipation theorem and relates the property of interest to the lifetime of fluctuations in its thermodynamic driving potential. For heat transport, the lattice thermal conductivity is related to the integral of the autocorrelation of the instantaneous heat flux. A principal source of error in these calculations is that the autocorrelation function requires a long averaging time to reduce remnant noise. Integrating the noise in the tail of the autocorrelation function becomes conflated with physically important slow relaxation processes. In this paper we present a method to quantify the uncertainty on transport properties computed using the Green-Kubo formulation based on recognizing that the integrated noise is a random walk, with a growing envelope of uncertainty. By characterizing the noise we can choose integration conditions to best trade off systematic truncation error with unbiased integration noise, to minimize uncertainty for a given allocation of computational resources.

  1. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGES

    Harp, Dylan R.; Atchley, Adam L.; Painter, Scott L.; ...

    2016-02-11

    Here, the effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. 
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.
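    The null-space projection at the heart of the Null-Space Monte Carlo method can be sketched with a toy linearized problem (random Jacobian, not the permafrost model): parameter perturbations confined to the null space of the Jacobian leave the simulated observations unchanged to first order, so each perturbed set remains consistent with the calibration data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical linearized calibration: Jacobian J of observations with
# respect to 10 parameters, only 4 informative directions.
n_par, n_sol = 10, 4
J = rng.normal(size=(n_sol, n_par))

# SVD separates the solution space from the null space, which the
# calibration data cannot constrain.
_, _, Vt = np.linalg.svd(J)
V_null = Vt[n_sol:].T                    # basis of the 6-D null space

# Perturb calibrated parameters only within the null space: the
# simulated observations are unchanged to first order (J @ dp ~ 0).
p_cal = np.ones(n_par)
dp = V_null @ rng.normal(size=n_par - n_sol)
p_new = p_cal + dp
print(np.linalg.norm(J @ dp))
```

    Repeating the perturbation many times yields the ensemble of calibration-consistent parameter sets that the projections above are run with.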

  2. Effect of soil property uncertainties on permafrost thaw projections: A calibration-constrained analysis

    DOE PAGES

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; ...

    2015-06-29

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. 
As a result, by comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  3. Effect of soil property uncertainties on permafrost thaw projections: a calibration-constrained analysis

    NASA Astrophysics Data System (ADS)

    Harp, D. R.; Atchley, A. L.; Painter, S. L.; Coon, E. T.; Wilson, C. J.; Romanovsky, V. E.; Rowland, J. C.

    2015-06-01

    The effect of soil property uncertainties on permafrost thaw projections is studied using a three-phase subsurface thermal hydrology model and calibration-constrained uncertainty analysis. The Null-Space Monte Carlo method is used to identify soil hydrothermal parameter combinations that are consistent with borehole temperature measurements at the study site, the Barrow Environmental Observatory. Each parameter combination is then used in a forward projection of permafrost conditions for the 21st century (from calendar year 2006 to 2100) using atmospheric forcings from the Community Earth System Model (CESM) in the Representative Concentration Pathway (RCP) 8.5 greenhouse gas concentration trajectory. A 100-year projection allows for the evaluation of intra-annual uncertainty due to soil properties and the inter-annual variability due to year to year differences in CESM climate forcings. After calibrating to borehole temperature data at this well-characterized site, soil property uncertainties are still significant and result in significant intra-annual uncertainties in projected active layer thickness and annual thaw depth-duration even with a specified future climate. Intra-annual uncertainties in projected soil moisture content and Stefan number are small. A volume and time integrated Stefan number decreases significantly in the future climate, indicating that latent heat of phase change becomes more important than heat conduction in future climates. Out of 10 soil parameters, ALT, annual thaw depth-duration, and Stefan number are highly dependent on mineral soil porosity, while annual mean liquid saturation of the active layer is highly dependent on the mineral soil residual saturation and moderately dependent on peat residual saturation. 
By comparing the ensemble statistics to the spread of projected permafrost metrics using different climate models, we show that the effect of calibration-constrained uncertainty in soil properties, although significant, is less than that produced by structural climate model uncertainty for this location.

  4. Effects of temporal and spatial resolution of calibration data on integrated hydrologic water quality model identification

    NASA Astrophysics Data System (ADS)

    Jiang, Sanyuan; Jomaa, Seifeddine; Büttner, Olaf; Rode, Michael

    2014-05-01

    Hydrological water quality modeling is increasingly used for investigating runoff and nutrient transport processes as well as watershed management, but it is mostly unclear how data availability determines model identification. In this study, the HYPE (HYdrological Predictions for the Environment) model, which is a process-based, semi-distributed hydrological water quality model, was applied in two different mesoscale catchments (Selke (463 km2) and Weida (99 km2)) located in central Germany to simulate discharge and inorganic nitrogen (IN) transport. PEST and DREAM(ZS) were combined with the HYPE model to conduct parameter calibration and uncertainty analysis. A split-sample test was used for model calibration (1994-1999) and validation (1999-2004). IN concentration and daily IN load were found to be highly correlated with discharge, indicating that IN leaching is mainly controlled by runoff. Both dynamics and balances of water and IN load were well captured, with NSE greater than 0.83 during the validation period. Multi-objective calibration (calibrating hydrological and water quality parameters simultaneously) was found to outperform step-wise calibration in terms of model robustness. Multi-site calibration was able to improve model performance at internal sites and decrease parameter posterior uncertainty and prediction uncertainty. Nitrogen-process parameters calibrated using continuous daily averages of nitrate-N concentration observations produced better and more robust simulations of IN concentration and load, lower posterior parameter uncertainty and lower IN concentration prediction uncertainty than calibration against discontinuous biweekly nitrate-N concentration measurements. Both PEST and DREAM(ZS) are efficient in parameter calibration. 
However, DREAM(ZS) is more sound in terms of parameter identification and uncertainty analysis than PEST because of its capability to evolve parameter posterior distributions and estimate prediction uncertainty based on global search and Bayesian inference schemes.
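    The Nash-Sutcliffe efficiency (NSE) used as the performance measure above is straightforward to compute; a minimal sketch with invented observation and simulation values:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    1 is a perfect fit; 0 means no better than the observed mean."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Invented daily discharge observations and simulations
obs = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sim = np.array([1.1, 1.9, 3.2, 3.8, 5.1])
print(nse(obs, sim))
```

    Because the denominator is the observed variance, NSE penalizes a model relative to simply predicting the long-term mean, which makes values above 0.83 (as reported above) a demanding benchmark.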

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamarina, A.; Bernard, D.; Dos Santos, N.

    This paper describes the method to define relevant targeted integral measurements that allow the improvement of nuclear data evaluations and the determination of corresponding reliable covariances. 235U and 56Fe examples are pointed out for the improvement of JEFF3 data. Utilizations of these covariances are shown for Sensitivity and Representativity studies, Uncertainty calculations, and Transposition of experimental results to industrial applications. S/U studies are more and more used in Reactor Physics and Safety-Criticality. However, the reliability of study results relies strongly on the ND covariance relevancy. Our method derives the real uncertainty associated with each evaluation from calibration on targeted integral measurements. These realistic covariance matrices allow reliable JEFF3.1.1 calculation of prior uncertainty due to nuclear data, as well as uncertainty reduction based on representative integral experiments, in challenging design calculations such as GEN3 and RJH reactors.

  6. Closed-Loop Evaluation of an Integrated Failure Identification and Fault Tolerant Control System for a Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Shin, Jong-Yeob; Belcastro, Christine; Khong, Thuan

    2006-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems developed for failure detection, identification, and reconfiguration, as well as upset recovery, need to be evaluated over broad regions of the flight envelope or under extreme flight conditions, and should include various sources of uncertainty. To apply formal robustness analysis, formulation of linear fractional transformation (LFT) models of complex parameter-dependent systems is required, which represent system uncertainty due to parameter uncertainty and actuator faults. This paper describes a detailed LFT model formulation procedure from the nonlinear model of a transport aircraft by using a preliminary LFT modeling software tool developed at the NASA Langley Research Center, which utilizes a matrix-based computational approach. The closed-loop system is evaluated over the entire flight envelope based on the generated LFT model which can cover nonlinear dynamics. The robustness analysis results of the closed-loop fault tolerant control system of a transport aircraft are presented. A reliable flight envelope (safe flight regime) is also calculated from the robust performance analysis results, over which the closed-loop system can achieve the desired performance of command tracking and failure detection.

  7. Integrated ground-water monitoring strategy for NRC-licensed facilities and sites: Case study applications

    USGS Publications Warehouse

    Price, V.; Temples, T.; Hodges, R.; Dai, Z.; Watkins, D.; Imrich, J.

    2007-01-01

    This document discusses results of applying the Integrated Ground-Water Monitoring Strategy (the Strategy) to actual waste sites using existing field characterization and monitoring data. The Strategy is a systematic approach to dealing with complex sites. Application of such a systematic approach will reduce uncertainty associated with site analysis, and therefore uncertainty associated with management decisions about a site. The Strategy can be used to guide the development of a ground-water monitoring program or to review an existing one. The sites selected for study fall within a wide range of geologic and climatic settings, waste compositions, and site design characteristics and represent realistic cases that might be encountered by the NRC. No one case study illustrates a comprehensive application of the Strategy using all available site data. Rather, within each case study we focus on certain aspects of the Strategy, to illustrate concepts that can be applied generically to all sites. The test sites selected include: the Charleston, South Carolina, Naval Weapons Station; Brookhaven National Laboratory on Long Island, New York; the USGS Amargosa Desert Research Site in Nevada; Rocky Flats in Colorado; C-Area at the Savannah River Site in South Carolina; and the Hanford 300 Area. A Data Analysis section provides examples of detailed data analysis of monitoring data.

  8. Modeling Nitrogen Dynamics in a Waste Stabilization Pond System Using Flexible Modeling Environment with MCMC

    PubMed Central

    Mukhtar, Hussnain; Lin, Yu-Pin; Shipin, Oleg V.; Petway, Joy R.

    2017-01-01

    This study presents an approach for obtaining realization sets of parameters for nitrogen removal in a pilot-scale waste stabilization pond (WSP) system. The proposed approach was designed for optimal parameterization, local sensitivity analysis, and global uncertainty analysis of a dynamic simulation model for the WSP by using the R software package Flexible Modeling Environment (R-FME) with the Markov chain Monte Carlo (MCMC) method. Additionally, generalized likelihood uncertainty estimation (GLUE) was integrated into the FME to evaluate the major parameters that affect the simulation outputs in the study WSP. Comprehensive modeling analysis was used to simulate and assess nine parameters and concentrations of ON-N, NH3-N and NO3-N. Results indicate that the integrated FME-GLUE-based model, with good Nash–Sutcliffe coefficients (0.53–0.69) and correlation coefficients (0.76–0.83), successfully simulates the concentrations of ON-N, NH3-N and NO3-N. Moreover, the Arrhenius constant was the only parameter sensitive to model performance for the ON-N and NH3-N simulations. However, the Nitrosomonas growth rate, the denitrification constant, and the maximum growth rate at 20 °C were sensitive to the ON-N and NO3-N simulations, as measured by global sensitivity analysis. PMID:28704958
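    A highly simplified sketch of the GLUE idea used above, with one invented parameter and a stand-in model rather than the WSP system: sample parameter sets, score each with a likelihood measure such as NSE, retain "behavioural" sets above a threshold, and form likelihood-weighted estimates.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented observations and a hypothetical 1-parameter stand-in model
obs = np.array([2.0, 4.0, 6.0, 8.0])

def model(k):
    return k * np.array([1.0, 2.0, 3.0, 4.0])

# 1. Monte Carlo sampling of the parameter from a broad prior range
ks = rng.uniform(0.5, 3.5, size=5000)
sims = np.array([model(k) for k in ks])

# 2. Informal likelihood measure: NSE of each simulation
sse = np.sum((sims - obs) ** 2, axis=1)
nse_vals = 1.0 - sse / np.sum((obs - obs.mean()) ** 2)

# 3. Keep "behavioural" sets (NSE > 0.5) and weight by likelihood
behavioural = nse_vals > 0.5
weights = nse_vals[behavioural] / nse_vals[behavioural].sum()
weighted_mean_k = np.sum(weights * ks[behavioural])
print(behavioural.sum(), weighted_mean_k)
```

    Prediction bounds then come from the likelihood-weighted spread of the behavioural simulations; the 0.5 threshold here is an arbitrary illustration of the subjective cutoff GLUE requires.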

  9. Mixed oxidizer hybrid propulsion system optimization under uncertainty using applied response surface methodology and Monte Carlo simulation

    NASA Astrophysics Data System (ADS)

    Whitehead, James Joshua

    The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. 
Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
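    The coupling of a quadratic response surface with Monte Carlo dispersion can be sketched as follows, with invented coefficients and input distributions rather than the actual regression-rate data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical quadratic response surface for fuel regression rate as a
# function of two coded mixture factors x1, x2 (coefficients invented):
# r = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
b0, b1, b2, b11, b22, b12 = 1.0, 0.30, 0.15, -0.20, -0.10, 0.05

def regression_rate(x1, x2):
    return (b0 + b1 * x1 + b2 * x2
            + b11 * x1 ** 2 + b22 * x2 ** 2 + b12 * x1 * x2)

# Monte Carlo dispersion: propagate input uncertainty in the mixture
# factors (means and spreads are illustrative) through the surface.
n = 100_000
x1 = rng.normal(0.5, 0.05, size=n)
x2 = rng.normal(0.2, 0.05, size=n)
r = regression_rate(x1, x2)
print(r.mean(), r.std())
```

    The cheapness of evaluating the fitted surface (versus re-running experiments) is what makes dispersing thousands of samples per design point practical, and an optimizer can then search the design space using these dispersed statistics.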

  10. Uncertainty and sensitivity assessment of flood risk assessments

    NASA Astrophysics Data System (ADS)

    de Moel, H.; Aerts, J. C.

    2009-12-01

    Floods are among the most frequent and costly natural disasters. To protect human lives and valuable assets from the effects of floods, many defensive structures have been built. Despite these efforts, economic losses due to catastrophic flood events have risen substantially during the past couple of decades because of continuing economic development in flood-prone areas. In addition, climate change is expected to affect the magnitude and frequency of flood events. Because these ongoing trends are expected to continue, a transition can be observed in various countries from a protective flood management approach to a more risk-based one. In a risk-based approach, flood risk assessments play an important role in supporting decision making. Most flood risk assessments express flood risk in monetary terms (damage estimated for specific situations, or expected annual damage) in order to feed cost-benefit analyses of management measures. Such flood risk assessments contain, however, considerable uncertainties. These result from uncertainties in the many different input parameters propagating through the risk assessment and accumulating in the final estimate. While common in some other disciplines, such as integrated assessment modelling, full uncertainty and sensitivity analyses of flood risk assessments are rare. Various studies have addressed uncertainties in flood risk assessments, but have mainly focused on the hydrological conditions. However, uncertainties in other components of the risk assessment, such as the relation between water depth and monetary damage, can be substantial as well. This research therefore assesses the uncertainties of all components of monetary flood risk assessments, using a Monte Carlo based approach. Furthermore, the total uncertainty is attributed to the different input parameters using a variance-based sensitivity analysis. 
Assessing and visualizing the uncertainty of the final risk estimate helps decision makers make better-informed decisions, and attributing this uncertainty to the input parameters identifies which parameters matter most for uncertainty in the final estimate and therefore deserve additional attention in further research.
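    A minimal sketch of the Monte Carlo risk propagation and variance-based sensitivity analysis described above, using a Saltelli-style pick-freeze estimator of first-order Sobol indices. The depth-damage relation and all input distributions are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def flood_damage(depth, value, vuln):
    # Hypothetical depth-damage relation: the damaged fraction of asset
    # value grows linearly with water depth, capped at total loss.
    return value * np.clip(vuln * depth, 0.0, 1.0)

def sample(n):
    # Illustrative input distributions for a single flood scenario.
    return dict(depth=rng.gamma(2.0, 0.5, n),          # water depth (m)
                value=rng.normal(250_000, 25_000, n),  # exposed asset value
                vuln=rng.uniform(0.2, 0.5, n))         # damage fraction per metre

A, B = sample(N), sample(N)
yA, yB = flood_damage(**A), flood_damage(**B)

# Pick-freeze estimator of first-order Sobol indices: replace one input
# column of A with the corresponding column of B and correlate outputs.
S1 = {}
for name in A:
    mixed = {**A, name: B[name]}
    S1[name] = np.mean(yB * (flood_damage(**mixed) - yA)) / yA.var()
```

With these illustrative distributions the water depth, having by far the largest relative spread, dominates the output variance.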

  11. Integrating chronological uncertainties for annually laminated lake sediments using layer counting, independent chronologies and Bayesian age modelling (Lake Ohau, South Island, New Zealand)

    NASA Astrophysics Data System (ADS)

    Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher

    2018-05-01

    Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.

  12. RECOVERY ACT - Methods for Decision under Technological Change Uncertainty and Risk Assessment for Integrated Assessment of Climate Change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Mort David

    2015-03-10

    This report presents the final outcomes and products of the project as performed at the Massachusetts Institute of Technology. The research project consists of three main components: methodology development for decision-making under uncertainty, improving the resolution of the electricity sector to improve integrated assessment, and application of these methods to integrated assessment. Results in each area are described in the report.

  13. Integrating adaptive management and ecosystem services concepts to improve natural resource management: Challenges and opportunities

    USGS Publications Warehouse

    Epanchin-Niell, Rebecca S.; Boyd, James W.; Macauley, Molly K.; Scarlett, Lynn; Shapiro, Carl D.; Williams, Byron K.

    2018-05-07

    Executive Summary: Overview. Natural resource managers must make decisions that affect broad-scale ecosystem processes involving large spatial areas, complex biophysical interactions, numerous competing stakeholder interests, and highly uncertain outcomes. Natural and social science information and analyses are widely recognized as important for informing effective management. Chief among the systematic approaches for improving the integration of science into natural resource management are two emergent science concepts, adaptive management and ecosystem services. Adaptive management (also referred to as “adaptive decision making”) is a deliberate process of learning by doing that focuses on reducing uncertainties about management outcomes and system responses to improve management over time. Ecosystem services is a conceptual framework that refers to the attributes and outputs of ecosystems (and their components and functions) that have value for humans. This report explores how ecosystem services can be moved from concept into practice through connection to a decision framework, adaptive management, that accounts for inherent uncertainties. Simultaneously, the report examines the value of incorporating ecosystem services framing and concepts into adaptive management efforts. Adaptive management and ecosystem services analyses have not typically been used jointly in decision making. However, as frameworks, they have a natural, but to date underexplored, affinity. Both are policy and decision oriented in that they attempt to represent the consequences of resource management choices on outcomes of interest to stakeholders. Both adaptive management and ecosystem services analysis take an empirical approach to the analysis of ecological systems. This systems orientation is a byproduct of the fact that natural resource actions affect ecosystems, and corresponding societal outcomes, often across large geographic scales. 
Moreover, because both frameworks focus on resource systems, both must confront the analytical challenges of systems modeling in terms of complexity, dynamics, and uncertainty. Given this affinity, the integration of ecosystem services analysis and adaptive management poses few conceptual hurdles. In this report, we synthesize discussions from two workshops that considered ways in which adaptive management approaches and ecosystem service concepts may be complementary, such that integrating them into a common framework may lead to improved natural resource management outcomes. Although the literature on adaptive management and ecosystem services is vast and growing, the report focuses specifically on the integration of these two concepts rather than aiming to provide new definitions or an in-depth review or primer of the concepts individually. Key issues considered include the bidirectional links between adaptive decision making and ecosystem services, as well as the potential benefits and inevitable challenges arising in the development and use of an integrated framework. Specifically, the workshops addressed the following questions: How can application of ecosystem service analysis within an adaptive decision process improve the outcomes of management and advance understanding of ecosystem service identification, production, and valuation? How can these concepts be integrated in concept and practice? What are the constraints and challenges to integrating adaptive management and ecosystem services? And should the integration of these concepts be moved forward to wider application, and if so, how?

  14. Hierarchical Bayesian analysis to incorporate age uncertainty in growth curve analysis and estimates of age from length: Florida manatee (Trichechus manatus) carcasses

    USGS Publications Warehouse

    Schwarz, L.K.; Runge, M.C.

    2009-01-01

    Age estimation of individuals is often an integral part of species management research, and a number of age-estimation techniques are commonly employed. Often, the error in these techniques is not quantified or accounted for in other analyses, particularly in growth curve models used to describe physiological responses to environment and human impacts. Also, noninvasive, quick, and inexpensive methods to estimate age are needed. This research aims to provide two Bayesian methods to (i) incorporate age uncertainty into an age-length Schnute growth model and (ii) produce a method from the growth model to estimate age from length. The methods are then employed for Florida manatee (Trichechus manatus) carcasses. After quantifying the uncertainty in the aging technique (counts of ear bone growth layers), we fit age-length data to the Schnute growth model separately by sex and season. Independent prior information about population age structure and the results of the Schnute model are then combined to estimate age from length. Results describing the age-length relationship agree with our understanding of manatee biology. The new methods allow us to estimate age, with quantified uncertainty, for 98% of collected carcasses: 36% from ear bones, 62% from length.
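    A sketch of the Schnute growth curve at the core of the age-length model, with a deterministic inversion from length to age. Parameter values are illustrative stand-ins, and the paper's hierarchical Bayesian treatment (age-reading error, priors on population age structure) is omitted.

```python
import numpy as np

# Schnute (1981) growth curve, general case (a != 0, b != 0).
# y1, y2 are lengths (cm) at reference ages tau1, tau2 (years);
# values below are illustrative, not the fitted manatee estimates.
def schnute_length(t, y1=150.0, y2=320.0, tau1=0.0, tau2=30.0, a=0.15, b=1.0):
    frac = (1 - np.exp(-a * (t - tau1))) / (1 - np.exp(-a * (tau2 - tau1)))
    return (y1**b + (y2**b - y1**b) * frac) ** (1 / b)

# Inverting the curve gives a point estimate of age from length; the paper
# instead combines the growth model with a prior age structure to obtain a
# posterior age distribution, which this deterministic sketch omits.
def age_from_length(L, y1=150.0, y2=320.0, tau1=0.0, tau2=30.0, a=0.15, b=1.0):
    frac = (L**b - y1**b) / (y2**b - y1**b)
    return tau1 - np.log(1 - frac * (1 - np.exp(-a * (tau2 - tau1)))) / a
```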

  15. IMPACT: Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking

    NASA Astrophysics Data System (ADS)

    Koller, J.; Brennan, S.; Godinez, H. C.; Higdon, D. M.; Klimenko, A.; Larsen, B.; Lawrence, E.; Linares, R.; McLaughlin, C. A.; Mehta, P. M.; Palmer, D.; Ridley, A. J.; Shoemaker, M.; Sutton, E.; Thompson, D.; Walker, A.; Wohlberg, B.

    2013-12-01

    Low-Earth-orbiting satellites experience atmospheric drag from thermospheric density, which varies by several orders of magnitude, especially during space weather events. Solar flares, precipitating particles, and ionospheric currents cause the upper atmosphere to heat up, redistribute, and cool again. These processes are intrinsically included in empirical models, e.g., MSIS and Jacchia-Bowman type models. However, sensitivity analysis has shown that atmospheric drag has the highest influence on satellite conjunction analysis, and empirical models still do not achieve the desired accuracy. Space debris and collision avoidance have become an operational reality. It is paramount to accurately predict satellite orbits and include drag effects driven by space weather. The IMPACT project (Integrated Modeling of Perturbations in Atmospheres for Conjunction Tracking), funded with over $5 million by the Los Alamos Laboratory Directed Research and Development office, has the goal of developing an integrated system of atmospheric drag modeling, orbit propagation, and conjunction analysis with detailed uncertainty quantification to address the space debris and collision avoidance problem. Now over two years into the project, we have developed an integrated solution combining physics-based density modeling of the upper atmosphere between 120 and 700 km altitude, satellite drag forecasting for quiet and disturbed geomagnetic conditions, and conjunction analysis with non-Gaussian uncertainty quantification. 
We are employing several novel approaches, including a unique observational sensor developed at Los Alamos; machine learning with a support-vector machine model of the coupling between solar drivers of the upper atmosphere and satellite drag; rigorous data-assimilative modeling using a physics-based approach instead of empirical modeling of the thermosphere; and a computed-tomography method for extracting temporal maps of thermospheric densities from ground-based observations. The IMPACT framework is an open research framework enabling the exchange and testing of a variety of atmospheric density models, orbital propagators, drag coefficient models, ground-based observations, etc., and the study of their effect on conjunctions and uncertainty predictions. The framework is based on a modern service-oriented architecture controlled by a web interface and providing 3D visualizations. The goal of this project is to revolutionize the ability to monitor and track space objects during highly disturbed space weather conditions, provide suitable forecasts for satellite drag conditions and conjunction analysis, and enable the exchange of models, codes, and data in an open research environment. We will present capabilities and results of the IMPACT framework, including a demo of the control interface and visualizations.

  16. Probabilistic structural analysis methods and applications

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Wu, Y.-T.; Dias, B.; Rajagopal, K. R.

    1988-01-01

    An advanced algorithm for simulating the probabilistic distribution of structural responses due to statistical uncertainties in loads, geometry, material properties, and boundary conditions is reported. The method effectively combines an advanced algorithm for calculating probability levels for multivariate problems (fast probability integration) together with a general-purpose finite-element code for stress, vibration, and buckling analysis. Application is made to a space propulsion system turbine blade for which the geometry and material properties are treated as random variables.
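    As a point of comparison for fast probability integration, the same kind of failure probability can be estimated by crude Monte Carlo sampling of a simple strength-versus-stress limit state; FPI reaches comparable tail accuracy with far fewer model evaluations. The distributions and numbers below are illustrative, not the turbine-blade data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Limit state g = strength - stress; failure when g < 0.
# Strength is log-normal (median 500 MPa, ~8% spread); applied stress is
# normal. Both are illustrative placeholders for random material
# properties and loads.
N = 1_000_000
strength = rng.lognormal(mean=np.log(500.0), sigma=0.08, size=N)  # MPa
stress = rng.normal(380.0, 30.0, N)                               # MPa

p_fail = np.mean(stress > strength)  # Monte Carlo failure probability
```

The cost of this estimate is one million limit-state evaluations; fast probability integration approximates the same tail probability from a handful of carefully chosen evaluations, which is what makes it practical with a finite-element model in the loop.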

  17. Advances in audio source separation and multisource audio content retrieval

    NASA Astrophysics Data System (ADS)

    Vincent, Emmanuel

    2012-06-01

    Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.

  18. Probabilistic Mass Growth Uncertainties

    NASA Technical Reports Server (NTRS)

    Plumer, Eric; Elliott, Darren

    2013-01-01

    Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of an ongoing NASA effort to publish mass growth datasheets for adjusting single-point Technical Baseline Estimates (TBE) of the masses of space instruments as well as spacecraft, for both Earth-orbiting and deep space missions at various stages of a project's lifecycle. This paper also discusses the long-term strategy of NASA Headquarters in publishing similar results, using a variety of cost-driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.

  19. Characterization of Tactical Departure Scheduling in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Capps, Alan; Engelland, Shawn A.

    2011-01-01

    This paper analyzes the current-day utilization and performance of the tactical departure scheduling process in the National Airspace System (NAS) to understand the benefits of improving this process. The analysis used operational air traffic data from over 1,082,000 flights during January 2011. Specific metrics included the frequency of tactical departure scheduling, site-specific variances in the technology's utilization, departure time prediction compliance used in the tactical scheduling process, and the performance with which the current system can predict the airborne slot into which aircraft are scheduled from the airport surface. The operational data analysis indicates that significant room for improvement exists in the current system, primarily in reducing departure time prediction uncertainty. Results indicate that a significant number of tactically scheduled aircraft did not meet their scheduled departure slot due to departure time uncertainty. In addition to missed slots, the analysis identified increased controller workload associated with tactical departures that were subject to traffic management manual re-scheduling or controller swaps. An analysis of the departure time prediction accuracy achievable with a new integrated surface and tactical scheduling tool is provided to assess its benefit as a solution to the identified shortfalls. A list of the NAS facilities likely to receive the greatest benefit from the integrated surface and tactical scheduling technology is provided.

  20. Uncertainty and the Social Cost of Methane Using Bayesian Constrained Climate Models

    NASA Astrophysics Data System (ADS)

    Errickson, F. C.; Anthoff, D.; Keller, K.

    2016-12-01

    Social cost estimates of greenhouse gases are important for the design of sound climate policies and are also plagued by uncertainty. One major source of uncertainty stems from the simplified representation of the climate system used in the integrated assessment models that provide these social cost estimates. We explore how uncertainty over the social cost of methane varies with the way physical processes and feedbacks in the methane cycle are modeled by (i) coupling three different methane models to a simple climate model, (ii) using MCMC to perform a Bayesian calibration of the three coupled climate models that simulates direct sampling from the joint posterior probability density function (pdf) of model parameters, and (iii) producing probabilistic climate projections that are then used to calculate the Social Cost of Methane (SCM) with the DICE and FUND integrated assessment models. We find that including a temperature feedback in the methane cycle acts as an additional constraint during the calibration process and results in a correlation between the tropospheric lifetime of methane and several climate model parameters. This correlation is not seen in the models lacking this feedback. Several of the estimated marginal pdfs of the model parameters also exhibit different distributional shapes and expected values depending on the methane model used. As a result, probabilistic projections of the climate system out to the year 2300 exhibit different levels of uncertainty and magnitudes of warming for each of the three models under an RCP8.5 scenario. We find these differences in climate projections result in differences in the distributions and expected values for our estimates of the SCM. We also examine uncertainty about the SCM by performing a Monte Carlo analysis using a distribution for the climate sensitivity while holding all other climate model parameters constant. 
Our SCM estimates using the Bayesian calibration are lower and exhibit less uncertainty about extremely high values in the right tail of the distribution compared to the Monte Carlo approach. This finding has important climate policy implications and suggests previous work that accounts for climate model uncertainty by only varying the climate sensitivity parameter may overestimate the SCM.
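    The Bayesian calibration step can be illustrated with a toy one-parameter model and a random-walk Metropolis sampler. The model, synthetic data, and prior below are invented stand-ins, not the coupled methane-climate models of the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "climate model": temperature response T = sens * forcing, with a
# flat prior on sens over (0, 5) K per W/m^2 and Gaussian observation
# noise. All numbers are illustrative.
forcing = np.linspace(0.0, 3.0, 40)            # W/m^2
true_sens = 0.8                                # K per W/m^2
obs = true_sens * forcing + rng.normal(0, 0.1, forcing.size)

def log_post(sens):
    if not (0.0 < sens < 5.0):                 # flat prior support
        return -np.inf
    resid = obs - sens * forcing
    return -0.5 * np.sum((resid / 0.1) ** 2)   # Gaussian log-likelihood

# Random-walk Metropolis: accept a proposal with probability
# min(1, posterior ratio), then discard burn-in.
samples, cur = [], 1.0
for _ in range(20_000):
    prop = cur + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(cur):
        cur = prop
    samples.append(cur)
posterior = np.array(samples[5_000:])
```

Probabilistic projections then follow by running the model forward for each posterior draw of the parameters, which is how the calibrated uncertainty propagates into the social cost estimates.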

  1. Reliability of a new biokinetic model of zirconium in internal dosimetry: part II, parameter sensitivity analysis.

    PubMed

    Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph

    2011-12-01

    The reliability of biokinetic models is essential for the assessment of internal doses and a radiation risk analysis for the public and occupational workers exposed to radionuclides. In the present study, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. In the first part of the paper, the parameter uncertainty was analyzed for two biokinetic models of zirconium (Zr); one was reported by the International Commission on Radiological Protection (ICRP), and one was developed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU). In the second part of the paper, the parameter uncertainties and distributions of the Zr biokinetic models evaluated in Part I are used as the model inputs for identifying the most influential parameters in the models. Furthermore, the model parameter with the most influence on the integral of the radioactivity of Zr over 50 y in source organs after ingestion was identified. The results of the systemic HMGU Zr model showed that over the first 10 d, the transfer rates between blood and other soft tissues have the largest influence on the content of Zr in the blood and the daily urinary excretion; however, after day 1,000, the transfer rate from bone to blood becomes dominant. For retention in bone, the transfer rate from blood to bone surfaces has the most influence out to the endpoint of the simulation; the transfer rate from blood to the upper large intestine contributes substantially at later times, i.e., after day 300. The alimentary tract absorption factor (fA) mostly influences the integral of radioactivity of Zr in most source organs after ingestion.
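    The kind of transfer-rate sensitivity examined above can be illustrated with a minimal two-compartment model. The rate constants are illustrative placeholders, not the ICRP or HMGU Zr parameters.

```python
# Minimal two-compartment biokinetic sketch (blood <-> bone, plus urinary
# excretion), with transfer rates in 1/day. Rate values are illustrative
# placeholders, not the ICRP or HMGU Zr parameters.
def bone_retention(k_blood_bone=0.1, k_bone_blood=0.001, k_excrete=0.5,
                   days=1000.0, dt=0.1):
    blood, bone = 1.0, 0.0                    # unit intake into blood at t=0
    for _ in range(int(days / dt)):           # forward-Euler integration
        d_blood = -(k_blood_bone + k_excrete) * blood + k_bone_blood * bone
        d_bone = k_blood_bone * blood - k_bone_blood * bone
        blood += d_blood * dt
        bone += d_bone * dt
    return bone                               # retained fraction in bone

# Local sensitivity of long-term bone retention to one transfer rate,
# estimated by a central finite difference.
k0, h = 0.1, 1e-4
sens = (bone_retention(k_blood_bone=k0 + h)
        - bone_retention(k_blood_bone=k0 - h)) / (2 * h)
```

Repeating the finite-difference perturbation for every rate constant, and at several time points, reproduces in miniature the ranking of influential parameters the paper performs for the full Zr models.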

  2. Uncertainty analysis of the radiological characteristics of radioactive waste using a method based on log-normal distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gigase, Yves

    2007-07-01

    Available in abstract form only. Full text of publication follows: The uncertainty on the characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package, one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide, for example, quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other, more complex characteristics such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, in particular in those decision processes where the uncertainty on the amount of activity is considered important, such as in probabilistic risk assessment or the definition of criteria for acceptance or categorization. (author)
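    The core convenience of the log-normal approach is that multiplicative uncertainties combine analytically: for independent log-normal factors, log-medians add and log-variances add. A sketch with invented numbers, in the spirit of the scaling factor method (inferred activity = measured key-nuclide activity times a scaling factor):

```python
import math

# If A ~ LogNormal(median_a, gsd_a) and F ~ LogNormal(median_f, gsd_f)
# are independent, the product A*F is log-normal with log-medians summed
# and log-standard-deviations combined in quadrature.
def combine_lognormal(median_a, gsd_a, median_f, gsd_f):
    mu = math.log(median_a) + math.log(median_f)
    sigma = math.hypot(math.log(gsd_a), math.log(gsd_f))
    median = math.exp(mu)
    lo = math.exp(mu - 1.96 * sigma)   # 95% uncertainty interval
    hi = math.exp(mu + 1.96 * sigma)
    return median, lo, hi

# Illustrative: key-nuclide activity 10 (GSD 1.5) times scaling factor
# 0.02 (GSD 2.0); units and values are invented.
median, lo, hi = combine_lognormal(10.0, 1.5, 0.02, 2.0)
```

Because the combination stays log-normal, the interval is asymmetric around the median, which is exactly the 'makes sense' behaviour for strictly positive activities that the paper highlights.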

  3. A Comprehensive Analysis of Uncertainties Affecting the Stellar Mass-Halo Mass Relation for 0 < z < 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behroozi, Peter S.; Conroy, Charlie; Wechsler, Risa H.

    2010-06-07

    We conduct a comprehensive analysis of the relationship between central galaxies and their host dark matter halos, as characterized by the stellar mass - halo mass (SM-HM) relation, with rigorous consideration of uncertainties. Our analysis focuses on results from the abundance matching technique, which assumes that every dark matter halo or subhalo above a specific mass threshold hosts one galaxy. We provide a robust estimate of the SM-HM relation for 0 < z < 1 and discuss the quantitative effects of uncertainties in observed galaxy stellar mass functions (GSMFs) (including stellar mass estimates and counting uncertainties), halo mass functions (including cosmology and uncertainties from substructure), and the abundance matching technique used to link galaxies to halos (including scatter in this connection). Our analysis results in a robust estimate of the SM-HM relation and its evolution from z=0 to z=4. The shape and evolution are well constrained for z < 1. The largest uncertainties at these redshifts are due to stellar mass estimates (0.25 dex uncertainty in normalization); however, failure to account for scatter in stellar masses at fixed halo mass can lead to errors of similar magnitude in the SM-HM relation for central galaxies in massive halos. We also investigate the SM-HM relation to z = 4, although the shape of the relation at higher redshifts remains fairly unconstrained when uncertainties are taken into account. We find that the integrated star formation at a given halo mass peaks at 10-20% of available baryons for all redshifts from 0 to 4. This peak occurs at a halo mass of 7 × 10^11 M⊙ at z = 0 and this mass increases by a factor of 5 to z = 4. At lower and higher masses, star formation is substantially less efficient, with stellar mass scaling as M* ~ M_h^2.3 at low masses and M* ~ M_h^0.29 at high masses. 
The typical stellar mass for halos with mass less than 10^12 M⊙ has increased by 0.3-0.45 dex since z ~ 1. These results will provide a powerful tool to inform galaxy evolution models.
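    In its simplest zero-scatter form, the abundance matching step described above reduces to rank-ordering both mass lists in a fixed volume: the most massive halo hosts the most massive galaxy, and so on down. The synthetic masses below are placeholders, and the scatter the paper emphasizes is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic halo and galaxy stellar masses (arbitrary units); in practice
# these come from a halo mass function and an observed GSMF at matched
# number densities.
halo_mass = rng.lognormal(mean=27.0, sigma=1.0, size=1000)
stellar_mass = rng.lognormal(mean=23.0, sigma=1.2, size=1000)

# Zero-scatter rank-order matching: assign the i-th largest galaxy to the
# i-th largest halo.
order = np.argsort(-halo_mass)                 # halos, most massive first
assigned = np.empty_like(stellar_mass)
assigned[order] = np.sort(stellar_mass)[::-1]  # galaxies, most massive first
```

The (halo_mass, assigned) pairs trace out a monotonic SM-HM relation; adding log-normal scatter to the assignment, as the paper's analysis requires, breaks this strict monotonicity.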

  4. Methods for exploring uncertainty in groundwater management predictions

    USGS Publications Warehouse

    Guillaume, Joseph H. A.; Hunt, Randall J.; Comunian, Alessandro; Fu, Baihua; Blakers, Rachel S; Jakeman, Anthony J.; Barreteau, Olivier; Hunt, Randall J.; Rinaudo, Jean-Daniel; Ross, Andrew

    2016-01-01

    Models of groundwater systems help to integrate knowledge about the natural and human system covering different spatial and temporal scales, often from multiple disciplines, in order to address a range of issues of concern to various stakeholders. A model is simply a tool to express what we think we know. Uncertainty, due to lack of knowledge or natural variability, means that there are always alternative models that may need to be considered. This chapter provides an overview of uncertainty in models and in the definition of a problem to model, highlights approaches to communicating and using predictions of uncertain outcomes and summarises commonly used methods to explore uncertainty in groundwater management predictions. It is intended to raise awareness of how alternative models and hence uncertainty can be explored in order to facilitate the integration of these techniques with groundwater management.

  5. Probabilistic risk assessment for CO2 storage in geological formations: robust design and support for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Oladyshkin, Sergey; Class, Holger; Helmig, Rainer; Nowak, Wolfgang

    2010-05-01

    CO2 storage in geological formations is currently being discussed intensively as a technology for mitigating CO2 emissions. However, any large-scale application requires a thorough analysis of the potential risks. Current numerical simulation models are too expensive for probabilistic risk analysis and for stochastic approaches based on brute-force repeated simulation. Even single deterministic simulations may require parallel high-performance computing. The multiphase flow processes involved are too non-linear for quasi-linear error propagation and other simplified stochastic tools. As an alternative approach, we propose a massive stochastic model reduction based on the probabilistic collocation method. The model response is projected onto an orthogonal basis of higher-order polynomials to approximate dependence on uncertain parameters (porosity, permeability, etc.) and design parameters (injection rate, depth, etc.). This allows for a non-linear propagation of model uncertainty affecting the predicted risk, ensures fast computation and provides a powerful tool for combining design variables and uncertain variables into one approach based on an integrative response surface. Thus, the design task of finding optimal injection regimes explicitly includes uncertainty, which leads to robust designs of the non-linear system that minimize failure probability and provide valuable support for risk-informed management decisions. We validate our proposed stochastic approach by Monte Carlo simulation using a common 3D benchmark problem (Class et al. Computational Geosciences 13, 2009). A reasonable compromise between computational efforts and precision was reached already with second-order polynomials. In our case study, the proposed approach yields a significant computational speedup by a factor of 100 compared to Monte Carlo simulation. 
We demonstrate that, due to the non-linearity of the flow and transport processes during CO2 injection, including uncertainty in the analysis leads to a systematic and significant shift of predicted leakage rates towards higher values compared with deterministic simulations, affecting both risk estimates and the design of injection scenarios. This implies that neglecting uncertainty can be a strong simplification in modeling CO2 injection, with consequences stronger than those of neglecting several physical phenomena (e.g., phase transition, convective mixing, capillary forces). The authors would like to thank the German Research Foundation (DFG) for financial support of the project within the Cluster of Excellence in Simulation Technology (EXC 310/1) at the University of Stuttgart. Keywords: polynomial chaos; CO2 storage; multiphase flow; porous media; risk assessment; uncertainty; integrative response surfaces
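    The probabilistic collocation idea can be sketched in a single uncertain dimension: fit probabilists' Hermite polynomials (orthogonal under the standard normal) to a handful of model runs, then reuse the cheap surrogate for Monte Carlo. The "expensive model" here is a stand-in nonlinear function, not a CO2 simulator.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(5)

# Stand-in for an expensive simulator with one standard-normal
# uncertain input; purely illustrative.
def expensive_model(x):
    return np.exp(0.3 * x) + 0.1 * x**2

# Collocation: evaluate the model at sampled points and project onto a
# second-order Hermite basis (He_0, He_1, He_2) by least squares.
x_train = rng.standard_normal(200)
V = hermevander(x_train, deg=2)
coef, *_ = np.linalg.lstsq(V, expensive_model(x_train), rcond=None)

# The surrogate is now cheap enough for brute-force Monte Carlo.
x_test = rng.standard_normal(100_000)
surrogate = hermevander(x_test, deg=2) @ coef
```

Because the basis is orthogonal under the input density, the surrogate's mean is simply the leading coefficient, mirroring how the paper reads risk statistics directly off the polynomial chaos expansion.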

  6. Stochastic Optimization For Water Resources Allocation

    NASA Astrophysics Data System (ADS)

    Yamout, G.; Hatfield, K.

    2003-12-01

    For more than 40 years, water resources allocation problems have been addressed using deterministic mathematical optimization. When data uncertainties exist, these methods can lead to solutions that are sub-optimal or even infeasible. While optimization models have been proposed for water resources decision-making under uncertainty, no attempts have been made to address the uncertainties in water allocation problems in an integrated manner. This paper presents an Integrated Dynamic, Multi-stage, Feedback-controlled, Linear, Stochastic, and Distributed-parameter optimization approach to solve a problem of water resources allocation. It attempts to capture (1) the conflict caused by competing objectives, (2) environmental degradation produced by resource consumption, and (3) the uncertainty and risk generated by the inherently random nature of the state and decision parameters involved in such a problem. A theoretical system is defined through its different elements. These elements, consisting mainly of water resource components and end-users, are described in terms of quantity, quality, and present and future associated risks and uncertainties. Models are identified, modified, and interfaced to constitute an integrated water allocation optimization framework. This effort is a novel approach to the water allocation optimization problem that accounts for uncertainties associated with all its elements, thus resulting in a solution that correctly reflects the physical problem at hand.
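
    The core point — that planning against the mean supply can be sub-optimal when supply is random — can be illustrated with a minimal stochastic allocation sketch. All numbers (benefit, penalty, scenarios) are hypothetical, and a grid search stands in for a proper stochastic program.

```python
import numpy as np

# Hypothetical single-source allocation: commit a delivery x now, then the
# seasonal supply S turns out random; shortfall is penalized more heavily
# than delivery is rewarded, so the optimal commitment hedges against
# dry scenarios instead of planning at the mean supply.
benefit = 10.0      # benefit per unit of water delivered
penalty = 25.0      # cost per unit of shortfall when S < x

scenarios = np.array([40.0, 70.0, 100.0])   # possible seasonal supplies
probs = np.array([0.2, 0.5, 0.3])

def expected_value(x):
    shortfall = np.maximum(x - scenarios, 0.0)
    delivered = np.minimum(x, scenarios)
    return np.dot(probs, benefit * delivered - penalty * shortfall)

xs = np.linspace(0.0, 120.0, 1201)
vals = np.array([expected_value(x) for x in xs])
x_stochastic = xs[vals.argmax()]

# Deterministic planning at the mean supply ignores variability entirely.
x_deterministic = float(np.dot(probs, scenarios))
print(f"stochastic optimum {x_stochastic:.1f}, "
      f"deterministic (mean-supply) plan {x_deterministic:.1f}")
```

    The stochastic optimum commits only to the supply guaranteed in most scenarios, while the mean-supply plan over-commits and pays the shortfall penalty in dry years.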

  7. Computer Model Inversion and Uncertainty Quantification in the Geosciences

    NASA Astrophysics Data System (ADS)

    White, Jeremy T.

    The subject of this dissertation is the use of computer models as data analysis tools in several different geoscience settings, including integrated surface water/groundwater modeling, tephra fallout modeling, geophysical inversion, and hydrothermal groundwater modeling. The dissertation is organized into three chapters, which correspond to three individual publication manuscripts. In the first chapter, a linear framework is developed to identify and estimate the potential predictive consequences of using a simple computer model as a data analysis tool. The framework is applied to a complex integrated surface-water/groundwater numerical model with thousands of parameters. Several types of predictions are evaluated, including particle travel time and surface-water/groundwater exchange volume. The analysis suggests that model simplifications have the potential to corrupt many types of predictions. The implementation of the inversion, including how the objective function is formulated, what minimum objective function value is acceptable, and how expert knowledge is enforced on parameters, can greatly influence the manifestation of model simplification. Depending on the prediction, failure to specifically address each of these important issues during inversion is shown to degrade the reliability of some predictions. In some instances, inversion is shown to increase, rather than decrease, the uncertainty of a prediction, which defeats the purpose of using a model as a data analysis tool. In the second chapter, an efficient inversion and uncertainty quantification approach is applied to a computer model of volcanic tephra transport and deposition. The computer model simulates many physical processes related to tephra transport and fallout. The utility of the approach is demonstrated for two eruption events. 
In both cases, the importance of uncertainty quantification is highlighted by exposing the variability in the conditioning provided by the observations used for inversion. The worth of different types of tephra data to reduce parameter uncertainty is evaluated, as is the importance of different observation error models. The analyses reveal the importance of using tephra granulometry data for inversion, which results in reduced uncertainty for most eruption parameters. In the third chapter, geophysical inversion is combined with hydrothermal modeling to evaluate the enthalpy of an undeveloped geothermal resource in a pull-apart basin located in southeastern Armenia. A high-dimensional gravity inversion is used to define the depth to the contact between the lower-density valley-fill sediments and the higher-density surrounding host rock. The inverted basin depth distribution was used to define the hydrostratigraphy for the coupled groundwater-flow and heat-transport model that simulates the circulation of hydrothermal fluids in the system. Evaluation of several different geothermal system configurations indicates that the most likely system configuration is a low-enthalpy, liquid-dominated geothermal system.
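
    The role of regularization (here playing the part of "expert knowledge enforced on parameters") in a high-dimensional linear inversion can be sketched with a toy Tikhonov example. The smoothing kernel and noise level are assumptions, not the dissertation's gravity kernel.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear forward model (e.g., a gravity-like kernel smoothing
# a basin-depth profile): d = G m + noise. The inversion is ill-posed, so
# a Tikhonov (ridge) term stabilizes the estimate, mirroring how prior
# knowledge regularizes high-dimensional inversions.
n = 60
x = np.linspace(0.0, 1.0, n)
m_true = np.exp(-((x - 0.5) / 0.15) ** 2)              # smooth "depth" profile
G = np.exp(-((x[:, None] - x[None, :]) / 0.1) ** 2)    # smoothing kernel rows
d = G @ m_true + rng.normal(0.0, 0.05, n)

def invert(alpha):
    A = G.T @ G + alpha * np.eye(n)
    return np.linalg.solve(A, G.T @ d)

m_reg = invert(1e-2)       # regularized estimate
m_naive = invert(1e-12)    # (nearly) unregularized: noise is amplified

err_reg = np.linalg.norm(m_reg - m_true)
err_naive = np.linalg.norm(m_naive - m_true)
print(f"model error: regularized {err_reg:.2f} vs naive {err_naive:.2f}")
```

    The naive inversion fits the data better but amplifies noise into wild parameter estimates — the same mechanism by which a poorly formulated objective function degrades predictive reliability.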

  8. Accounting for respondent uncertainty to improve willingness-to-pay estimates

    Treesearch

    Rebecca Moore; Richard C. Bishop; Bill Provencher; Patricia A. Champ

    2010-01-01

    In this paper, we develop an econometric model of willingness to pay (WTP) that integrates data on respondent uncertainty regarding their own WTP. The integration is utility consistent, there is no recoding of variables, and no need to calibrate the contingent responses to actual payment data, so the approach can "stand alone." In an application to a...

  9. Reducing uncertainty with flood frequency analysis: The contribution of paleoflood and historical flood information

    NASA Astrophysics Data System (ADS)

    Lam, Daryl; Thompson, Chris; Croke, Jacky; Sharma, Ashneel; Macklin, Mark

    2017-03-01

    Using a combination of stream gauge, historical, and paleoflood records to extend extreme flood records has proven useful in improving flood frequency analysis (FFA). The approach has typically been applied in localities with long historical records and/or river settings suitable for paleoflood reconstruction from slack-water deposits (SWDs). However, many regions around the world have neither extensive historical information nor bedrock gorges suitable for SWD preservation and paleoflood reconstruction. This study from subtropical Australia demonstrates that confined, semialluvial channels such as macrochannels provided relatively stable boundaries over a 1000-2000 year time period, and that the preserved SWDs enabled paleoflood reconstruction and its incorporation into FFA. FFA for three sites in subtropical Australia integrating historical and paleoflood data using Bayesian inference methods showed a significant reduction in the uncertainty associated with the estimated discharge of a flood quantile. The uncertainty associated with the estimated discharge for the 1% Annual Exceedance Probability (AEP) flood is reduced by more than 50%. In addition, sensitivity analysis of possible within-channel boundary changes shows that FFA is not significantly affected by any associated changes in channel capacity. Therefore, a greater range of channel types may be used for reliable paleoflood reconstruction by evaluating the stability of inset alluvial units, thereby increasing the quantity of temporal data available for FFA. The reduction in uncertainty, particularly in the prediction of the ≤1% AEP design flood, will improve flood risk planning and management in regions with limited temporal flood data.

  10. Hypersonic vehicle model and control law development using H(infinity) and mu synthesis

    NASA Astrophysics Data System (ADS)

    Gregory, Irene M.; Chowdhry, Rajiv S.; McMinn, John D.; Shaughnessy, John D.

    1994-10-01

    The control system design for a Single Stage To Orbit (SSTO) air-breathing vehicle will be central to a successful mission because a precise ascent trajectory will preserve narrow payload margins. The air-breathing propulsion system requires the vehicle to fly roughly halfway around the Earth through atmospheric turbulence. The turbulence, the high sensitivity of the propulsion system to inlet flow conditions, the relatively large uncertainty of the parameters characterizing the vehicle, and continuous acceleration make the problem especially challenging. Adequate stability margins must be provided without sacrificing payload mass, since payload margins are critical. Therefore, a multivariable control theory capable of explicitly including both uncertainty and performance is needed. The H(infinity) controller in general provides good robustness but can result in conservative solutions for practical problems involving structured uncertainty. The structured singular value (mu) framework for analysis and synthesis is potentially much less conservative and hence more appropriate for problems with tight margins. An SSTO control system requires highly accurate tracking of velocity and altitude commands with limited angle-of-attack oscillations, minimized control power usage, and a stabilized vehicle when atmospheric turbulence and system uncertainty are present. The controller designs obtained using H(infinity) and mu-synthesis procedures were compared. An integrated flight/propulsion dynamic mathematical model of a conical accelerator vehicle was linearized as the vehicle accelerated through Mach 8. Vehicle acceleration through the selected flight condition gives rise to parametric variation that was modeled as a structured uncertainty. The mu-analysis approach was used in the frequency domain to conduct controller analysis and was confirmed by time-history plots. Results demonstrate the inherent advantages of the mu framework for this class of problems.
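
    A cheap, non-guaranteed complement to mu-analysis is simply sampling the structured parameter box and checking closed-loop eigenvalues. The 2-state model, gains, and uncertainty ranges below are all illustrative; mu-analysis would bound the worst case over the same set rather than sample it.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 2-state short-period-like model with parametric uncertainty:
# xdot = A(p) x + B u, with fixed state feedback u = -K x.
B = np.array([[0.0], [1.0]])
K = np.array([[4.0, 3.0]])

def closed_loop(a21, a22):
    A = np.array([[0.0, 1.0], [a21, a22]])
    return A - B @ K

# Nominal values with +/-30% structured uncertainty on each entry.
a21_nom, a22_nom = 2.0, -0.5
worst = -np.inf
for _ in range(2000):
    a21 = a21_nom * rng.uniform(0.7, 1.3)
    a22 = a22_nom * rng.uniform(0.7, 1.3)
    eigs = np.linalg.eigvals(closed_loop(a21, a22))
    worst = max(worst, eigs.real.max())

print(f"worst sampled closed-loop eigenvalue real part: {worst:.3f}")
```

    Sampling can only falsify robustness, never prove it — which is precisely why the abstract argues for the mu framework when margins are tight.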

  11. A terrestrial lidar-based workflow for determining three-dimensional slip vectors and associated uncertainties

    USGS Publications Warehouse

    Gold, Peter O.; Cowgill, Eric; Kreylos, Oliver; Gold, Ryan D.

    2012-01-01

    Three-dimensional (3D) slip vectors recorded by displaced landforms are difficult to constrain across complex fault zones, and the uncertainties associated with such measurements become increasingly challenging to assess as landforms degrade over time. We approach this problem from a remote sensing perspective by using terrestrial laser scanning (TLS) and 3D structural analysis. We have developed an integrated TLS data collection and point-based analysis workflow that incorporates accurate assessments of aleatoric and epistemic uncertainties using experimental surveys, Monte Carlo simulations, and iterative site reconstructions. Our scanning workflow and equipment requirements are optimized for single-operator surveying, and our data analysis process is largely completed using new point-based computing tools in an immersive 3D virtual reality environment. In a case study, we measured slip vector orientations at two sites along the rupture trace of the 1954 Dixie Valley earthquake (central Nevada, United States), yielding measurements that are the first direct constraints on the 3D slip vector for this event. These observations are consistent with a previous approximation of net extension direction for this event. We find that errors introduced by variables in our survey method result in <2.5 cm of variability in components of displacement, and are eclipsed by the 10–60 cm epistemic errors introduced by reconstructing the field sites to their pre-erosion geometries. Although the higher resolution TLS data sets enabled visualization and data interactivity critical for reconstructing the 3D slip vector and for assessing uncertainties, dense topographic constraints alone were not sufficient to significantly narrow the wide (<26°) range of allowable slip vector orientations that resulted from accounting for epistemic uncertainties.
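
    The aleatoric (survey-error) part of the workflow can be sketched as a Monte Carlo propagation: perturb the two piercing points with Gaussian survey noise and look at the spread of the resulting slip vector. The coordinates and noise level below are hypothetical, chosen only to illustrate the centimetre-scale survey contribution the abstract reports.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical piercing points on either side of a fault (metres): the slip
# vector is their difference. Survey error is modelled as independent
# Gaussian noise on each coordinate of each point.
p_footwall = np.array([0.0, 0.0, 0.0])
p_hangingwall = np.array([1.8, 0.4, -0.9])   # lateral + vertical offset
sigma_survey = 0.01                           # 1 cm per coordinate, per point

n = 20_000
noisy_fw = p_footwall + rng.normal(0.0, sigma_survey, (n, 3))
noisy_hw = p_hangingwall + rng.normal(0.0, sigma_survey, (n, 3))
slip = noisy_hw - noisy_fw

net_slip = np.linalg.norm(slip, axis=1)
comp_std = slip.std(axis=0)                  # per-component spread
print("component std (m):", np.round(comp_std, 4))
print(f"net slip: {net_slip.mean():.3f} +/- {net_slip.std():.3f} m")
```

    Centimetre-level survey noise yields centimetre-level slip-vector spread, consistent with the abstract's finding that the decimetre-scale epistemic errors from site reconstruction dominate the budget.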

  12. Hypersonic vehicle model and control law development using H(infinity) and mu synthesis

    NASA Technical Reports Server (NTRS)

    Gregory, Irene M.; Chowdhry, Rajiv S.; Mcminn, John D.; Shaughnessy, John D.

    1994-01-01

    The control system design for a Single Stage To Orbit (SSTO) air-breathing vehicle will be central to a successful mission because a precise ascent trajectory will preserve narrow payload margins. The air-breathing propulsion system requires the vehicle to fly roughly halfway around the Earth through atmospheric turbulence. The turbulence, the high sensitivity of the propulsion system to inlet flow conditions, the relatively large uncertainty of the parameters characterizing the vehicle, and continuous acceleration make the problem especially challenging. Adequate stability margins must be provided without sacrificing payload mass, since payload margins are critical. Therefore, a multivariable control theory capable of explicitly including both uncertainty and performance is needed. The H(infinity) controller in general provides good robustness but can result in conservative solutions for practical problems involving structured uncertainty. The structured singular value (mu) framework for analysis and synthesis is potentially much less conservative and hence more appropriate for problems with tight margins. An SSTO control system requires highly accurate tracking of velocity and altitude commands with limited angle-of-attack oscillations, minimized control power usage, and a stabilized vehicle when atmospheric turbulence and system uncertainty are present. The controller designs obtained using H(infinity) and mu-synthesis procedures were compared. An integrated flight/propulsion dynamic mathematical model of a conical accelerator vehicle was linearized as the vehicle accelerated through Mach 8. Vehicle acceleration through the selected flight condition gives rise to parametric variation that was modeled as a structured uncertainty. The mu-analysis approach was used in the frequency domain to conduct controller analysis and was confirmed by time-history plots. Results demonstrate the inherent advantages of the mu framework for this class of problems.

  13. Predicting Changes in Arctic Tundra Vegetation: Towards an Understanding of Plant Trait Uncertainty

    NASA Astrophysics Data System (ADS)

    Euskirchen, E. S.; Serbin, S.; Carman, T.; Iversen, C. M.; Salmon, V.; Helene, G.; McGuire, A. D.

    2017-12-01

    Arctic tundra plant communities are currently undergoing unprecedented changes in both composition and distribution under a warming climate. Predicting how these dynamics may play out in the future is important since these vegetation shifts impact both biogeochemical and biogeophysical processes. Producing more precise estimates of these future vegetation shifts is challenging due to both a scarcity of data with which to parameterize vegetation models, particularly in the Arctic, and a limited understanding of the importance of each model parameter and how it may vary over space and time. Here, we incorporate newly available field data from arctic Alaska into a dynamic vegetation model specifically developed to account for a particularly wide array of plant species as well as the permafrost soils of the arctic tundra (the Terrestrial Ecosystem Model with Dynamic Vegetation and Dynamic Organic Soil; DVM-DOS-TEM). We integrate the model within the Predictive Ecosystem Analyzer (PEcAn), an open-source integrated ecological bioinformatics toolbox that facilitates the flow of information into and out of process models and supports model-data integration. We use PEcAn to evaluate the plant functional traits that contribute most to model variability based on a sensitivity analysis. We perform this analysis for the dominant types of tundra in arctic Alaska, including heath, shrub, tussock, and wet sedge tundra. The results from this analysis will help inform future data collection in arctic tundra and reduce model uncertainty, thereby improving our ability to simulate Arctic vegetation structure and function in response to global change.
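
    The simplest form of the parameter ranking described here is a one-at-a-time sweep: vary each trait over its plausible range with the others at defaults and record the output swing. The toy flux model, trait names, and ranges below are invented for illustration; PEcAn's actual sensitivity analysis is variance-based and model-driven.

```python
import numpy as np

# Toy flux model standing in for DVM-DOS-TEM: output depends strongly on
# SLA, weakly on the other traits (all parameters hypothetical).
defaults = {"sla": 12.0, "leaf_n": 2.0, "root_frac": 0.3, "q10": 2.0}
ranges = {"sla": (8.0, 20.0), "leaf_n": (1.5, 2.5),
          "root_frac": (0.2, 0.4), "q10": (1.8, 2.4)}

def model(p):
    return (0.8 * p["sla"] + 2.0 * p["leaf_n"]
            + 1.0 * p["root_frac"] + 0.5 * p["q10"])

# One-at-a-time sensitivity: output swing when one parameter sweeps its
# plausible range while the others stay at their defaults.
swing = {}
for name, (lo, hi) in ranges.items():
    vals = []
    for v in np.linspace(lo, hi, 11):
        p = dict(defaults)
        p[name] = v
        vals.append(model(p))
    swing[name] = max(vals) - min(vals)

ranking = sorted(swing, key=swing.get, reverse=True)
print("sensitivity ranking:", ranking)
```

    Rankings like this are what directs future field campaigns toward the traits whose uncertainty actually matters for the simulation.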

  14. Bidirectional active control of structures with type-2 fuzzy PD and PID

    NASA Astrophysics Data System (ADS)

    Paul, Satyam; Yu, Wen; Li, Xiaoou

    2018-03-01

    Proportional-derivative and proportional-integral-derivative (PD/PID) controllers are popular algorithms in structural vibration control. To maintain a small regulation error, PD/PID control requires large proportional and derivative gains, but control performance then suffers because of the large uncertainties in the building. In this paper, a type-2 fuzzy system is applied to compensate for the unknown uncertainties and is combined with PD/PID control. We prove the stability of these fuzzy PD and PID controllers, and the resulting sufficient conditions can be used to choose the PD/PID gains. The theoretical results are verified on a two-storey building prototype, and the experimental results validate our analysis.
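
    The baseline the paper builds on can be sketched with a single-degree-of-freedom structure under sinusoidal excitation: large PD gains sharply reduce the peak displacement. All structural and gain values are illustrative, and the type-2 fuzzy compensator itself is not reproduced — it would add a learned term to cancel the unmodelled disturbance.

```python
import numpy as np

# Minimal 1-DOF structure under sinusoidal excitation with PD control.
m, c, k = 1000.0, 200.0, 50_000.0        # mass, damping, stiffness (SI)
kp, kd = 4.0e5, 4.0e4                     # large PD gains, as the text notes

def simulate(controlled, t_end=10.0, dt=1e-3):
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(int(t_end / dt)):
        t = i * dt
        f = 2000.0 * np.sin(2.0 * np.pi * 1.0 * t)   # disturbance force, N
        u = -(kp * x + kd * v) if controlled else 0.0
        a = (f + u - c * v - k * x) / m
        x += v * dt                                   # explicit Euler step
        v += a * dt
        peak = max(peak, abs(x))
    return peak

peak_unc = simulate(False)
peak_pd = simulate(True)
print(f"peak displacement: uncontrolled {peak_unc:.4f} m, PD {peak_pd:.4f} m")
```

    The large gains buy this reduction at the cost of sensitivity to model uncertainty, which is the gap the fuzzy compensation is meant to close.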

  15. Coal-Based Fuel-Cell Powerplants

    NASA Technical Reports Server (NTRS)

    Ferral, J. F.; Pappano, A. W.; Jennings, C. N.

    1986-01-01

    Report assesses advanced-technology design alternatives for integrated coal-gasifier/fuel-cell powerplants. Various gasifier, cleanup, and fuel-cell options evaluated. Evaluation includes adjustments to assumed performances and costs of proposed technologies where required. Analysis identifies uncertainties remaining in designs, most promising alternatives, and research and development required to develop these technologies. Bulk of report is summary and detailed analysis of six major conceptual designs and variations of each. All designs are for a plant that uses Illinois No. 6 coal and produces 675 MW of net power.

  16. Neutron Thermal Cross Sections, Westcott Factors, Resonance Integrals, Maxwellian Averaged Cross Sections and Astrophysical Reaction Rates Calculated from the ENDF/B-VII.1, JEFF-3.1.2, JENDL-4.0, ROSFOND-2010, CENDL-3.1 and EAF-2010 Evaluated Data Libraries

    NASA Astrophysics Data System (ADS)

    Pritychenko, B.; Mughabghab, S. F.

    2012-12-01

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.
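
    The central quantity here, the Maxwellian-averaged cross section, is the integral MACS(kT) = (2/sqrt(pi)) (kT)^-2 ∫ sigma(E) E exp(-E/kT) dE. For a pure 1/v cross section the integral is analytic, MACS = sigma_th * sqrt(E_th/kT), which gives a clean check on a numerical quadrature. The nuclide values below are hypothetical.

```python
import numpy as np

# Maxwellian-averaged cross section (MACS) at temperature kT:
#   MACS = (2/sqrt(pi)) * (1/kT^2) * Integral[ sigma(E) * E * exp(-E/kT) dE ]
# For sigma(E) = sigma_th * sqrt(E_th/E) (pure 1/v), the analytic result is
# MACS = sigma_th * sqrt(E_th/kT).
kT = 30e3            # 30 keV in eV (a typical s-process temperature)
E_th = 0.0253        # thermal energy, eV
sigma_th = 5.0       # thermal cross section, barns (hypothetical nuclide)

E = np.linspace(1e-3, 40.0 * kT, 400_000)     # energy grid, eV
sigma = sigma_th * np.sqrt(E_th / E)
integrand = sigma * E * np.exp(-E / kT)

# Trapezoidal quadrature, written out explicitly.
integral = np.sum((integrand[1:] + integrand[:-1]) * np.diff(E)) / 2.0
macs_numeric = (2.0 / np.sqrt(np.pi)) * integral / kT**2
macs_analytic = sigma_th * np.sqrt(E_th / kT)
print(f"MACS numeric {macs_numeric:.6e} b, analytic {macs_analytic:.6e} b")
```

    In the actual libraries sigma(E) is pointwise-tabulated with resonances, so the quadrature grid must follow the evaluation's energy mesh rather than a uniform one.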

  17. A statistical approach to the brittle fracture of a multi-phase solid

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Lua, Y. I.; Belytschko, T.

    1991-01-01

    A stochastic damage model is proposed to quantify the inherent statistical distribution of the fracture toughness of a brittle, multi-phase solid. The model, based on macrocrack-microcrack interaction, incorporates uncertainties in the locations and orientations of microcracks. Due to the high concentration of microcracks near the macrocrack tip, a higher-order analysis based on traction boundary integral equations is first formulated for an arbitrary array of cracks. The effects of uncertainties in the locations and orientations of microcracks at a macrocrack tip are analyzed quantitatively by using the boundary integral equation method in conjunction with computer simulation of the random microcrack array. The short-range interactions resulting from the surrounding microcracks closest to the main crack tip are investigated. The effects of the microcrack density parameter are also explored in the present study. The validity of the present model is demonstrated by comparing its statistical output with the Neville distribution function, which gives correct fits to sets of experimental data from multi-phase solids.

  18. Evaluation of Neutron-induced Cross Sections and their Related Covariances with Physical Constraints

    NASA Astrophysics Data System (ADS)

    De Saint Jean, C.; Archier, P.; Privas, E.; Noguère, G.; Habert, B.; Tamagno, P.

    2018-02-01

    Nuclear data, along with numerical methods and the associated calculation schemes, continue to play a key role in reactor design, reactor core operating parameters calculations, fuel cycle management and criticality safety calculations. Due to the intensive use of Monte-Carlo calculations reducing numerical biases, the final accuracy of neutronic calculations increasingly depends on the quality of nuclear data used. This paper gives a broad picture of all ingredients treated by nuclear data evaluators during their analyses. After giving an introduction to nuclear data evaluation, we present implications of using the Bayesian inference to obtain evaluated cross sections and related uncertainties. In particular, a focus is made on systematic uncertainties appearing in the analysis of differential measurements as well as advantages and drawbacks one may encounter by analyzing integral experiments. The evaluation work is in general done independently in the resonance and in the continuum energy ranges giving rise to inconsistencies in evaluated files. For future evaluations on the whole energy range, we call attention to two innovative methods used to analyze several nuclear reaction models and impose constraints. Finally, we discuss suggestions for possible improvements in the evaluation process to master the quantification of uncertainties. These are associated with experiments (microscopic and integral), nuclear reaction theories and the Bayesian inference.
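
    The Bayesian update at the heart of such evaluations is, in its linear-Gaussian form, a generalized-least-squares step. The two-parameter toy below (all numbers illustrative) shows the two generic outcomes: posterior variances shrink, and a single integral measurement induces correlations between the parameters it constrains jointly.

```python
import numpy as np

# Generalized least squares (Bayesian) update of two model parameters from
# one integral experiment y = A x + e.
x_prior = np.array([1.0, 2.0])
P = np.diag([0.04, 0.09])                  # prior covariance
A = np.array([[1.0, 1.0]])                 # integral response: sum of both
V = np.array([[0.01]])                     # experiment variance
y = np.array([3.3])

S = A @ P @ A.T + V                        # innovation covariance
K = P @ A.T @ np.linalg.inv(S)             # gain
x_post = x_prior + K @ (y - A @ x_prior)
P_post = P - K @ A @ P

print("posterior:", x_post)
print("posterior covariance:\n", P_post)
```

    The negative off-diagonal term is the well-known caveat of integral experiments mentioned above: they constrain combinations of parameters, not the parameters individually.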

  19. Gaussian Process Interpolation for Uncertainty Estimation in Image Registration

    PubMed Central

    Wachinger, Christian; Golland, Polina; Reuter, Martin; Wells, William

    2014-01-01

    Intensity-based image registration requires resampling images on a common grid to evaluate the similarity function. The uncertainty of interpolation varies across the image, depending on the location of resampled points relative to the base grid. We propose to perform Bayesian inference with Gaussian processes, where the covariance matrix of the Gaussian process posterior distribution estimates the uncertainty in interpolation. The Gaussian process replaces a single image with a distribution over images that we integrate into a generative model for registration. Marginalization over resampled images leads to a new similarity measure that includes the uncertainty of the interpolation. We demonstrate that our approach increases the registration accuracy and propose an efficient approximation scheme that enables seamless integration with existing registration methods. PMID:25333127
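
    The key property — interpolation uncertainty that vanishes on the base grid and peaks between grid nodes — can be shown with a 1D Gaussian-process sketch. Kernel width, noise level, and the sine "image" are illustrative assumptions.

```python
import numpy as np

# 1D GP view of image interpolation: intensities observed on a base grid,
# posterior mean/variance evaluated at arbitrary resampling points. The
# posterior variance is the spatially varying interpolation uncertainty.
def rbf(a, b, ell=1.0):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

x_grid = np.arange(0.0, 6.0)               # base grid positions
y_grid = np.sin(x_grid)                    # observed "intensities"
noise = 1e-8                               # near-interpolating GP

K = rbf(x_grid, x_grid) + noise * np.eye(x_grid.size)
K_inv = np.linalg.inv(K)

def posterior(x_star):
    k = rbf(x_star, x_grid)
    mean = k @ K_inv @ y_grid
    var = 1.0 - np.sum((k @ K_inv) * k, axis=1)   # prior variance is 1
    return mean, np.maximum(var, 0.0)

x_test = np.array([2.0, 2.5])              # on-grid vs between-grid point
mean, var = posterior(x_test)
print("posterior mean:", mean, "variance:", var)
```

    In the paper this posterior covariance is what gets marginalized into the similarity measure, down-weighting resampled points that fall far from base-grid samples.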

  20. Coordination and Control of Flexible Building Loads for Renewable Integration; Demonstrations using VOLTTRON

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hao, He; Liu, Guopeng; Huang, Sen

    Renewable energy resources such as wind and solar power have a high degree of uncertainty. Large-scale integration of these variable generation sources into the grid is a big challenge for power system operators. Buildings, in which we live and work, consume about 75% of the total electricity in the United States. They also have a large capacity for power flexibility due to their massive thermal capacitance. Therefore, they present a great opportunity to help the grid manage power balance. In this report, we study coordination and control of flexible building loads for renewable integration. We first present the motivation and background and conduct a literature review on building-to-grid integration. We also compile a catalog of flexible building loads that have great potential for renewable integration and discuss their characteristics. We next collect solar generation data from a photovoltaic panel on the Pacific Northwest National Laboratory campus and analyze their characteristics. We find that solar generation output has strong uncertainty, and that the uncertainty occurs at almost all time scales. Additional data from other sources are also used to verify our study. We propose two transactive coordination strategies to manage flexible building loads for renewable integration, prove the theories that support them, and discuss their pros and cons. We select three types of flexible building loads—an air-handling unit, a rooftop unit, and a population of water heaters—for which we demonstrate control of the flexible load to track a dispatch signal (e.g., renewable generation fluctuation) using experiment, simulation, or hardware-in-the-loop study. More specifically, we present the system description, model identification, controller design, test-bed setup, and experiment results for each demonstration. 
We show that coordination and control of flexible loads has great potential to integrate variable generation sources. The flexible loads can successfully track a power dispatch signal from the coordinator while having little impact on the quality of service to the end-users.
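
    A minimal sketch of one load tracking a dispatch signal, assuming a first-order zone thermal model with illustrative parameters (the report's transactive coordination layer is not modeled): the load follows the dispatch exactly as long as the comfort band is respected, and reverts to baseline otherwise.

```python
import numpy as np

# First-order thermal model of a conditioned zone:
#   C * dT/dt = (T_out - T)/R - eta * P
# The load tracks a dispatch signal by modulating cooling power around its
# baseline, refusing requests that would violate the comfort band.
C, R, eta = 2.0, 2.0, 1.0                   # kWh/C, C/kW, efficiency factor
T_out, T_set, band = 30.0, 22.0, 1.0        # deg C
P_base = (T_out - T_set) / (R * eta)        # power holding T at setpoint

dt = 1.0 / 60.0                              # 1-minute steps, in hours
n = 240
dispatch = 1.5 * np.sin(np.arange(n) * 2.0 * np.pi / 120)  # kW fluctuation

T, track_err = T_set, []
for i in range(n):
    P_req = P_base + dispatch[i]
    # Revert to baseline if the comfort band would otherwise be violated.
    if (T > T_set + band and P_req < P_base) or \
       (T < T_set - band and P_req > P_base):
        P_req = P_base
    track_err.append(P_req - (P_base + dispatch[i]))
    T += dt * ((T_out - T) / R - eta * P_req) / C

rmse = float(np.sqrt(np.mean(np.array(track_err) ** 2)))
print(f"baseline {P_base:.2f} kW, tracking RMSE {rmse:.3f} kW, final T {T:.2f} C")
```

    With these parameters the thermal mass filters the dispatch fluctuation into a fraction-of-a-degree temperature swing, which is the mechanism behind "little impact on the quality of service."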

  1. Effects of Lambertian sources design on uniformity and measurements

    NASA Astrophysics Data System (ADS)

    Cariou, Nadine; Durell, Chris; McKee, Greg; Wilks, Dylan; Glastre, Wilfried

    2014-10-01

    Integrating sphere (IS) based uniform sources are a primary tool for ground-based calibration, characterization, and testing of flight radiometric equipment. The idea of a Lambertian field of energy is very useful in radiometric testing, but the concept is being tested in many ways by newly lowered uncertainty goals. At an uncertainty goal of 2%, uniformity must be carefully assessed in addition to calibration uncertainties, as even sources with 0.5% uniformity now represent a substantial proportion of the uncertainty budget. The paper explores integrating sphere design options for achieving 99.5% and better uniformity of the exit-port radiance and spectral irradiance. Uniformity in broad-spectrum and spectral bands is explored. We discuss mapping techniques and results as a function of observed uniformity, as well as laboratory testing results customized to match the customer's instrumentation field of view. We also discuss recommendations on the basic commercial instrumentation we have used to validate, inspect, and improve correlation of uniformity measurements with the intended application.
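
    The budget arithmetic behind the abstract's 2% claim can be made concrete with a hypothetical port-radiance map: a 0.5% non-uniformity, combined in quadrature with the remaining calibration uncertainty, is no longer negligible.

```python
import numpy as np

# Hypothetical 5x5 exit-port radiance scan (arbitrary units) with a 0.5%
# peak-to-peak gradient; non-uniformity quoted as (max-min)/(max+min).
scan = 1000.0 * (1.0 + 0.005 * np.cos(np.linspace(0.0, np.pi, 25)))
scan = scan.reshape(5, 5)

nonuni = (scan.max() - scan.min()) / (scan.max() + scan.min())
print(f"non-uniformity: {100 * nonuni:.2f} %")

# Quadrature (RSS) combination with an assumed 1.9% calibration uncertainty.
u_cal = 0.019
u_total = np.hypot(u_cal, nonuni)
print(f"total: {100 * u_total:.2f} % (calibration alone: {100 * u_cal:.2f} %)")
```

    At the 0.5% level the uniformity term moves the total budget only slightly, but a 1% non-uniform source would already push this budget past the 2% goal.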

  2. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods was established. Based on this system, we developed a probabilistic internal-dose-assessment code in MATLAB to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. 
In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value (e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
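
    The propagation chain described here (measured bioassay -> intake via the retention fraction m(t) -> committed dose) reduces to a short Monte Carlo loop. All activities, retention fractions, uncertainty magnitudes, and the dose coefficient below are hypothetical; the code is a sketch of the method, not of the MATLAB tool.

```python
import numpy as np

rng = np.random.default_rng(5)

# Monte Carlo bioassay evaluation: intake = measurement / m(t),
# dose = intake * dose coefficient, with lognormal uncertainty on the
# measurement and on the biokinetic retention fraction m(t).
n = 100_000
measured = 50.0 * rng.lognormal(0.0, 0.10, n)   # urine activity, Bq
m_t = 2.0e-3 * rng.lognormal(0.0, 0.40, n)      # retention fraction at day t
e_coef = 1.1e-5                                  # mSv per Bq intake (fixed)

intake = measured / m_t                          # Bq
dose = intake * e_coef                           # committed dose, mSv

pcts = np.percentile(dose, [2.5, 5, 50, 95, 97.5])
print("dose percentiles (mSv):", np.round(pcts, 3))
```

    With these assumed uncertainties the biokinetic term dominates the spread, which is exactly the kind of conclusion the distribution-based sensitivity analysis is meant to deliver.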

  3. Uncertainty Quantification and Regional Sensitivity Analysis of Snow-related Parameters in the Canadian LAnd Surface Scheme (CLASS)

    NASA Astrophysics Data System (ADS)

    Badawy, B.; Fletcher, C. G.

    2017-12-01

    The parameterization of snow processes in land surface models is an important source of uncertainty in climate simulations. Quantifying the importance of snow-related parameters, and their uncertainties, may therefore lead to better understanding and quantification of uncertainty within integrated earth system models. However, quantifying the uncertainty arising from parameterized snow processes is challenging due to the high-dimensional parameter space, poor observational constraints, and parameter interaction. In this study, we investigate the sensitivity of the land simulation to uncertainty in snow microphysical parameters in the Canadian LAnd Surface Scheme (CLASS) using an uncertainty quantification (UQ) approach. A set of training cases (n=400) from CLASS is used to sample each parameter across its full range of empirical uncertainty, as determined from available observations and expert elicitation. A statistical learning model using support vector regression (SVR) is then constructed from the training data (CLASS output variables) to efficiently emulate the dynamical CLASS simulations over a much larger (n=220) set of cases. This approach is used to constrain the plausible range for each parameter using a skill score, and to identify the parameters with the largest influence on the land simulation in CLASS at global and regional scales, using a random forest (RF) permutation importance algorithm. Preliminary sensitivity tests indicate that the snow albedo refreshment threshold and the limiting snow depth, below which bare patches begin to appear, have the highest impact on snow output variables. The results also show a considerable reduction of the plausible ranges of the parameter values, and hence of their uncertainty, which can lead to a significant reduction of the model uncertainty. The implementation and results of this study will be presented and discussed in detail.
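
    The emulate-then-rank idea can be sketched compactly: train a cheap regression emulator on a few hundred "model runs," then rank parameters by permutation importance. The study uses an SVR emulator and a random-forest importance algorithm; here a quadratic-feature ridge regression and a hand-written permutation loop stand in for both, and the toy snow model is invented.

```python
import numpy as np

rng = np.random.default_rng(6)

# 400 "CLASS runs" of a toy model: two influential parameters (think albedo
# refreshment threshold and limiting snow depth) plus two inert ones.
n = 400
X = rng.uniform(0.0, 1.0, (n, 4))
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] ** 2 + 0.1 * X[:, 2] \
    + rng.normal(0.0, 0.05, n)

# Quadratic-feature ridge regression as a stand-in emulator.
def features(X):
    return np.hstack([np.ones((len(X), 1)), X, X ** 2])

F = features(X)
w = np.linalg.solve(F.T @ F + 1e-6 * np.eye(F.shape[1]), F.T @ y)

def predict(X):
    return features(X) @ w

def mse(a, b):
    return float(np.mean((a - b) ** 2))

# Permutation importance: MSE increase when one column's association with
# the output is destroyed by shuffling.
base = mse(predict(X), y)
importance = []
for j in range(4):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])
    importance.append(mse(predict(Xp), y) - base)

print("permutation importance:", np.round(importance, 3))
```

    Once the emulator is trained, importance scores and plausible-range screening cost only emulator evaluations, never additional CLASS runs.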

  4. Provider Recommendations in the Face of Scientific Uncertainty: An Analysis of Audio-Recorded Discussions about Vitamin D.

    PubMed

    Tarn, Derjung M; Paterniti, Debora A; Wenger, Neil S

    2016-08-01

    Little is known about how providers communicate recommendations when scientific uncertainty exists. To compare provider recommendations to those in the scientific literature, with a focus on whether uncertainty was communicated. Qualitative (inductive systematic content analysis) and quantitative analysis of previously collected audio-recorded provider-patient office visits. Sixty-one providers and a socio-economically diverse convenience sample of 603 of their patients from outpatient community- and academic-based primary care, integrative medicine, and complementary and alternative medicine provider offices in Southern California. Comparison of provider information-giving about vitamin D to professional guidelines and scientific information for which conflicting recommendations or insufficient scientific evidence exists; certainty with which information was conveyed. Ninety-two (15.3 %) of 603 visit discussions touched upon issues related to vitamin D testing, management and benefits. Vitamin D deficiency screening was discussed with 23 (25 %) patients, the definition of vitamin D deficiency with 21 (22.8 %), the optimal range for vitamin D levels with 26 (28.3 %), vitamin D supplementation dosing with 50 (54.3 %), and benefits of supplementation with 46 (50 %). For each of the professional guidelines/scientific information examined, providers conveyed information that deviated from professional guidelines and the existing scientific evidence. Of 166 statements made about vitamin D in this study, providers conveyed 160 (96.4 %) with certainty, without mention of any equivocal or contradictory evidence in the scientific literature. No uncertainty was mentioned when vitamin D dosing was discussed, even when recommended dosing was higher than guideline recommendations. 
Providers convey the vast majority of information and recommendations about vitamin D with certainty, even though the scientific literature contains inconsistent recommendations and declarations of inadequate evidence. Not communicating uncertainty blurs the contrast between evidence-based recommendations and those without evidence. Providers should explore best practices for involving patients in decision-making by acknowledging the uncertainty behind their recommendations.

  5. Barriers and Bridges to the Integration of Social-ecological Resilience and Law

    EPA Science Inventory

    There is a fundamental rift between how ecologists and lawyers view uncertainty. In the study of ecology, uncertainty is a catalyst for exploration. However, uncertainty is antithetical to the rule of law. This issue is particularly troubling in environmental management, where th...

  6. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' 
By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both measurement and balance uncertainty estimates. The reconciler attempts to select operational parameters that minimize the difference between theoretical prediction and observation. Selected values are further constrained to fall within measurement uncertainty limits and to satisfy fundamental physical relations (mass conservation, energy conservation, pressure drop relations, etc.) within uncertainty estimates for all SSME subsystems. The parameter selection problem described above is a traditional nonlinear programming problem. The reconciler employs a mixed penalty method to determine optimum values of SSME operating parameters associated with this problem formulation.
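The 'reconciliation' idea above can be illustrated with a toy problem: pick values close to the measurements while forcing a physical balance to hold, using a penalty method. The two-branch mass balance and all numbers below are illustrative assumptions, not an SSME subsystem model.

```python
# Toy measurement reconciliation: minimize weighted misfit to the measurements
# subject to a mass balance, enforced with a penalty term. Illustrative only.
import numpy as np
from scipy.optimize import minimize

measured = np.array([10.2, 4.9, 5.6])   # total flow, branch A, branch B (total != A + B)
sigma = np.array([0.3, 0.2, 0.2])       # measurement uncertainty estimates

def objective(x, mu=1e4):
    # Weighted squared misfit to the measurements ...
    misfit = np.sum(((x - measured) / sigma) ** 2)
    # ... plus a penalty enforcing the balance x[0] = x[1] + x[2]
    balance = (x[0] - x[1] - x[2]) ** 2
    return misfit + mu * balance

result = minimize(objective, x0=measured, method="BFGS")
x = result.x
print(x)   # reconciled flows: balance holds, each value stays near its measurement
```

Note how the less certain total-flow measurement absorbs more of the correction than the two branch measurements: the 1/sigma**2 weighting makes the reconciler trust precise sensors more, which is the essence of constraining selected values to measurement uncertainty limits.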

  7. Embracing uncertainty in applied ecology.

    PubMed

    Milner-Gulland, E J; Shea, K

    2017-12-01

    Applied ecologists often face uncertainty that hinders effective decision-making. Common traps that may catch the unwary are: ignoring uncertainty, acknowledging uncertainty but ploughing on, focussing on trivial uncertainties, believing your models, and unclear objectives. We integrate research insights and examples from a wide range of applied ecological fields to illustrate advances that are generally underused, but could facilitate ecologists' ability to plan and execute research to support management. Recommended approaches to avoid uncertainty traps are: embracing models, using decision theory, using models more effectively, thinking experimentally, and being realistic about uncertainty. Synthesis and applications. Applied ecologists can become more effective at informing management by using approaches that explicitly take account of uncertainty.

  8. Competitive assessment of aerospace systems using system dynamics

    NASA Astrophysics Data System (ADS)

    Pfaender, Jens Holger

    Aircraft design has recently experienced a trend away from performance centric design towards a more balanced approach with increased emphasis on engineering an economically successful system. This approach focuses on bringing forward a comprehensive economic and life-cycle cost analysis. Since the success of any system also depends on many external factors outside of the control of the designer, this traditionally has been modeled as noise affecting the uncertainty of the design. However, this approach is currently lacking a strategic treatment of necessary early decisions affecting the probability of success of a given concept in a dynamic environment. This suggests that the introduction of a dynamic method into a life-cycle cost analysis should allow the analysis of the future attractiveness of such a concept in the presence of uncertainty. One way of addressing this is through the use of a competitive market model. However, existing market models do not focus on the dynamics of the market. Instead, they focus on modeling and predicting market share through logit regression models. The resulting models exhibit relatively poor predictive capabilities. The method proposed here focuses on a top-down approach that integrates a competitive model based on work in the field of system dynamics into the aircraft design process. Demonstrating such integration is one of the primary contributions of this work, which previously has not been demonstrated. This integration is achieved through the use of surrogate models, in this case neural networks. This enabled not only the practical integration of analysis techniques, but also reduced the computational requirements so that interactive exploration as envisioned was actually possible. The example demonstration of this integration is built on the competition in the 250 seat large commercial aircraft market exemplified by the Boeing 767-400ER and the Airbus A330-200. 
Both aircraft models were calibrated to existing performance and certification data and then integrated into the system dynamics market model. The market model was then calibrated with historical market data. This calibration showed a much improved predictive capability as compared to the conventional logit regression models. An additional advantage of this dynamic model is that no additional explanatory variables were required to realize this improved capability. Furthermore, the resulting market model was then integrated into a prediction profiler environment with a time-variant Monte-Carlo analysis, resulting in a unique trade-off environment. This environment was shown to allow interactive trade-off between aircraft design decisions and economic considerations while allowing exploration of potential market success in the light of varying external market conditions and scenarios. The resulting method is capable of reducing decision-support uncertainty and identifying robust design decisions in future scenarios with a high likelihood of occurrence, with special focus on the path-dependent nature of the future implications of decisions. Furthermore, it was possible to demonstrate the increased importance of design and technology choices for competitiveness in scenarios with drastic increases in commodity prices during the time period modeled. Another use of the existing outputs of the Monte-Carlo analysis was then realized by showing them on a multivariate scatter plot. By appropriate grouping of variables, this plot was shown to enable the top-down definition of an aircraft design, also known as inverse design. In other words, this enables the designer to define strategic market and return-on-investment goals for a number of scenarios, for example the development of fuel prices, and then directly see which specific aircraft designs meet these goals.

  9. A Verification-Driven Approach to Control Analysis and Tuning

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2008-01-01

    This paper proposes a methodology for the analysis and tuning of controllers using control verification metrics. These metrics, which are introduced in a companion paper, measure the size of the largest uncertainty set of a given class for which the closed-loop specifications are satisfied. This framework integrates deterministic and probabilistic uncertainty models into a setting that enables the deformation of sets in the parameter space, the control design space, and in the union of these two spaces. In regard to control analysis, we propose strategies that enable bounding regions of the design space where the specifications are satisfied by all the closed-loop systems associated with a prescribed uncertainty set. When this is infeasible, we bound regions where the probability of satisfying the requirements exceeds a prescribed value. In regard to control tuning, we propose strategies for the improvement of the robust characteristics of a baseline controller. Some of these strategies use multi-point approximations to the control verification metrics in order to alleviate the numerical burden of solving a min-max problem. Since this methodology targets non-linear systems having an arbitrary, possibly implicit, functional dependency on the uncertain parameters and for which high-fidelity simulations are available, it is applicable to realistic engineering problems.

  10. Advances on the Failure Analysis of the Dam-Foundation Interface of Concrete Dams.

    PubMed

    Altarejos-García, Luis; Escuder-Bueno, Ignacio; Morales-Torres, Adrián

    2015-12-02

    Failure analysis of the dam-foundation interface in concrete dams is characterized by complexity, uncertainties on models and parameters, and a strong non-linear softening behavior. In practice, these uncertainties are dealt with through a well-structured mixture of experience, best practices and prudent, conservative design approaches based on the safety factor concept. Yet a sound, deep knowledge of some aspects of this failure mode remains lacking, as they have been offset in practical applications by the use of this conservative approach. In this paper we show a strategy to analyse this failure mode under a reliability-based approach. The proposed methodology of analysis integrates epistemic uncertainty on spatial variability of strength parameters and data from dam monitoring. The purpose is to produce meaningful and useful information regarding the probability of occurrence of this failure mode that can be incorporated in risk-informed dam safety reviews. In addition, relationships between probability of failure and factors of safety are obtained. This research is supported by more than a decade of intensive professional practice on real-world cases, and its final purpose is to bring some clarity and guidance, and to contribute to the improvement of current knowledge and best practices on such an important dam safety concern.
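The link between probability of failure and factor of safety that this record obtains can be illustrated with a minimal Monte Carlo sketch. The sliding limit state and all parameter distributions below are illustrative assumptions, not the paper's calibrated model.

```python
# Minimal Monte Carlo reliability sketch: sliding stability of a dam-foundation
# interface with uncertain strength parameters. Illustrative units and values.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

cohesion = rng.lognormal(mean=np.log(0.3), sigma=0.4, size=n)   # uncertain cohesion
tan_phi = rng.normal(loc=1.0, scale=0.15, size=n)               # uncertain friction
area, normal_force, shear_demand = 1000.0, 500.0, 600.0          # illustrative loads

# Sliding capacity vs demand; failure when the factor of safety drops below 1
capacity = cohesion * area + normal_force * tan_phi
factor_of_safety = capacity / shear_demand
p_failure = np.mean(factor_of_safety < 1.0)

print(f"mean FS = {factor_of_safety.mean():.2f}, P(failure) = {p_failure:.4f}")
```

The key point the abstract makes falls out directly: a design can have a mean factor of safety comfortably above one and still carry a non-negligible probability of failure, because the tails of the strength distributions matter and the deterministic safety factor ignores them.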

  11. Effects of 2D and 3D Error Fields on the SAS Divertor Magnetic Topology

    NASA Astrophysics Data System (ADS)

    Trevisan, G. L.; Lao, L. L.; Strait, E. J.; Guo, H. Y.; Wu, W.; Evans, T. E.

    2016-10-01

    The successful design of plasma-facing components in fusion experiments is of paramount importance in both the operation of future reactors and in the modification of operating machines. Indeed, the Small Angle Slot (SAS) divertor concept, proposed for application on the DIII-D experiment, combines a small incident angle at the plasma strike point with a progressively opening slot, so as to better control heat flux and erosion in high-performance tokamak plasmas. Uncertainty quantification of the error fields expected around the striking point provides additional useful information in both the design and the modeling phases of the new divertor, in part due to the particular geometric requirement of the striking flux surfaces. The presented work involves both 2D and 3D magnetic error field analysis on the SAS strike point carried out using the EFIT code for 2D equilibrium reconstruction, V3POST for vacuum 3D computations and the OMFIT integrated modeling framework for data analysis. An uncertainty in the magnetic probes' signals is found to propagate non-linearly as an uncertainty in the striking point and angle, which can be quantified through statistical analysis to yield robust estimates. Work supported by contracts DE-FG02-95ER54309 and DE-FC02-04ER54698.

  12. Advances on the Failure Analysis of the Dam—Foundation Interface of Concrete Dams

    PubMed Central

    Altarejos-García, Luis; Escuder-Bueno, Ignacio; Morales-Torres, Adrián

    2015-01-01

    Failure analysis of the dam-foundation interface in concrete dams is characterized by complexity, uncertainties on models and parameters, and a strong non-linear softening behavior. In practice, these uncertainties are dealt with through a well-structured mixture of experience, best practices and prudent, conservative design approaches based on the safety factor concept. Yet a sound, deep knowledge of some aspects of this failure mode remains lacking, as they have been offset in practical applications by the use of this conservative approach. In this paper we show a strategy to analyse this failure mode under a reliability-based approach. The proposed methodology of analysis integrates epistemic uncertainty on spatial variability of strength parameters and data from dam monitoring. The purpose is to produce meaningful and useful information regarding the probability of occurrence of this failure mode that can be incorporated in risk-informed dam safety reviews. In addition, relationships between probability of failure and factors of safety are obtained. This research is supported by more than a decade of intensive professional practice on real-world cases, and its final purpose is to bring some clarity and guidance, and to contribute to the improvement of current knowledge and best practices on such an important dam safety concern. PMID:28793709

  13. A peaking-regulation-balance-based method for wind & PV power integrated accommodation

    NASA Astrophysics Data System (ADS)

    Zhang, Jinfang; Li, Nan; Liu, Jun

    2018-02-01

    The rapid current and future development of China's new energy sector should focus on the coordinated deployment of wind and PV power. Based on the analysis of system peaking balance, combined with the statistical features of wind and PV power output characteristics, a method for comprehensive integrated accommodation analysis of wind and PV power is put forward. Wind power installed capacity is first determined from the electric power balance during the night peak-load period of a typical day; PV power installed capacity is then derived from the midday peak-load hours. This effectively resolves the uncertainty that arises when traditional methods attempt to determine the wind and solar power mix simultaneously. The simulation results validate the effectiveness of the proposed method.

  14. AUSPEX: a graphical tool for X-ray diffraction data analysis.

    PubMed

    Thorn, Andrea; Parkhurst, James; Emsley, Paul; Nicholls, Robert A; Vollmar, Melanie; Evans, Gwyndaf; Murshudov, Garib N

    2017-09-01

    In this paper, AUSPEX, a new software tool for experimental X-ray data analysis, is presented. Exploring the behaviour of diffraction intensities and the associated estimated uncertainties facilitates the discovery of underlying problems and can help users to improve their data acquisition and processing in order to obtain better structural models. The program enables users to inspect the distribution of observed intensities (or amplitudes) against resolution as well as the associated estimated uncertainties (sigmas). It is demonstrated how AUSPEX can be used to visually and automatically detect ice-ring artefacts in integrated X-ray diffraction data. Such artefacts can hamper structure determination, but may be difficult to identify from the raw diffraction images produced by modern pixel detectors. The analysis suggests that a significant portion of the data sets deposited in the PDB contain ice-ring artefacts. Furthermore, it is demonstrated how other problems in experimental X-ray data caused, for example, by scaling and data-conversion procedures can be detected by AUSPEX.

  15. Integrating uncertainty propagation in GNSS radio occultation retrieval: from excess phase to atmospheric bending angle profiles

    NASA Astrophysics Data System (ADS)

    Schwarz, Jakob; Kirchengast, Gottfried; Schwaerz, Marc

    2018-05-01

    Global Navigation Satellite System (GNSS) radio occultation (RO) observations are highly accurate, long-term stable data sets and are globally available as a continuous record from 2001. Essential climate variables for the thermodynamic state of the free atmosphere - such as pressure, temperature, and tropospheric water vapor profiles (involving background information) - can be derived from these records, which therefore have the potential to serve as climate benchmark data. However, to exploit this potential, atmospheric profile retrievals need to be very accurate and the remaining uncertainties quantified and traced throughout the retrieval chain from raw observations to essential climate variables. The new Reference Occultation Processing System (rOPS) at the Wegener Center aims to deliver such an accurate RO retrieval chain with integrated uncertainty propagation. Here we introduce and demonstrate the algorithms implemented in the rOPS for uncertainty propagation from excess phase to atmospheric bending angle profiles, for estimated systematic and random uncertainties, including vertical error correlations and resolution estimates. We estimated systematic uncertainty profiles with the same operators as used for the basic state profiles retrieval. The random uncertainty is traced through covariance propagation and validated using Monte Carlo ensemble methods. The algorithm performance is demonstrated using test day ensembles of simulated data as well as real RO event data from the satellite missions CHAllenging Minisatellite Payload (CHAMP); Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC); and Meteorological Operational Satellite A (MetOp). The results of the Monte Carlo validation show that our covariance propagation delivers correct uncertainty quantification from excess phase to bending angle profiles. The results from the real RO event ensembles demonstrate that the new uncertainty estimation chain performs robustly. 
Together with the other parts of the rOPS processing chain this part is thus ready to provide integrated uncertainty propagation through the whole RO retrieval chain for the benefit of climate monitoring and other applications.
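The covariance-propagation-plus-Monte-Carlo validation described above can be sketched generically: push an input covariance through a linear operator analytically, then check the result against an ensemble. The smoothing operator and covariance model below are toy assumptions, not the actual RO retrieval operators.

```python
# Analytic covariance propagation through a linear operator, validated against
# a Monte Carlo ensemble. Toy smoothing operator, not the rOPS retrieval chain.
import numpy as np

rng = np.random.default_rng(1)
n = 50

# Toy "retrieval operator": a moving-average smoother acting on a profile
A = np.zeros((n, n))
for i in range(n):
    lo, hi = max(0, i - 2), min(n, i + 3)
    A[i, lo:hi] = 1.0 / (hi - lo)

# Input covariance: vertically correlated random uncertainty (exponential model)
lags = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
x_cov = 0.1 * np.exp(-lags / 5.0)

# Analytic propagation: Cov(Ax) = A Cov(x) A^T
y_cov = A @ x_cov @ A.T

# Monte Carlo validation with a 20000-member ensemble
samples = rng.multivariate_normal(np.zeros(n), x_cov, size=20000)
y_mc_cov = np.cov(samples @ A.T, rowvar=False)

max_err = np.abs(y_cov - y_mc_cov).max()
print(f"max |analytic - MC| = {max_err:.4f}")
```

This is the same consistency check the abstract reports: if the analytic propagation is correct, the ensemble covariance converges to it as the ensemble grows, and any systematic gap points to an error in the propagation operators.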

  16. The Relative Importance of the Vadose Zone in Multimedia Risk Assessment Modeling Applied at a National Scale: An Analysis of Benzene Using 3MRA

    NASA Astrophysics Data System (ADS)

    Babendreier, J. E.

    2002-05-01

    Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. 
Design of SuperMUSE, a 125 GHz Windows-based supercomputer for Model Uncertainty and Sensitivity Evaluation, is described, along with the conceptual layout of an accompanying Java-based parallelization software toolset. Preliminary work is also reported for a benzene-disposal scenario that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.

  17. Applying Bayesian belief networks in rapid response situations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibson, William L.; Leishman, Deborah A.; Van Eeckhout, Edward

    2008-01-01

    The authors have developed an enhanced Bayesian analysis tool called the Integrated Knowledge Engine (IKE) for monitoring and surveillance. The enhancements are suited for Rapid Response Situations where decisions must be made based on uncertain and incomplete evidence from many diverse and heterogeneous sources. The enhancements extend the probabilistic results of the traditional Bayesian analysis by (1) better quantifying uncertainty arising from model parameter uncertainty and uncertain evidence, (2) optimizing the collection of evidence to reach conclusions more quickly, and (3) allowing the analyst to determine the influence of the remaining evidence that cannot be obtained in the time allowed. These extended features give the analyst and decision maker a better comprehension of the adequacy of the acquired evidence and hence the quality of the hurried decisions. They also describe two example systems where the above features are highlighted.
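The second enhancement, optimizing the collection of evidence, can be sketched as a value-of-information calculation: pick the next observation source by how much it is expected to shrink posterior uncertainty. The hypothesis space and likelihood tables below are illustrative assumptions, not IKE's actual models.

```python
# Toy evidence selection: choose the next observation source by expected
# reduction in posterior entropy. Illustrative likelihoods, not IKE's models.
import numpy as np

prior = np.array([0.5, 0.5])                    # P(H0), P(H1)

# P(positive reading | H) for two candidate evidence sources
likelihoods = {"sensor_a": np.array([0.9, 0.2]),
               "sensor_b": np.array([0.6, 0.4])}

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def expected_posterior_entropy(prior, lik):
    # Average the posterior entropy over both possible readings,
    # weighted by how likely each reading is under the prior
    h = 0.0
    for outcome_lik in (lik, 1.0 - lik):        # positive / negative reading
        p_outcome = np.sum(prior * outcome_lik)
        posterior = prior * outcome_lik / p_outcome
        h += p_outcome * entropy(posterior)
    return h

best = min(likelihoods, key=lambda s: expected_posterior_entropy(prior, likelihoods[s]))
print(best)   # the more discriminative source is queried first
```

Ranking the remaining, unobtainable evidence by the same expected-entropy measure also gives the third feature the abstract lists: a quantitative sense of how much the hurried decision is degraded by the evidence that could not be collected in time.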

  18. Decision making from economic and signal detection perspectives: development of an integrated framework

    PubMed Central

    Lynn, Spencer K.; Wormwood, Jolie B.; Barrett, Lisa F.; Quigley, Karen S.

    2015-01-01

    Behavior is comprised of decisions made from moment to moment (i.e., to respond one way or another). Often, the decision maker cannot be certain of the value to be accrued from the decision (i.e., the outcome value). Decisions made under outcome value uncertainty form the basis of the economic framework of decision making. Behavior is also based on perception—perception of the external physical world and of the internal bodily milieu, which both provide cues that guide decision making. These perceptual signals are also often uncertain: another person's scowling facial expression may indicate threat or intense concentration, alternatives that require different responses from the perceiver. Decisions made under perceptual uncertainty form the basis of the signals framework of decision making. Traditional behavioral economic approaches to decision making focus on the uncertainty that comes from variability in possible outcome values, and typically ignore the influence of perceptual uncertainty. Conversely, traditional signal detection approaches to decision making focus on the uncertainty that arises from variability in perceptual signals and typically ignore the influence of outcome value uncertainty. Here, we compare and contrast the economic and signals frameworks that guide research in decision making, with the aim of promoting their integration. We show that an integrated framework can expand our ability to understand a wider variety of decision-making behaviors, in particular the complexly determined real-world decisions we all make every day. PMID:26217275
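One concrete point of contact between the two frameworks is the classic signal detection result that the optimal response criterion depends on outcome values, so economic payoffs shift a perceptual decision rule. The sketch below uses the standard equal-variance Gaussian model; all payoff numbers are illustrative assumptions.

```python
# Optimal SDT criterion under a payoff matrix: perceptual uncertainty (d')
# and outcome values enter one decision rule. Illustrative numbers.
import numpy as np

d_prime = 1.0                 # perceptual separability of signal vs noise
p_signal = 0.5                # base rate of the signal

def optimal_criterion(v_hit, v_miss, v_cr, v_fa, p_signal=p_signal):
    # Standard optimal likelihood-ratio threshold, then mapped to a criterion
    # location on the decision axis for unit-variance Gaussians at 0 and d'
    beta = ((1 - p_signal) / p_signal) * ((v_cr - v_fa) / (v_hit - v_miss))
    return np.log(beta) / d_prime + d_prime / 2.0

neutral = optimal_criterion(v_hit=1, v_miss=0, v_cr=1, v_fa=0)
costly_fa = optimal_criterion(v_hit=1, v_miss=0, v_cr=1, v_fa=-5)
print(neutral, costly_fa)     # costly false alarms push the criterion upward
```

With symmetric payoffs the criterion sits midway between the two distributions; making false alarms costly (e.g. responding to a scowl as threat when it is concentration) raises the criterion, demanding stronger perceptual evidence before acting. That is the integration the authors argue for: neither the payoff matrix nor the perceptual uncertainty alone determines the decision.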

  19. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministicmore » and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. 
SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.SCALE 6.2 provides many new capabilities and significant improvements of existing features.New capabilities include:• ENDF/B-VII.1 nuclear data libraries CE and MG with enhanced group structures,• Neutron covariance data based on ENDF/B-VII.1 and supplemented with ORNL data,• Covariance data for fission product yields and decay constants,• Stochastic uncertainty and correlation quantification for any SCALE sequence with Sampler,• Parallel calculations with KENO,• Problem-dependent temperature corrections for CE calculations,• CE shielding and criticality accident alarm system analysis with MAVRIC,• CE depletion with TRITON (T5-DEPL/T6-DEPL),• CE sensitivity/uncertainty analysis with TSUNAMI-3D,• Simplified and efficient LWR lattice physics with Polaris,• Large scale detailed spent fuel characterization with ORIGAMI and ORIGAMI Automator,• Advanced fission source convergence acceleration capabilities with Sourcerer,• Nuclear data library generation with AMPX, and• Integrated user interface with Fulcrum.Enhanced capabilities include:• Accurate and efficient CE Monte Carlo methods for eigenvalue and fixed source calculations,• Improved MG resonance self-shielding methodologies and data,• Resonance self-shielding with modernized and efficient XSProc integrated into most sequences,• Accelerated calculations with TRITON/NEWT (generally 4x faster than SCALE 6.1),• Spent fuel characterization with 1470 new reactor-specific libraries for ORIGEN,• Modernization of ORIGEN (Chebyshev Rational Approximation Method [CRAM] solver, API for high-performance depletion, new keyword input format)• Extension of the maximum mixture number to values well beyond the previous limit of 2147 to ~2 billion,• Nuclear data formats enabling the use of more than 999 energy groups,• Updated standard composition library to provide more accurate use of 
natural abundances, andvi• Numerous other enhancements for improved usability and stability.« less

  20. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    USGS Publications Warehouse

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, 1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and alternatively 2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, the R-SWAT-FME is more attractive due to its instant visualization, and potential to take advantage of other R packages (e.g., inverse modeling and statistical graphics). 
The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty analysis.
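
The coupling pattern described above (expose the compiled model as a callable function, then drive it with generic calibration routines) can be sketched in Python with a stand-in model; the model form, parameter names, and grids below are hypothetical illustrations, not part of SWAT or FME:

```python
import numpy as np

# Hypothetical stand-in for an external model wrapped as a function
# (R-SWAT-FME wraps Fortran-based SWAT this way via RFortran or a DLL).
def run_model(params, t):
    recession, baseflow = params
    return baseflow + np.exp(-recession * t)

def calibrate(observed, t, grid_recession, grid_baseflow):
    # Brute-force parameter search standing in for FME's inverse-modeling
    # routines: minimize the sum of squared errors over a parameter grid.
    best, best_sse = None, np.inf
    for r in grid_recession:
        for b in grid_baseflow:
            sse = np.sum((run_model((r, b), t) - observed) ** 2)
            if sse < best_sse:
                best, best_sse = (r, b), sse
    return best, best_sse

t = np.linspace(0.0, 10.0, 50)
obs = run_model((0.5, 0.2), t)   # synthetic "observations"
est, sse = calibrate(obs, t, [0.1, 0.3, 0.5, 0.7, 0.9], [0.0, 0.1, 0.2, 0.3])
```

Any optimizer or MCMC sampler can replace the grid search once the model is callable; that interchangeability is the point of the wrapping step.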

  1. And yet it moves! Involving transient flow conditions is the logical next step for WHPA analysis

    NASA Astrophysics Data System (ADS)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

    As the first line of defense among different safety measures, Wellhead Protection Areas (WHPAs) have been broadly used to protect drinking water wells against sources of pollution. In most cases, their implementation relies on simplifications, such as assuming homogeneous or zonated aquifer conditions or considering steady-state flow scenarios. Both assumptions inevitably introduce errors. However, while uncertainty due to aquifer heterogeneity has been extensively studied in the literature, the impact of transient flow conditions has received very little attention. For instance, the WHPA maps in the offices of water supply companies are fixed maps derived from steady-state models, although the actual catchments are transient. To mitigate high computational costs, we approximate transiency by means of a dynamic superposition of steady-state flow solutions. We then analyze four transient drivers that often appear on the seasonal scale: (I) regional groundwater flow direction, (II) strength of the regional hydraulic gradient, (III) natural recharge to the groundwater and (IV) pumping rate. The integration of transiency in WHPA analysis leads to time-frequency maps, which express for each location the temporal frequency of catchment membership. Furthermore, we account for the uncertainty due to incomplete knowledge of geological and transiency conditions, addressed through Monte Carlo simulations. The main contribution of this study is to show the need to enhance groundwater well protection by accounting for transient flow conditions during WHPA analysis. To support and complement our statement, 1) we demonstrate that each transient driver imprints an individual spatial pattern on the required WHPA, ranking their influence through a global sensitivity analysis; 2) we compare the influence of transient conditions with that of geological uncertainty in terms of areal WHPA demand; 3) we show that considering geological uncertainty alone is insufficient in the presence of transient conditions; and 4) we propose a practical decision rule for selecting a proper protection reliability level in the presence of both transiency and geological uncertainty.
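
A time-frequency map of catchment membership can be illustrated with a deliberately small Monte Carlo toy; the one-dimensional grid, the well position, and the "capture radius" model below are hypothetical, not the authors' flow model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D time-frequency map: per time step, the catchment of a well at
# cell 10 depends on a transient flow direction (seasonal driver) and an
# uncertain capture radius (standing in for geological uncertainty).
n_cells, n_steps, well = 20, 200, 10
x = np.arange(n_cells)

membership = np.zeros(n_cells)
for _ in range(n_steps):
    direction = rng.choice([-1, 1])   # seasonal reversal of regional flow
    radius = rng.integers(2, 6)       # uncertain capture extent (cells)
    if direction > 0:
        captured = (x >= well - radius) & (x <= well)
    else:
        captured = (x >= well) & (x <= well + radius)
    membership += captured

freq = membership / n_steps           # temporal frequency of membership
# A protection decision rule can then threshold freq, e.g. protect every
# cell that is captured in more than 5% of the time steps.
```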

  2. A probabilistic approach to aircraft design emphasizing stability and control uncertainties

    NASA Astrophysics Data System (ADS)

    Delaurentis, Daniel Andrew

    In order to address identified deficiencies in current approaches to aerospace systems design, a new method has been developed. This new method for design is based on the premise that design is a decision making activity, and that deterministic analysis and synthesis can lead to poor or misguided decision making. This is due to a lack of disciplinary knowledge of sufficient fidelity about the product, to the presence of uncertainty at multiple levels of the aircraft design hierarchy, and to a failure to focus on overall affordability metrics as measures of goodness. Design solutions are desired which are robust to uncertainty and are based on the maximum knowledge possible. The new method represents advances in the two following general areas. 1. Design models and uncertainty. The research performed completes a transition from a deterministic design representation to a probabilistic one through a modeling of design uncertainty at multiple levels of the aircraft design hierarchy, including: (1) Consistent, traceable uncertainty classification and representation; (2) Concise mathematical statement of the Probabilistic Robust Design problem; (3) Variants of the Cumulative Distribution Functions (CDFs) as decision functions for Robust Design; (4) Probabilistic Sensitivities which identify the most influential sources of variability. 2. Multidisciplinary analysis and design. Embedded in the probabilistic methodology is a new approach for multidisciplinary design analysis and optimization (MDA/O), employing disciplinary analysis approximations formed through statistical experimentation and regression. These approximation models are a function of design variables common to the system level as well as other disciplines. For aircraft, it is proposed that synthesis/sizing is the proper avenue for integrating multiple disciplines. Research hypotheses are translated into a structured method, which is subsequently tested for validity. 
Specifically, the implementation involves the study of the relaxed static stability technology for a supersonic commercial transport aircraft. The probabilistic robust design method is exercised, resulting in a series of robust design solutions based on different interpretations of "robustness". Insightful results are obtained, and the ability of the method to expose trends in the design space is noted as a key advantage.
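
Using a CDF as a decision function, rather than a deterministic point estimate, can be shown with a two-design comparison; the metric, distributions, and requirement value below are hypothetical, not the dissertation's aircraft models:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical performance metric samples (lower is better) for two
# candidate designs under uncertainty.
design_a = rng.normal(loc=100.0, scale=15.0, size=20000)  # heavier but tight
design_b = rng.normal(loc=95.0, scale=30.0, size=20000)   # lighter but wide

def prob_meets_target(samples, target):
    # Empirical CDF evaluated at the requirement: P(metric <= target).
    return float(np.mean(samples <= target))

# The CDF-based decision compares probabilities of meeting the requirement:
p_a = prob_meets_target(design_a, 120.0)
p_b = prob_meets_target(design_b, 120.0)
```

Design B wins on the mean, yet Design A is the robust choice at this requirement, which is exactly the kind of reversal a deterministic comparison hides.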

  3. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    PubMed

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were in the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity. 
This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to reduce climate change impacts statistically insignificant (i.e., for approximately one third of the investigated management units of the AFF case study), fostering adaptive capacity is suggested as the preferred pathway for adaptation. We conclude that climate change adaptation needs to balance between anticipating expected future conditions and building the capacity to address unknowns and surprises. Copyright © 2012 Elsevier Ltd. All rights reserved.
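
Evaluating robustness of management alternatives across a scenario matrix can be sketched compactly; the scores, the three strategies, and the maximin criterion below are illustrative assumptions, not the AFF case study's actual indicator values:

```python
import numpy as np

# Hypothetical performance scores of three adaptation strategies under
# 3 climate x 3 societal-preference scenarios (9 futures; higher = better).
scores = np.array([
    [0.9, 0.8, 0.7, 0.6, 0.8, 0.7, 0.9, 0.6, 0.8],                  # reduce impacts
    [0.70, 0.70, 0.70, 0.70, 0.65, 0.70, 0.70, 0.70, 0.65],         # adaptive capacity
    [0.5, 0.9, 0.4, 0.9, 0.5, 0.9, 0.4, 0.9, 0.5],                  # specialized
])

mean_perf = scores.mean(axis=1)      # expected performance across futures
worst_case = scores.min(axis=1)      # maximin robustness criterion

best_on_average = int(np.argmax(mean_perf))
most_robust = int(np.argmax(worst_case))
```

The two criteria disagree here: the impact-reduction strategy looks best on average, but the capacity-building strategy never performs badly, mirroring the trade-off the abstract describes.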

  4. Managing the uncertainties of the streamflow data produced by the French national hydrological services

    NASA Astrophysics Data System (ADS)

    Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin

    2015-04-01

    The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allowing them to draw relevant conclusions and make appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are absolutely necessary for the data management and uncertainty analysis by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique in given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO748 standard for velocity-area gaugings. Both empirical and computed (with Q+) uncertainty values can now be assigned in BAREME, which is the software used by the French NHS for managing streamgauging measurements. 
A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level records to produce continuous discharge time series. The management of rating curves is also done using BAREME. The BaRatin method (Le Coz et al., 2014) was developed as a Bayesian approach to rating curve development and uncertainty analysis. Since BaRatin accounts for the individual uncertainties of the gauging data used to build the rating curve, it was coupled with BAREME. The BaRatin method is still undergoing development and research, in particular to address non-univocal or time-varying stage-discharge relations due to hysteresis, variable backwater, rating shifts, etc. A new interface including new options is under development. The next steps are to propagate the uncertainties of water level records, through uncertain rating curves, up to discharge time series and derived variables (e.g., annual mean flow) and statistics (e.g., flood quantiles). Bayesian tools are already available for both tasks, but further validation and development are necessary for their integration in the operational data workflow of the French NHS. References: Le Coz, J., Camenen, B., Peyrard, X., Dramais, G., 2012. Uncertainty in open-channel discharges measured with the velocity-area method. Flow Measurement and Instrumentation 26, 18-29. Le Coz, J., Renard, B., Bonnifait, L., Branger, F., Le Boursicaud, R., 2014. Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach. Journal of Hydrology 509, 573-587.
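
The core idea of a Bayesian rating-curve fit with gauging uncertainties in the likelihood can be sketched with a plain random-walk Metropolis sampler; the synthetic gaugings, the fixed offset parameter, the flat priors, and the sampler itself are simplifying assumptions for illustration, not BaRatin's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic gaugings for a power-law rating curve Q = a*(h - b)^c,
# each with its own (~5%) measurement uncertainty, as in gauged data.
a_true, b, c_true = 12.0, 0.3, 1.8
h = np.linspace(0.6, 3.0, 15)
sigma = 0.05 * a_true * (h - b) ** c_true
q_obs = a_true * (h - b) ** c_true + rng.normal(0.0, sigma)

def log_post(theta):
    a, c = theta
    if a <= 0.0 or not 1.0 < c < 3.0:          # bounded flat priors
        return -np.inf
    resid = q_obs - a * (h - b) ** c
    return -0.5 * np.sum((resid / sigma) ** 2)  # gauging errors in likelihood

# Plain random-walk Metropolis as a minimal stand-in for an MCMC engine
theta, lp = np.array([10.0, 1.5]), -np.inf
chain = []
for _ in range(6000):
    prop = theta + rng.normal(0.0, [0.2, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())
chain = np.array(chain)[1000:]                  # discard burn-in
a_med, c_med = np.median(chain, axis=0)
```

The retained chain gives not just a best-fit curve but a distribution of curves, which is what allows rating-curve uncertainty to be propagated to discharge series.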

  5. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    NASA Astrophysics Data System (ADS)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding the hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data are, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take into account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis, which assumes an approximately linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge in reducing forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of the geophysical surveying is assumed to be estimating values and spatial variation of hydrologic parameters (i.e., hydraulic conductivity), as well as mapping lower-permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the ground-water availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. 
To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilot points along potential flight lines was reduced. The FOSM forecast uncertainty estimates were then recalculated and compared to the base forecast uncertainty estimates. The resulting reduction in forecast uncertainty is a measure of the effect of the AEM survey on the model. Iterating through this process results in optimized flight line locations.
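
The FOSM data-worth calculation reduces to linear uncertainty propagation: with forecast sensitivity vector y and prior parameter covariance C, forecast variance is y C yᵀ, and data worth is the variance reduction when selected prior variances shrink. A minimal numerical sketch (all sensitivities and variances below are hypothetical):

```python
import numpy as np

# FOSM forecast variance: y @ C @ y with y the forecast sensitivity
# (Jacobian row) and C the prior parameter covariance.
y = np.array([0.8, 0.5, 0.1, 0.05])        # sensitivities to 4 pilot points
prior_var = np.array([1.0, 1.0, 1.0, 1.0])

var_base = y @ np.diag(prior_var) @ y      # base forecast variance

# Simulate an AEM flight line over the first two pilot points by
# shrinking their prior variances (here by 75%), then recompute.
post_var = prior_var.copy()
post_var[:2] *= 0.25
var_survey = y @ np.diag(post_var) @ y

data_worth = var_base - var_survey         # reduction credited to the survey
```

Repeating the shrink-and-recompute step for each candidate flight line, and ranking by `data_worth`, is the optimization loop the abstract describes.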

  6. Dynamic Stability of Uncertain Laminated Beams Under Subtangential Loads

    NASA Technical Reports Server (NTRS)

    Goyal, Vijay K.; Kapania, Rakesh K.; Adelman, Howard (Technical Monitor); Horta, Lucas (Technical Monitor)

    2002-01-01

    Because of the inherent complexity of fiber-reinforced laminated composites, it can be challenging to manufacture composite structures according to their exact design specifications, resulting in unwanted material and geometric uncertainties. In this research, we focus on the deterministic and probabilistic stability analysis of laminated structures subject to subtangential loading, a combination of conservative and nonconservative tangential loads, using the dynamic criterion. Thus a shear-deformable laminated beam element, including warping effects, is derived to study the deterministic and probabilistic response of laminated beams. This twenty-one-degree-of-freedom element can be used for solving both static and dynamic problems. In the first-order shear-deformable model used here we have employed a more accurate method to obtain the transverse shear correction factor. The dynamic version of the principle of virtual work for laminated composites is expressed in its nondimensional form, and the element tangent stiffness and mass matrices are obtained using analytical integration. The stability is studied by giving the structure a small disturbance about an equilibrium configuration and observing whether the resulting response remains small. In order to study the dynamic behavior when uncertainties are included in the problem, three models were developed: Exact Monte Carlo Simulation, Sensitivity-Based Monte Carlo Simulation, and Probabilistic FEA. These methods were integrated into the developed finite element analysis. Perturbation and sensitivity analysis have also been used to study nonconservative problems, as well as the stability analysis, using the dynamic criterion.
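
The dynamic stability criterion plus exact Monte Carlo can be illustrated on a deliberately small system; the 2-DOF matrix, damping value, and stiffness uncertainty below are hypothetical, not the report's laminated-beam finite element:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "exact Monte Carlo" stability study under the dynamic criterion:
# x' = A x is stable when all eigenvalues of A have negative real parts.
def system_matrix(stiffness, load):
    return np.array([[0.0, 1.0],
                     [-(stiffness - load), -0.1]])   # light damping

def is_stable(A):
    return bool(np.all(np.linalg.eigvals(A).real < 0.0))

load = 1.0                                           # applied load level
stiffness_samples = rng.normal(1.5, 0.3, size=5000)  # uncertain stiffness
p_unstable = np.mean([not is_stable(system_matrix(k, load))
                      for k in stiffness_samples])
```

Sampling the uncertain parameter and counting unstable draws converts the deterministic eigenvalue check into a probability of instability at the given load.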

  7. The Impact of Model and Rainfall Forcing Errors on Characterizing Soil Moisture Uncertainty in Land Surface Modeling

    NASA Technical Reports Server (NTRS)

    Maggioni, V.; Anagnostou, E. N.; Reichle, R. H.

    2013-01-01

    The contribution of rainfall forcing errors relative to model (structural and parameter) uncertainty in the prediction of soil moisture is investigated by integrating the NASA Catchment Land Surface Model (CLSM), forced with hydro-meteorological data, in the Oklahoma region. Rainfall-forcing uncertainty is introduced using a stochastic error model that generates ensemble rainfall fields from satellite rainfall products. The ensemble satellite rain fields are propagated through CLSM to produce soil moisture ensembles. Errors in CLSM are modeled with two different approaches: either by perturbing model parameters (representing model parameter uncertainty) or by adding randomly generated noise (representing model structure and parameter uncertainty) to the model prognostic variables. Our findings highlight that the method currently used in the NASA GEOS-5 Land Data Assimilation System to perturb CLSM variables poorly describes the uncertainty in the predicted soil moisture, even when combined with rainfall model perturbations. On the other hand, by adding model parameter perturbations to rainfall forcing perturbations, a better characterization of uncertainty in soil moisture simulations is observed. Specifically, an analysis of the rank histograms shows that the most consistent ensemble of soil moisture is obtained by combining rainfall and model parameter perturbations. When rainfall forcing and model prognostic perturbations are added, the rank histogram shows a U-shape at the domain average scale, which corresponds to a lack of variability in the forecast ensemble. The more accurate estimation of the soil moisture prediction uncertainty obtained by combining rainfall and parameter perturbations is encouraging for the application of this approach in ensemble data assimilation systems.
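
The rank-histogram diagnostic used in the analysis is easy to reproduce on synthetic data; the distributions below are hypothetical stand-ins for soil moisture truth and ensembles:

```python
import numpy as np

rng = np.random.default_rng(4)

def rank_histogram(ensemble, truth):
    # Rank of each verifying value within its ensemble: a flat histogram
    # indicates a statistically consistent ensemble; a U shape indicates
    # under-dispersion (too little ensemble spread).
    ranks = np.sum(ensemble < truth[:, None], axis=1)
    return np.bincount(ranks, minlength=ensemble.shape[1] + 1)

n_times, n_members = 5000, 9
truth = rng.normal(0.0, 1.0, n_times)

consistent = rng.normal(0.0, 1.0, (n_times, n_members))      # matches truth
underdispersed = rng.normal(0.0, 0.4, (n_times, n_members))  # too tight

h_ok = rank_histogram(consistent, truth)
h_u = rank_histogram(underdispersed, truth)
```

The under-dispersed ensemble piles counts into the extreme ranks, the same U-shape signature the study reports for prognostic-variable perturbations.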

  8. An Improved Strong Tracking Cubature Kalman Filter for GPS/INS Integrated Navigation Systems.

    PubMed

    Feng, Kaiqiang; Li, Jie; Zhang, Xi; Zhang, Xiaoming; Shen, Chong; Cao, Huiliang; Yang, Yanyu; Liu, Jun

    2018-06-12

    The cubature Kalman filter (CKF) is widely used in GPS/INS integrated navigation systems. However, its performance may decline in accuracy and even diverge in the presence of process uncertainties. To solve this problem, a new algorithm named the improved strong tracking seventh-degree spherical simplex-radial cubature Kalman filter (IST-7thSSRCKF) is proposed in this paper. In the proposed algorithm, the effect of process uncertainty is mitigated by using the improved strong tracking Kalman filter technique, in which a hypothesis testing method is adopted to identify the process uncertainty and the prior state estimate covariance in the CKF is modified online according to the change in vehicle dynamics. In addition, a new seventh-degree spherical simplex-radial rule is employed to further improve the estimation accuracy of the strong tracking cubature Kalman filter. In this way, the proposed algorithm combines the 7thSSRCKF's high accuracy with the strong tracking filter's robustness against process uncertainties. The GPS/INS integrated navigation problem with significant dynamic model errors is used to validate the performance of the proposed IST-7thSSRCKF. Results demonstrate that the improved strong tracking cubature Kalman filter can achieve higher accuracy than the existing CKF and ST-CKF, and is more robust for the GPS/INS integrated navigation system.
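
The mechanics of a cubature prediction step can be shown with the standard third-degree spherical-radial rule (the paper's seventh-degree simplex-radial rule uses more points for higher accuracy; this lower-degree sketch only illustrates the point-propagation idea, and the linear test system is hypothetical):

```python
import numpy as np

def cubature_points(mean, cov):
    # Third-degree spherical-radial cubature rule: 2n equally weighted
    # points at mean +/- sqrt(n) * (columns of chol(cov)).
    n = mean.size
    S = np.linalg.cholesky(cov)
    offsets = np.sqrt(n) * S
    return np.vstack([mean + offsets.T, mean - offsets.T])   # (2n, n)

def predict(mean, cov, f):
    pts = cubature_points(mean, cov)
    fx = np.array([f(p) for p in pts])
    m = fx.mean(axis=0)                       # equal weights 1/(2n)
    P = (fx - m).T @ (fx - m) / fx.shape[0]
    return m, P

# Sanity check on a linear system, where the rule is exact:
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
m0 = np.array([1.0, 2.0])
P0 = np.array([[0.5, 0.1],
               [0.1, 0.3]])
m1, P1 = predict(m0, P0, lambda x: A @ x)
```

A strong-tracking variant would additionally inflate the predicted covariance by a fading factor when innovation statistics flag a process-model mismatch.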

  9. A COMPREHENSIVE ANALYSIS OF UNCERTAINTIES AFFECTING THE STELLAR MASS-HALO MASS RELATION FOR 0 < z < 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Behroozi, Peter S.; Wechsler, Risa H.; Conroy, Charlie

    2010-07-01

    We conduct a comprehensive analysis of the relationship between central galaxies and their host dark matter halos, as characterized by the stellar mass-halo mass (SM-HM) relation, with rigorous consideration of uncertainties. Our analysis focuses on results from the abundance matching technique, which assumes that every dark matter halo or subhalo above a specific mass threshold hosts one galaxy. We provide a robust estimate of the SM-HM relation for 0 < z < 1 and discuss the quantitative effects of uncertainties in observed galaxy stellar mass functions (including stellar mass estimates and counting uncertainties), halo mass functions (including cosmology and uncertainties from substructure), and the abundance matching technique used to link galaxies to halos (including scatter in this connection). Our analysis results in a robust estimate of the SM-HM relation and its evolution from z = 0 to z = 4. The shape and the evolution are well constrained for z < 1. The largest uncertainties at these redshifts are due to stellar mass estimates (0.25 dex uncertainty in normalization); however, failure to account for scatter in stellar masses at fixed halo mass can lead to errors of similar magnitude in the SM-HM relation for central galaxies in massive halos. We also investigate the SM-HM relation to z = 4, although the shape of the relation at higher redshifts remains fairly unconstrained when uncertainties are taken into account. We find that the integrated star formation at a given halo mass peaks at 10%-20% of available baryons for all redshifts from 0 to 4. This peak occurs at a halo mass of 7 x 10^11 M_sun at z = 0, and this mass increases by a factor of 5 to z = 4. At lower and higher masses, star formation is substantially less efficient, with stellar mass scaling as M_* ~ M_h^2.3 at low masses and M_* ~ M_h^0.29 at high masses. 
The typical stellar mass for halos with mass less than 10^12 M_sun has increased by 0.3-0.45 dex since z ~ 1. These results will provide a powerful tool to inform galaxy evolution models.
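
The abundance matching technique in its simplest (zero-scatter) form is a rank-order pairing; the mass ranges below are hypothetical, and real analyses match cumulative number densities rather than equal-length samples:

```python
import numpy as np

rng = np.random.default_rng(5)

# Rank halos by mass and galaxies by stellar mass, then pair one-to-one.
halo_mass = 10 ** rng.uniform(11.0, 14.0, 1000)     # Msun (hypothetical)
stellar_mass = 10 ** rng.uniform(8.0, 11.5, 1000)   # Msun (hypothetical)

sm_of_halo = np.empty_like(halo_mass)
sm_of_halo[np.argsort(halo_mass)] = np.sort(stellar_mass)
# The most massive halo now hosts the most massive galaxy, and so on down.

order = np.argsort(halo_mass)
smhm_is_monotonic = bool(np.all(np.diff(sm_of_halo[order]) >= 0.0))
```

Adding scatter in stellar mass at fixed halo mass, one of the key uncertainties the paper quantifies, breaks this strict monotonicity and flattens the relation at the massive end.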

  10. Incorporating uncertainty into mercury-offset decisions with a probabilistic network for National Pollutant Discharge Elimination System permit holders: an interim report

    USGS Publications Warehouse

    Wood, Alexander

    2004-01-01

    This interim report describes an alternative approach for evaluating the efficacy of using mercury (Hg) offsets to improve water quality. Hg-offset programs may allow dischargers facing higher pollution-control costs to meet their regulatory obligations by making more cost-effective pollutant-reduction decisions. Efficient Hg management requires methods to translate that science and economics into a regulatory decision framework. This report documents the work in progress by the U.S. Geological Survey's Western Geographic Science Center, in collaboration with Stanford University, toward developing this decision framework to help managers, regulators, and other stakeholders decide whether offsets can cost-effectively meet the Hg total maximum daily load (TMDL) requirements in the Sacramento River watershed. Two key approaches being considered are: (1) a probabilistic approach that explicitly incorporates scientific uncertainty, cost information, and value judgments; and (2) a quantitative approach that captures uncertainty in testing the feasibility of Hg offsets. Current fate- and transport-process models commonly attempt to predict chemical transformations and transport pathways deterministically. However, the physical, chemical, and biologic processes controlling the fate and transport of Hg in aquatic environments are complex and poorly understood. Deterministic models of Hg environmental behavior contain large uncertainties, reflecting this lack of understanding. The uncertainty in these underlying physical processes may produce similarly large uncertainties in the decisionmaking process. However, decisions about control strategies are still being made despite the large uncertainties in current Hg loadings, the relations between total Hg (HgT) loading and methylmercury (MeHg) formation, and the relations between control efforts and Hg content in fish. 
The research presented here focuses on an alternative analytical approach to the current use of safety factors and deterministic methods for Hg TMDL decision support, one that is fully compatible with an adaptive management approach. This alternative approach uses empirical data and informed judgment to provide a scientific and technical basis for helping National Pollutant Discharge Elimination System (NPDES) permit holders make management decisions. An Hg-offset system would be an option if a wastewater-treatment plant could not achieve NPDES permit requirements for HgT reduction. We develop a probabilistic decision-analytical model consisting of three submodels for HgT loading, MeHg, and cost mitigation within a Bayesian network that integrates information of varying rigor and detail into a simple model of a complex system. Hg processes are identified and quantified by using a combination of historical data, statistical models, and expert judgment. Such an integrated approach to uncertainty analysis allows easy updating of prediction and inference when observations of model variables are made. We demonstrate our approach with data from the Cache Creek watershed (a subbasin of the Sacramento River watershed). The empirical models used to generate the needed probability distributions are based on the same empirical models currently being used by the Central Valley Regional Water Quality Control Cache Creek Hg TMDL working group. The significant difference is that input uncertainty and error are explicitly included in the model and propagated throughout its algorithms. This work demonstrates how to integrate uncertainty into the complex and highly uncertain Hg TMDL decisionmaking process. The various sources of uncertainty are propagated as decision risk that allows decisionmakers to simultaneously consider uncertainties in remediation/implementation costs while attempting to meet environmental/ecologic targets. We must note that this research is ongoing. 
As more data are collected, the HgT and cost-mitigation submodels are updated and the uncer
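
The updating behavior of a Bayesian network, revising belief in one node when evidence arrives at another, can be shown with a minimal two-node chain; all probabilities below are hypothetical, not the report's calibrated HgT/MeHg/cost submodels:

```python
# Minimal two-node Bayesian update, a toy stand-in for the report's
# network: evidence on fish mercury updates belief about the loading state.
p_high_load = 0.3                           # prior from the loading submodel
p_high_fish_hg = {True: 0.8, False: 0.2}    # P(high fish Hg | load state)

# Observe high fish Hg; Bayes' rule gives the posterior on loading:
num = p_high_fish_hg[True] * p_high_load
den = num + p_high_fish_hg[False] * (1.0 - p_high_load)
posterior_high_load = num / den
```

In the full network this same mechanism lets new monitoring data update the loading, methylation, and cost submodels simultaneously, which is what makes the approach compatible with adaptive management.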

  11. Developing large-scale forcing data for single-column and cloud-resolving models from the Mixed-Phase Arctic Cloud Experiment

    DOE PAGES

    Xie, Shaocheng; Klein, Stephen A.; Zhang, Minghua; ...

    2006-10-05

    This study represents an effort to develop Single-Column Model (SCM) and Cloud-Resolving Model large-scale forcing data from a sounding array in the high latitudes. An objective variational analysis approach is used to process data collected from the Atmospheric Radiation Measurement Program (ARM) Mixed-Phase Arctic Cloud Experiment (M-PACE), which was conducted over the North Slope of Alaska in October 2004. In this method, the observed surface and top-of-atmosphere measurements are used as constraints to adjust the sounding data from M-PACE so as to conserve column-integrated mass, heat, moisture, and momentum. Several important technical and scientific issues related to the data analysis are discussed. It is shown that the analyzed data reasonably describe the dynamic and thermodynamic features of the Arctic cloud systems observed during M-PACE. Uncertainties in the analyzed forcing fields are roughly estimated by examining the sensitivity of those fields to uncertainties in the upper-air data and surface constraints used in the analysis. Impacts of the uncertainties in the analyzed forcing data on SCM simulations are discussed. Results from the SCM tests indicate that the bulk features of the observed Arctic cloud systems can be captured qualitatively well using the forcing data derived in this study, and major model errors can be detected despite the uncertainties that exist in the forcing data, as illustrated by the sensitivity tests. Lastly, the possibility of using the European Centre for Medium-Range Weather Forecasts analysis data to derive the large-scale forcing over the Arctic region is explored.
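
Constraining analyzed profiles to satisfy column-integrated budgets is, at its core, a minimum-norm adjustment under a linear constraint; a small Lagrange-multiplier sketch (the three "layer values", their error variances, and the constraint value are hypothetical):

```python
import numpy as np

# Toy variational adjustment: nudge analyzed values x0 as little as
# possible (inverse-variance weighted) so that they satisfy a linear
# budget constraint A x = b (e.g., a column-integrated moisture budget).
def constrained_adjust(x0, var, A, b):
    # Solution of min (x - x0)' C^-1 (x - x0) s.t. A x = b, C = diag(var):
    C = np.diag(var)
    lam = np.linalg.solve(A @ C @ A.T, A @ x0 - b)
    return x0 - C @ A.T @ lam

x0 = np.array([2.0, 3.0, 5.0])       # raw layer values from soundings
var = np.array([0.1, 0.4, 0.4])      # analysis error variances
A = np.array([[1.0, 1.0, 1.0]])      # column-integral (budget) operator
b = np.array([9.0])                  # surface/TOA-derived constraint

x = constrained_adjust(x0, var, A, b)
```

The inverse-variance weighting means better-constrained observations are nudged least, which is how surface and top-of-atmosphere measurements discipline the noisier sounding data.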

  12. Jet energy measurement and its systematic uncertainty in proton-proton collisions at √s = 7 TeV with the ATLAS detector

    NASA Astrophysics Data System (ADS)

    Aad, G.; Abajyan, T.; Abbott, B.; Abdallah, J.; Abdel Khalek, S.; Abdinov, O.; Aben, R.; Abi, B.; Abolins, M.; AbouZeid, O. S.; Abramowicz, H.; Abreu, H.; Abulaiti, Y.; Acharya, B. S.; Adamczyk, L.; Adams, D. L.; Addy, T. N.; Adelman, J.; Adomeit, S.; Adye, T.; Aefsky, S.; Agatonovic-Jovin, T.; Aguilar-Saavedra, J. A.; Agustoni, M.; Ahlen, S. P.; Ahmad, A.; Ahmadov, F.; Aielli, G.; Åkesson, T. P. A.; Akimoto, G.; Akimov, A. V.; Alam, M. A.; Albert, J.; Albrand, S.; Alconada Verzini, M. J.; Aleksa, M.; Aleksandrov, I. N.; Alessandria, F.; Alexa, C.; Alexander, G.; Alexandre, G.; Alexopoulos, T.; Alhroob, M.; Aliev, M.; Alimonti, G.; Alio, L.; Alison, J.; Allbrooke, B. M. M.; Allison, L. J.; Allport, P. P.; Allwood-Spiers, S. E.; Almond, J.; Aloisio, A.; Alon, R.; Alonso, A.; Alonso, F.; Altheimer, A.; Alvarez Gonzalez, B.; Alviggi, M. G.; Amako, K.; Amaral Coutinho, Y.; Amelung, C.; Ammosov, V. V.; Amor Dos Santos, S. P.; Amorim, A.; Amoroso, S.; Amram, N.; Amundsen, G.; Anastopoulos, C.; Ancu, L. S.; Andari, N.; Andeen, T.; Anders, C. F.; Anders, G.; Anderson, K. J.; Andreazza, A.; Andrei, V.; Anduaga, X. S.; Angelidakis, S.; Anger, P.; Angerami, A.; Anghinolfi, F.; Anisenkov, A. V.; Anjos, N.; Annovi, A.; Antonaki, A.; Antonelli, M.; Antonov, A.; Antos, J.; Anulli, F.; Aoki, M.; Aperio Bella, L.; Apolle, R.; Arabidze, G.; Aracena, I.; Arai, Y.; Arce, A. T. H.; Arfaoui, S.; Arguin, J.-F.; Argyropoulos, S.; Arik, E.; Arik, M.; Armbruster, A. J.; Arnaez, O.; Arnal, V.; Arslan, O.; Artamonov, A.; Artoni, G.; Asai, S.; Asbah, N.; Ask, S.; Åsman, B.; Asquith, L.; Assamagan, K.; Astalos, R.; Astbury, A.; Atkinson, M.; Atlay, N. B.; Auerbach, B.; Auge, E.; Augsten, K.; Aurousseau, M.; Avolio, G.; Azuelos, G.; Azuma, Y.; Baak, M. A.; Bacci, C.; Bach, A. M.; Bachacou, H.; Bachas, K.; Backes, M.; Backhaus, M.; Backus Mayes, J.; Badescu, E.; Bagiacchi, P.; Bagnaia, P.; Bai, Y.; Bailey, D. C.; Bain, T.; Baines, J. T.; Baker, O. 
K.; Baker, S.; Balek, P.; Balli, F.; Banas, E.; Banerjee, Sw.; Banfi, D.; Bangert, A.; Bansal, V.; Bansil, H. S.; Barak, L.; Baranov, S. P.; Barber, T.; Barberio, E. L.; Barberis, D.; Barbero, M.; Barillari, T.; Barisonzi, M.; Barklow, T.; Barlow, N.; Barnett, B. M.; Barnett, R. M.; Baroncelli, A.; Barone, G.; Barr, A. J.; Barreiro, F.; Barreiro Guimarães da Costa, J.; Bartoldus, R.; Barton, A. E.; Bartos, P.; Bartsch, V.; Bassalat, A.; Basye, A.; Bates, R. L.; Batkova, L.; Batley, J. R.; Battistin, M.; Bauer, F.; Bawa, H. S.; Beau, T.; Beauchemin, P. H.; Beccherle, R.; Bechtle, P.; Beck, H. P.; Becker, K.; Becker, S.; Beckingham, M.; Beddall, A. J.; Beddall, A.; Bedikian, S.; Bednyakov, V. A.; Bee, C. P.; Beemster, L. J.; Beermann, T. A.; Begel, M.; Behr, K.; Belanger-Champagne, C.; Bell, P. J.; Bell, W. H.; Bella, G.; Bellagamba, L.; Bellerive, A.; Bellomo, M.; Belloni, A.; Beloborodova, O. L.; Belotskiy, K.; Beltramello, O.; Benary, O.; Benchekroun, D.; Bendtz, K.; Benekos, N.; Benhammou, Y.; Benhar Noccioli, E.; Benitez Garcia, J. A.; Benjamin, D. P.; Bensinger, J. R.; Benslama, K.; Bentvelsen, S.; Berge, D.; Bergeaas Kuutmann, E.; Berger, N.; Berghaus, F.; Berglund, E.; Beringer, J.; Bernard, C.; Bernat, P.; Bernhard, R.; Bernius, C.; Bernlochner, F. U.; Berry, T.; Berta, P.; Bertella, C.; Bertolucci, F.; Besana, M. I.; Besjes, G. J.; Bessidskaia, O.; Besson, N.; Bethke, S.; Bhimji, W.; Bianchi, R. M.; Bianchini, L.; Bianco, M.; Biebel, O.; Bieniek, S. P.; Bierwagen, K.; Biesiada, J.; Biglietti, M.; Bilbao De Mendizabal, J.; Bilokon, H.; Bindi, M.; Binet, S.; Bingul, A.; Bini, C.; Bittner, B.; Black, C. W.; Black, J. E.; Black, K. M.; Blackburn, D.; Blair, R. E.; Blanchard, J.-B.; Blazek, T.; Bloch, I.; Blocker, C.; Blocki, J.; Blum, W.; Blumenschein, U.; Bobbink, G. J.; Bobrovnikov, V. S.; Bocchetta, S. S.; Bocci, A.; Boddy, C. R.; Boehler, M.; Boek, J.; Boek, T. T.; Boelaert, N.; Bogaerts, J. A.; Bogdanchikov, A. 
G.; Bogouch, A.; Bohm, C.; Bohm, J.; Boisvert, V.; Bold, T.; Boldea, V.; Boldyrev, A. S.; Bolnet, N. M.; Bomben, M.; Bona, M.; Boonekamp, M.; Bordoni, S.; Borer, C.; Borisov, A.; Borissov, G.; Borri, M.; Borroni, S.; Bortfeldt, J.; Bortolotto, V.; Bos, K.; Boscherini, D.; Bosman, M.; Boterenbrood, H.; Bouchami, J.; Boudreau, J.; Bouhova-Thacker, E. V.; Boumediene, D.; Bourdarios, C.; Bousson, N.; Boutouil, S.; Boveia, A.; Boyd, J.; Boyko, I. R.; Bozovic-Jelisavcic, I.; Bracinik, J.; Branchini, P.; Brandt, A.; Brandt, G.; Brandt, O.; Bratzler, U.; Brau, B.; Brau, J. E.; Braun, H. M.; Brazzale, S. F.; Brelier, B.; Brendlinger, K.; Brenner, R.; Bressler, S.; Bristow, T. M.; Britton, D.; Brochu, F. M.; Brock, I.; Brock, R.; Broggi, F.; Bromberg, C.; Bronner, J.; Brooijmans, G.; Brooks, T.; Brooks, W. K.; Brosamer, J.; Brost, E.; Brown, G.; Brown, J.; Bruckman de Renstrom, P. A.; Bruncko, D.; Bruneliere, R.; Brunet, S.; Bruni, A.; Bruni, G.; Bruschi, M.; Bryngemark, L.; Buanes, T.; Buat, Q.; Bucci, F.; Buchholz, P.; Buckingham, R. M.; Buckley, A. G.; Buda, S. I.; Budagov, I. A.; Budick, B.; Buehrer, F.; Bugge, L.; Bugge, M. K.; Bulekov, O.; Bundock, A. C.; Bunse, M.; Burckhart, H.; Burdin, S.; Burgess, T.; Burghgrave, B.; Burke, S.; Burmeister, I.; Busato, E.; Büscher, V.; Bussey, P.; Buszello, C. P.; Butler, B.; Butler, J. M.; Butt, A. I.; Buttar, C. M.; Butterworth, J. M.; Buttinger, W.; Buzatu, A.; Byszewski, M.; Cabrera Urbán, S.; Caforio, D.; Cakir, O.; Calafiura, P.; Calderini, G.; Calfayan, P.; Calkins, R.; Caloba, L. P.; Caloi, R.; Calvet, D.; Calvet, S.; Camacho Toro, R.; Camarri, P.; Cameron, D.; Caminada, L. M.; Caminal Armadans, R.; Campana, S.; Campanelli, M.; Canale, V.; Canelli, F.; Canepa, A.; Cantero, J.; Cantrill, R.; Cao, T.; Capeans Garrido, M. D. M.; Caprini, I.; Caprini, M.; Capua, M.; Caputo, R.; Cardarelli, R.; Carli, T.; Carlino, G.; Carminati, L.; Caron, S.; Carquin, E.; Carrillo-Montoya, G. D.; Carter, A. A.; Carter, J. 
R.; Carvalho, J.; Casadei, D.; Casado, M. P.; Caso, C.; Castaneda-Miranda, E.; Castelli, A.; Castillo Gimenez, V.; Castro, N. F.; Catastini, P.; Catinaccio, A.; Catmore, J. R.; Cattai, A.; Cattani, G.; Caughron, S.; Cavaliere, V.; Cavalli, D.; Cavalli-Sforza, M.; Cavasinni, V.; Ceradini, F.; Cerio, B.; Cerny, K.; Cerqueira, A. S.; Cerri, A.; Cerrito, L.; Cerutti, F.; Cervelli, A.; Cetin, S. A.; Chafaq, A.; Chakraborty, D.; Chalupkova, I.; Chan, K.; Chang, P.; Chapleau, B.; Chapman, J. D.; Charfeddine, D.; Charlton, D. G.; Chavda, V.; Chavez Barajas, C. A.; Cheatham, S.; Chekanov, S.; Chekulaev, S. V.; Chelkov, G. A.; Chelstowska, M. A.; Chen, C.; Chen, H.; Chen, K.; Chen, L.; Chen, S.; Chen, X.; Chen, Y.; Cheng, Y.; Cheplakov, A.; Cherkaoui El Moursli, R.; Chernyatin, V.; Cheu, E.; Chevalier, L.; Chiarella, V.; Chiefari, G.; Childers, J. T.; Chilingarov, A.; Chiodini, G.; Chisholm, A. S.; Chislett, R. T.; Chitan, A.; Chizhov, M. V.; Chouridou, S.; Chow, B. K. B.; Christidi, I. A.; Chromek-Burckhart, D.; Chu, M. L.; Chudoba, J.; Ciapetti, G.; Ciftci, A. K.; Ciftci, R.; Cinca, D.; Cindro, V.; Ciocio, A.; Cirilli, M.; Cirkovic, P.; Citron, Z. H.; Citterio, M.; Ciubancan, M.; Clark, A.; Clark, P. J.; Clarke, R. N.; Cleland, W.; Clemens, J. C.; Clement, B.; Clement, C.; Coadou, Y.; Cobal, M.; Coccaro, A.; Cochran, J.; Coelli, S.; Coffey, L.; Cogan, J. G.; Coggeshall, J.; Colas, J.; Cole, B.; Cole, S.; Colijn, A. P.; Collins-Tooth, C.; Collot, J.; Colombo, T.; Colon, G.; Compostella, G.; Conde Muiño, P.; Coniavitis, E.; Conidi, M. C.; Connelly, I. A.; Consonni, S. M.; Consorti, V.; Constantinescu, S.; Conta, C.; Conti, G.; Conventi, F.; Cooke, M.; Cooper, B. D.; Cooper-Sarkar, A. M.; Cooper-Smith, N. J.; Copic, K.; Cornelissen, T.; Corradi, M.; Corriveau, F.; Corso-Radu, A.; Cortes-Gonzalez, A.; Cortiana, G.; Costa, G.; Costa, M. J.; Costanzo, D.; Côté, D.; Cottin, G.; Courneyea, L.; Cowan, G.; Cox, B. 
E.; Cranmer, K.; Cree, G.; Crépé-Renaudin, S.; Crescioli, F.; Crispin Ortuzar, M.; Cristinziani, M.; Crosetti, G.; Cuciuc, C.-M.; Cuenca Almenar, C.; Cuhadar Donszelmann, T.; Cummings, J.; Curatolo, M.; Cuthbert, C.; Czirr, H.; Czodrowski, P.; Czyczula, Z.; D'Auria, S.; D'Onofrio, M.; D'Orazio, A.; Da Cunha Sargedas De Sousa, M. J.; Da Via, C.; Dabrowski, W.; Dafinca, A.; Dai, T.; Dallaire, F.; Dallapiccola, C.; Dam, M.; Daniells, A. C.; Dano Hoffmann, M.; Dao, V.; Darbo, G.; Darlea, G. L.; Darmora, S.; Dassoulas, J. A.; Davey, W.; David, C.; Davidek, T.; Davies, E.; Davies, M.; Davignon, O.; Davison, A. R.; Davygora, Y.; Dawe, E.; Dawson, I.; Daya-Ishmukhametova, R. K.; De, K.; de Asmundis, R.; De Castro, S.; De Cecco, S.; de Graat, J.; De Groot, N.; de Jong, P.; De La Taille, C.; De la Torre, H.; De Lorenzi, F.; De Nooij, L.; De Pedis, D.; De Salvo, A.; De Sanctis, U.; De Santo, A.; De Vivie De Regie, J. B.; De Zorzi, G.; Dearnaley, W. J.; Debbe, R.; Debenedetti, C.; Dechenaux, B.; Dedovich, D. V.; Degenhardt, J.; Del Peso, J.; Del Prete, T.; Delemontex, T.; Deliot, F.; Deliyergiyev, M.; Dell'Acqua, A.; Dell'Asta, L.; Della Pietra, M.; della Volpe, D.; Delmastro, M.; Delsart, P. A.; Deluca, C.; Demers, S.; Demichev, M.; Demilly, A.; Demirkoz, B.; Denisov, S. P.; Derendarz, D.; Derkaoui, J. E.; Derue, F.; Dervan, P.; Desch, K.; Deviveiros, P. O.; Dewhurst, A.; DeWilde, B.; Dhaliwal, S.; Dhullipudi, R.; Di Ciaccio, A.; Di Ciaccio, L.; Di Domenico, A.; Di Donato, C.; Di Girolamo, A.; Di Girolamo, B.; Di Mattia, A.; Di Micco, B.; Di Nardo, R.; Di Simone, A.; Di Sipio, R.; Di Valentino, D.; Diaz, M. A.; Diehl, E. B.; Dietrich, J.; Dietzsch, T. A.; Diglio, S.; Dindar Yagci, K.; Dingfelder, J.; Dionisi, C.; Dita, P.; Dita, S.; Dittus, F.; Djama, F.; Djobava, T.; do Vale, M. A. B.; Do Valle Wemans, A.; Doan, T. K. O.; Dobos, D.; Dobson, E.; Dodd, J.; Doglioni, C.; Doherty, T.; Dohmae, T.; Dolejsi, J.; Dolezal, Z.; Dolgoshein, B. 
A.; Donadelli, M.; Donati, S.; Dondero, P.; Donini, J.; Dopke, J.; Doria, A.; Dos Anjos, A.; Dotti, A.; Dova, M. T.; Doyle, A. T.; Dris, M.; Dubbert, J.; Dube, S.; Dubreuil, E.; Duchovni, E.; Duckeck, G.; Ducu, O. A.; Duda, D.; Dudarev, A.; Dudziak, F.; Duflot, L.; Duguid, L.; Dührssen, M.; Dunford, M.; Duran Yildiz, H.; Düren, M.; Dwuznik, M.; Ebke, J.; Edson, W.; Edwards, C. A.; Edwards, N. C.; Ehrenfeld, W.; Eifert, T.; Eigen, G.; Einsweiler, K.; Eisenhandler, E.; Ekelof, T.; El Kacimi, M.; Ellert, M.; Elles, S.; Ellinghaus, F.; Ellis, K.; Ellis, N.; Elmsheuser, J.; Elsing, M.; Emeliyanov, D.; Enari, Y.; Endner, O. C.; Endo, M.; Engelmann, R.; Erdmann, J.; Ereditato, A.; Eriksson, D.; Ernis, G.; Ernst, J.; Ernst, M.; Ernwein, J.; Errede, D.; Errede, S.; Ertel, E.; Escalier, M.; Esch, H.; Escobar, C.; Espinal Curull, X.; Esposito, B.; Etienne, F.; Etienvre, A. I.; Etzion, E.; Evangelakou, D.; Evans, H.; Fabbri, L.; Facini, G.; Fakhrutdinov, R. M.; Falciano, S.; Fang, Y.; Fanti, M.; Farbin, A.; Farilla, A.; Farooque, T.; Farrell, S.; Farrington, S. M.; Farthouat, P.; Fassi, F.; Fassnacht, P.; Fassouliotis, D.; Fatholahzadeh, B.; Favareto, A.; Fayard, L.; Federic, P.; Fedin, O. L.; Fedorko, W.; Fehling-Kaschek, M.; Feligioni, L.; Feng, C.; Feng, E. J.; Feng, H.; Fenyuk, A. B.; Fernando, W.; Ferrag, S.; Ferrando, J.; Ferrara, V.; Ferrari, A.; Ferrari, P.; Ferrari, R.; Ferreira de Lima, D. E.; Ferrer, A.; Ferrere, D.; Ferretti, C.; Ferretto Parodi, A.; Fiascaris, M.; Fiedler, F.; Filipčič, A.; Filipuzzi, M.; Filthaut, F.; Fincke-Keeler, M.; Finelli, K. D.; Fiolhais, M. C. N.; Fiorini, L.; Firan, A.; Fischer, J.; Fisher, M. J.; Fitzgerald, E. A.; Flechl, M.; Fleck, I.; Fleischmann, P.; Fleischmann, S.; Fletcher, G. T.; Fletcher, G.; Flick, T.; Floderus, A.; Flores Castillo, L. R.; Florez Bustos, A. C.; Flowerdew, M. 
J.; Fonseca Martin, T.; Formica, A.; Forti, A.; Fortin, D.; Fournier, D.; Fox, H.; Francavilla, P.; Franchini, M.; Franchino, S.; Francis, D.; Franklin, M.; Franz, S.; Fraternali, M.; Fratina, S.; French, S. T.; Friedrich, C.; Friedrich, F.; Froidevaux, D.; Frost, J. A.; Fukunaga, C.; Fullana Torregrosa, E.; Fulsom, B. G.; Fuster, J.; Gabaldon, C.; Gabizon, O.; Gabrielli, A.; Gabrielli, A.; Gadatsch, S.; Gadfort, T.; Gadomski, S.; Gagliardi, G.; Gagnon, P.; Galea, C.; Galhardo, B.; Gallas, E. J.; Gallo, V.; Gallop, B. J.; Gallus, P.; Galster, G.; Gan, K. K.; Gandrajula, R. P.; Gao, J.; Gao, Y. S.; Garay Walls, F. M.; Garberson, F.; García, C.; García Navarro, J. E.; Garcia-Sciveres, M.; Gardner, R. W.; Garelli, N.; Garonne, V.; Gatti, C.; Gaudio, G.; Gaur, B.; Gauthier, L.; Gauzzi, P.; Gavrilenko, I. L.; Gay, C.; Gaycken, G.; Gazis, E. N.; Ge, P.; Gecse, Z.; Gee, C. N. P.; Geerts, D. A. A.; Geich-Gimbel, Ch.; Gellerstedt, K.; Gemme, C.; Gemmell, A.; Genest, M. H.; Gentile, S.; George, M.; George, S.; Gerbaudo, D.; Gershon, A.; Ghazlane, H.; Ghodbane, N.; Giacobbe, B.; Giagu, S.; Giangiobbe, V.; Giannetti, P.; Gianotti, F.; Gibbard, B.; Gibson, S. M.; Gilchriese, M.; Gillam, T. P. S.; Gillberg, D.; Gillman, A. R.; Gingrich, D. M.; Giokaris, N.; Giordani, M. P.; Giordano, R.; Giorgi, F. M.; Giovannini, P.; Giraud, P. F.; Giugni, D.; Giuliani, C.; Giunta, M.; Gjelsten, B. K.; Gkialas, I.; Gladilin, L. K.; Glasman, C.; Glatzer, J.; Glazov, A.; Glonti, G. L.; Goblirsch-Kolb, M.; Goddard, J. R.; Godfrey, J.; Godlewski, J.; Goeringer, C.; Goldfarb, S.; Golling, T.; Golubkov, D.; Gomes, A.; Gomez Fajardo, L. S.; Gonçalo, R.; Goncalves Pinto Firmino Da Costa, J.; Gonella, L.; González de la Hoz, S.; Gonzalez Parra, G.; Gonzalez Silva, M. L.; Gonzalez-Sevilla, S.; Goodson, J. J.; Goossens, L.; Gorbounov, P. A.; Gordon, H. A.; Gorelov, I.; Gorfine, G.; Gorini, B.; Gorini, E.; Gorišek, A.; Gornicki, E.; Goshaw, A. T.; Gössling, C.; Gostkin, M. 
I.; Gouighri, M.; Goujdami, D.; Goulette, M. P.; Goussiou, A. G.; Goy, C.; Gozpinar, S.; Grabas, H. M. X.; Graber, L.; Grabowska-Bold, I.; Grafström, P.; Grahn, K.-J.; Gramling, J.; Gramstad, E.; Grancagnolo, F.; Grancagnolo, S.; Grassi, V.; Gratchev, V.; Gray, H. M.; Gray, J. A.; Graziani, E.; Grebenyuk, O. G.; Greenwood, Z. D.; Gregersen, K.; Gregor, I. M.; Grenier, P.; Griffiths, J.; Grigalashvili, N.; Grillo, A. A.; Grimm, K.; Grinstein, S.; Gris, Ph.; Grishkevich, Y. V.; Grivaz, J.-F.; Grohs, J. P.; Grohsjean, A.; Gross, E.; Grosse-Knetter, J.; Grossi, G. C.; Groth-Jensen, J.; Grout, Z. J.; Grybel, K.; Guescini, F.; Guest, D.; Gueta, O.; Guicheney, C.; Guido, E.; Guillemin, T.; Guindon, S.; Gul, U.; Gumpert, C.; Gunther, J.; Guo, J.; Gupta, S.; Gutierrez, P.; Gutierrez Ortiz, N. G.; Gutschow, C.; Guttman, N.; Guyot, C.; Gwenlan, C.; Gwilliam, C. B.; Haas, A.; Haber, C.; Hadavand, H. K.; Haefner, P.; Hageboeck, S.; Hajduk, Z.; Hakobyan, H.; Haleem, M.; Hall, D.; Halladjian, G.; Hamacher, K.; Hamal, P.; Hamano, K.; Hamer, M.; Hamilton, A.; Hamilton, S.; Han, L.; Hanagaki, K.; Hanawa, K.; Hance, M.; Hanke, P.; Hansen, J. R.; Hansen, J. B.; Hansen, J. D.; Hansen, P. H.; Hansson, P.; Hara, K.; Hard, A. S.; Harenberg, T.; Harkusha, S.; Harper, D.; Harrington, R. D.; Harris, O. M.; Harrison, P. F.; Hartjes, F.; Harvey, A.; Hasegawa, S.; Hasegawa, Y.; Hassani, S.; Haug, S.; Hauschild, M.; Hauser, R.; Havranek, M.; Hawkes, C. M.; Hawkings, R. J.; Hawkins, A. D.; Hayashi, T.; Hayden, D.; Hays, C. P.; Hayward, H. S.; Haywood, S. J.; Head, S. J.; Heck, T.; Hedberg, V.; Heelan, L.; Heim, S.; Heinemann, B.; Heisterkamp, S.; Hejbal, J.; Helary, L.; Heller, C.; Heller, M.; Hellman, S.; Hellmich, D.; Helsens, C.; Henderson, J.; Henderson, R. C. W.; Henrichs, A.; Henriques Correia, A. M.; Henrot-Versille, S.; Hensel, C.; Herbert, G. H.; Hernandez, C. M.; Hernández Jiménez, Y.; Herrberg-Schubert, R.; Herten, G.; Hertenberger, R.; Hervas, L.; Hesketh, G. G.; Hessey, N. 
P.; Hickling, R.; Higón-Rodriguez, E.; Hill, J. C.; Hiller, K. H.; Hillert, S.; Hillier, S. J.; Hinchliffe, I.; Hines, E.; Hirose, M.; Hirschbuehl, D.; Hobbs, J.; Hod, N.; Hodgkinson, M. C.; Hodgson, P.; Hoecker, A.; Hoeferkamp, M. R.; Hoffman, J.; Hoffmann, D.; Hofmann, J. I.; Hohlfeld, M.; Holmes, T. R.; Hong, T. M.; Hooft van Huysduynen, L.; Hostachy, J.-Y.; Hou, S.; Hoummada, A.; Howard, J.; Howarth, J.; Hrabovsky, M.; Hristova, I.; Hrivnac, J.; Hryn'ova, T.; Hsu, P. J.; Hsu, S.-C.; Hu, D.; Hu, X.; Huang, Y.; Hubacek, Z.; Hubaut, F.; Huegging, F.; Huettmann, A.; Huffman, T. B.; Hughes, E. W.; Hughes, G.; Huhtinen, M.; Hülsing, T. A.; Hurwitz, M.; Huseynov, N.; Huston, J.; Huth, J.; Iacobucci, G.; Iakovidis, G.; Ibragimov, I.; Iconomidou-Fayard, L.; Idarraga, J.; Ideal, E.; Iengo, P.; Igonkina, O.; Iizawa, T.; Ikegami, Y.; Ikematsu, K.; Ikeno, M.; Iliadis, D.; Ilic, N.; Inamaru, Y.; Ince, T.; Ioannou, P.; Iodice, M.; Iordanidou, K.; Ippolito, V.; Irles Quiles, A.; Isaksson, C.; Ishino, M.; Ishitsuka, M.; Ishmukhametov, R.; Issever, C.; Istin, S.; Ivashin, A. V.; Iwanski, W.; Iwasaki, H.; Izen, J. M.; Izzo, V.; Jackson, B.; Jackson, J. N.; Jackson, M.; Jackson, P.; Jaekel, M. R.; Jain, V.; Jakobs, K.; Jakobsen, S.; Jakoubek, T.; Jakubek, J.; Jamin, D. O.; Jana, D. K.; Jansen, E.; Jansen, H.; Janssen, J.; Janus, M.; Jared, R. C.; Jarlskog, G.; Jeanty, L.; Jeng, G.-Y.; Jen-La Plante, I.; Jennens, D.; Jenni, P.; Jentzsch, J.; Jeske, C.; Jézéquel, S.; Jha, M. K.; Ji, H.; Ji, W.; Jia, J.; Jiang, Y.; Jimenez Belenguer, M.; Jin, S.; Jinaru, A.; Jinnouchi, O.; Joergensen, M. D.; Joffe, D.; Johansson, K. E.; Johansson, P.; Johns, K. A.; Jon-And, K.; Jones, G.; Jones, R. W. L.; Jones, T. J.; Jorge, P. M.; Joshi, K. D.; Jovicevic, J.; Ju, X.; Jung, C. A.; Jungst, R. M.; Jussel, P.; Juste Rozas, A.; Kaci, M.; Kaczmarska, A.; Kadlecik, P.; Kado, M.; Kagan, H.; Kagan, M.; Kajomovitz, E.; Kalinin, S.; Kama, S.; Kanaya, N.; Kaneda, M.; Kaneti, S.; Kanno, T.; Kantserov, V. 
A.; Kanzaki, J.; Kaplan, B.; Kapliy, A.; Kar, D.; Karakostas, K.; Karastathis, N.; Karnevskiy, M.; Karpov, S. N.; Karthik, K.; Kartvelishvili, V.; Karyukhin, A. N.; Kashif, L.; Kasieczka, G.; Kass, R. D.; Kastanas, A.; Kataoka, Y.; Katre, A.; Katzy, J.; Kaushik, V.; Kawagoe, K.; Kawamoto, T.; Kawamura, G.; Kazama, S.; Kazanin, V. F.; Kazarinov, M. Y.; Keeler, R.; Keener, P. T.; Kehoe, R.; Keil, M.; Keller, J. S.; Keoshkerian, H.; Kepka, O.; Kerševan, B. P.; Kersten, S.; Kessoku, K.; Keung, J.; Khalil-zada, F.; Khandanyan, H.; Khanov, A.; Kharchenko, D.; Khodinov, A.; Khomich, A.; Khoo, T. J.; Khoriauli, G.; Khoroshilov, A.; Khovanskiy, V.; Khramov, E.; Khubua, J.; Kim, H.; Kim, S. H.; Kimura, N.; Kind, O.; King, B. T.; King, M.; King, R. S. B.; King, S. B.; Kirk, J.; Kiryunin, A. E.; Kishimoto, T.; Kisielewska, D.; Kitamura, T.; Kittelmann, T.; Kiuchi, K.; Kladiva, E.; Klein, M.; Klein, U.; Kleinknecht, K.; Klimek, P.; Klimentov, A.; Klingenberg, R.; Klinger, J. A.; Klinkby, E. B.; Klioutchnikova, T.; Klok, P. F.; Kluge, E.-E.; Kluit, P.; Kluth, S.; Kneringer, E.; Knoops, E. B. F. G.; Knue, A.; Kobayashi, T.; Kobel, M.; Kocian, M.; Kodys, P.; Koenig, S.; Koevesarki, P.; Koffas, T.; Koffeman, E.; Kogan, L. A.; Kohlmann, S.; Kohout, Z.; Kohriki, T.; Koi, T.; Kolanoski, H.; Koletsou, I.; Koll, J.; Komar, A. A.; Komori, Y.; Kondo, T.; Köneke, K.; König, A. C.; Kono, T.; Konoplich, R.; Konstantinidis, N.; Kopeliansky, R.; Koperny, S.; Köpke, L.; Kopp, A. K.; Korcyl, K.; Kordas, K.; Korn, A.; Korol, A. A.; Korolkov, I.; Korolkova, E. V.; Korotkov, V. A.; Kortner, O.; Kortner, S.; Kostyukhin, V. V.; Kotov, S.; Kotov, V. M.; Kotwal, A.; Kourkoumelis, C.; Kouskoura, V.; Koutsman, A.; Kowalewski, R.; Kowalski, T. Z.; Kozanecki, W.; Kozhin, A. S.; Kral, V.; Kramarenko, V. A.; Kramberger, G.; Krasny, M. W.; Krasznahorkay, A.; Kraus, J. 
K.; Kravchenko, A.; Kreiss, S.; Kretzschmar, J.; Kreutzfeldt, K.; Krieger, N.; Krieger, P.; Kroeninger, K.; Kroha, H.; Kroll, J.; Kroseberg, J.; Krstic, J.; Kruchonak, U.; Krüger, H.; Kruker, T.; Krumnack, N.; Krumshteyn, Z. V.; Kruse, A.; Kruse, M. C.; Kruskal, M.; Kubota, T.; Kuday, S.; Kuehn, S.; Kugel, A.; Kuhl, T.; Kukhtin, V.; Kulchitsky, Y.; Kuleshov, S.; Kuna, M.; Kunkle, J.; Kupco, A.; Kurashige, H.; Kurata, M.; Kurochkin, Y. A.; Kurumida, R.; Kus, V.; Kuwertz, E. S.; Kuze, M.; Kvita, J.; Kwee, R.; La Rosa, A.; La Rotonda, L.; Labarga, L.; Lablak, S.; Lacasta, C.; Lacava, F.; Lacey, J.; Lacker, H.; Lacour, D.; Lacuesta, V. R.; Ladygin, E.; Lafaye, R.; Laforge, B.; Lagouri, T.; Lai, S.; Laier, H.; Laisne, E.; Lambourne, L.; Lampen, C. L.; Lampl, W.; Lançon, E.; Landgraf, U.; Landon, M. P. J.; Lang, V. S.; Lange, C.; Lankford, A. J.; Lanni, F.; Lantzsch, K.; Lanza, A.; Laplace, S.; Lapoire, C.; Laporte, J. F.; Lari, T.; Larner, A.; Lassnig, M.; Laurelli, P.; Lavorini, V.; Lavrijsen, W.; Laycock, P.; Le, B. T.; Le Dortz, O.; Le Guirriec, E.; Le Menedeu, E.; LeCompte, T.; Ledroit-Guillon, F.; Lee, C. A.; Lee, H.; Lee, J. S. H.; Lee, S. C.; Lee, L.; Lefebvre, G.; Lefebvre, M.; Legger, F.; Leggett, C.; Lehan, A.; Lehmacher, M.; Lehmann Miotto, G.; Leister, A. G.; Leite, M. A. L.; Leitner, R.; Lellouch, D.; Lemmer, B.; Lendermann, V.; Leney, K. J. C.; Lenz, T.; Lenzen, G.; Lenzi, B.; Leone, R.; Leonhardt, K.; Leontsinis, S.; Leroy, C.; Lessard, J.-R.; Lester, C. G.; Lester, C. M.; Levêque, J.; Levin, D.; Levinson, L. J.; Lewis, A.; Lewis, G. H.; Leyko, A. M.; Leyton, M.; Li, B.; Li, B.; Li, H.; Li, H. L.; Li, S.; Li, X.; Liang, Z.; Liao, H.; Liberti, B.; Lichard, P.; Lie, K.; Liebal, J.; Liebig, W.; Limbach, C.; Limosani, A.; Limper, M.; Lin, S. C.; Linde, F.; Lindquist, B. E.; Linnemann, J. T.; Lipeles, E.; Lipniacka, A.; Lisovyi, M.; Liss, T. M.; Lissauer, D.; Lister, A.; Litke, A. M.; Liu, B.; Liu, D.; Liu, J. 
B.; Liu, K.; Liu, L.; Liu, M.; Liu, M.; Liu, Y.; Livan, M.; Livermore, S. S. A.; Lleres, A.; Llorente Merino, J.; Lloyd, S. L.; Lo Sterzo, F.; Lobodzinska, E.; Loch, P.; Lockman, W. S.; Loddenkoetter, T.; Loebinger, F. K.; Loevschall-Jensen, A. E.; Loginov, A.; Loh, C. W.; Lohse, T.; Lohwasser, K.; Lokajicek, M.; Lombardo, V. P.; Long, J. D.; Long, R. E.; Lopes, L.; Lopez Mateos, D.; Lopez Paredes, B.; Lorenz, J.; Lorenzo Martinez, N.; Losada, M.; Loscutoff, P.; Losty, M. J.; Lou, X.; Lounis, A.; Love, J.; Love, P. A.; Lowe, A. J.; Lu, F.; Lubatti, H. J.; Luci, C.; Lucotte, A.; Ludwig, D.; Ludwig, I.; Luehring, F.; Lukas, W.; Luminari, L.; Lund, E.; Lundberg, J.; Lundberg, O.; Lund-Jensen, B.; Lungwitz, M.; Lynn, D.; Lysak, R.; Lytken, E.; Ma, H.; Ma, L. L.; Maccarrone, G.; Macchiolo, A.; Maček, B.; Machado Miguens, J.; Macina, D.; Mackeprang, R.; Madar, R.; Madaras, R. J.; Maddocks, H. J.; Mader, W. F.; Madsen, A.; Maeno, M.; Maeno, T.; Magnoni, L.; Magradze, E.; Mahboubi, K.; Mahlstedt, J.; Mahmoud, S.; Mahout, G.; Maiani, C.; Maidantchik, C.; Maio, A.; Majewski, S.; Makida, Y.; Makovec, N.; Mal, P.; Malaescu, B.; Malecki, Pa.; Maleev, V. P.; Malek, F.; Mallik, U.; Malon, D.; Malone, C.; Maltezos, S.; Malyshev, V. M.; Malyukov, S.; Mamuzic, J.; Mandelli, L.; Mandić, I.; Mandrysch, R.; Maneira, J.; Manfredini, A.; Manhaes de Andrade Filho, L.; Manjarres Ramos, J. A.; Mann, A.; Manning, P. M.; Manousakis-Katsikakis, A.; Mansoulie, B.; Mantifel, R.; Mapelli, L.; March, L.; Marchand, J. F.; Marchese, F.; Marchiori, G.; Marcisovsky, M.; Marino, C. P.; Marques, C. N.; Marroquim, F.; Marshall, Z.; Marti, L. F.; Marti-Garcia, S.; Martin, B.; Martin, B.; Martin, J. P.; Martin, T. A.; Martin, V. J.; Martin dit Latour, B.; Martinez, H.; Martinez, M.; Martin-Haugh, S.; Martyniuk, A. C.; Marx, M.; Marzano, F.; Marzin, A.; Masetti, L.; Mashimo, T.; Mashinistov, R.; Masik, J.; Maslennikov, A. 
L.; Massa, I.; Massol, N.; Mastrandrea, P.; Mastroberardino, A.; Masubuchi, T.; Matsunaga, H.; Matsushita, T.; Mättig, P.; Mättig, S.; Mattmann, J.; Mattravers, C.; Maurer, J.; Maxfield, S. J.; Maximov, D. A.; Mazini, R.; Mazzaferro, L.; Mazzanti, M.; Mc Goldrick, G.; Mc Kee, S. P.; McCarn, A.; McCarthy, R. L.; McCarthy, T. G.; McCubbin, N. A.; McFarlane, K. W.; Mcfayden, J. A.; Mchedlidze, G.; Mclaughlan, T.; McMahon, S. J.; McPherson, R. A.; Meade, A.; Mechnich, J.; Mechtel, M.; Medinnis, M.; Meehan, S.; Meera-Lebbai, R.; Mehlhase, S.; Mehta, A.; Meier, K.; Meineck, C.; Meirose, B.; Melachrinos, C.; Mellado Garcia, B. R.; Meloni, F.; Mendoza Navas, L.; Mengarelli, A.; Menke, S.; Meoni, E.; Mercurio, K. M.; Mergelmeyer, S.; Meric, N.; Mermod, P.; Merola, L.; Meroni, C.; Merritt, F. S.; Merritt, H.; Messina, A.; Metcalfe, J.; Mete, A. S.; Meyer, C.; Meyer, C.; Meyer, J.-P.; Meyer, J.; Meyer, J.; Michal, S.; Middleton, R. P.; Migas, S.; Mijović, L.; Mikenberg, G.; Mikestikova, M.; Mikuž, M.; Miller, D. W.; Mills, C.; Milov, A.; Milstead, D. A.; Milstein, D.; Minaenko, A. A.; Miñano Moya, M.; Minashvili, I. A.; Mincer, A. I.; Mindur, B.; Mineev, M.; Ming, Y.; Mir, L. M.; Mirabelli, G.; Mitani, T.; Mitrevski, J.; Mitsou, V. A.; Mitsui, S.; Miyagawa, P. S.; Mjörnmark, J. U.; Moa, T.; Moeller, V.; Mohapatra, S.; Mohr, W.; Molander, S.; Moles-Valls, R.; Molfetas, A.; Mönig, K.; Monini, C.; Monk, J.; Monnier, E.; Montejo Berlingen, J.; Monticelli, F.; Monzani, S.; Moore, R. W.; Mora Herrera, C.; Moraes, A.; Morange, N.; Morel, J.; Moreno, D.; Moreno Llácer, M.; Morettini, P.; Morgenstern, M.; Morii, M.; Moritz, S.; Morley, A. K.; Mornacchi, G.; Morris, J. D.; Morvaj, L.; Moser, H. G.; Mosidze, M.; Moss, J.; Mount, R.; Mountricha, E.; Mouraviev, S. V.; Moyse, E. J. W.; Mudd, R. D.; Mueller, F.; Mueller, J.; Mueller, K.; Mueller, T.; Mueller, T.; Muenstermann, D.; Munwes, Y.; Murillo Quijada, J. A.; Murray, W. J.; Mussche, I.; Musto, E.; Myagkov, A. 
G.; Myska, M.; Nackenhorst, O.; Nadal, J.; Nagai, K.; Nagai, R.; Nagai, Y.; Nagano, K.; Nagarkar, A.; Nagasaka, Y.; Nagel, M.; Nairz, A. M.; Nakahama, Y.; Nakamura, K.; Nakamura, T.; Nakano, I.; Namasivayam, H.; Nanava, G.; Napier, A.; Narayan, R.; Nash, M.; Nattermann, T.; Naumann, T.; Navarro, G.; Neal, H. A.; Nechaeva, P. Yu.; Neep, T. J.; Negri, A.; Negri, G.; Negrini, M.; Nektarijevic, S.; Nelson, A.; Nelson, T. K.; Nemecek, S.; Nemethy, P.; Nepomuceno, A. A.; Nessi, M.; Neubauer, M. S.; Neumann, M.; Neusiedl, A.; Neves, R. M.; Nevski, P.; Newcomer, F. M.; Newman, P. R.; Nguyen, D. H.; Nguyen Thi Hong, V.; Nickerson, R. B.; Nicolaidou, R.; Nicquevert, B.; Nielsen, J.; Nikiforou, N.; Nikiforov, A.; Nikolaenko, V.; Nikolic-Audit, I.; Nikolics, K.; Nikolopoulos, K.; Nilsson, P.; Ninomiya, Y.; Nisati, A.; Nisius, R.; Nobe, T.; Nodulman, L.; Nomachi, M.; Nomidis, I.; Norberg, S.; Nordberg, M.; Novakova, J.; Nozaki, M.; Nozka, L.; Ntekas, K.; Nuncio-Quiroz, A.-E.; Nunes Hanninger, G.; Nunnemann, T.; Nurse, E.; O'Brien, B. J.; O'Grady, F.; O'Neil, D. C.; O'Shea, V.; Oakes, L. B.; Oakham, F. G.; Oberlack, H.; Ocariz, J.; Ochi, A.; Ochoa, M. I.; Oda, S.; Odaka, S.; Ogren, H.; Oh, A.; Oh, S. H.; Ohm, C. C.; Ohshima, T.; Okamura, W.; Okawa, H.; Okumura, Y.; Okuyama, T.; Olariu, A.; Olchevski, A. G.; Olivares Pino, S. A.; Oliveira, M.; Oliveira Damazio, D.; Oliver Garcia, E.; Olivito, D.; Olszewski, A.; Olszowska, J.; Onofre, A.; Onyisi, P. U. E.; Oram, C. J.; Oreglia, M. J.; Oren, Y.; Orestano, D.; Orlando, N.; Oropeza Barrera, C.; Orr, R. S.; Osculati, B.; Ospanov, R.; Otero y Garzon, G.; Otono, H.; Ouchrif, M.; Ouellette, E. A.; Ould-Saada, F.; Ouraou, A.; Oussoren, K. P.; Ouyang, Q.; Ovcharova, A.; Owen, M.; Owen, S.; Ozcan, V. E.; Ozturk, N.; Pachal, K.; Pacheco Pages, A.; Padilla Aranda, C.; Pagan Griso, S.; Paganis, E.; Pahl, C.; Paige, F.; Pais, P.; Pajchel, K.; Palacino, G.; Palestini, S.; Pallin, D.; Palma, A.; Palmer, J. D.; Pan, Y. 
B.; Panagiotopoulou, E.; Panduro Vazquez, J. G.; Pani, P.; Panikashvili, N.; Panitkin, S.; Pantea, D.; Papadopoulou, Th. D.; Papageorgiou, K.; Paramonov, A.; Paredes Hernandez, D.; Parker, M. A.; Parodi, F.; Parsons, J. A.; Parzefall, U.; Pashapour, S.; Pasqualucci, E.; Passaggio, S.; Passeri, A.; Pastore, F.; Pastore, Fr.; Pásztor, G.; Pataraia, S.; Patel, N. D.; Pater, J. R.; Patricelli, S.; Pauly, T.; Pearce, J.; Pedersen, M.; Pedraza Lopez, S.; Pedro, R.; Peleganchuk, S. V.; Pelikan, D.; Peng, H.; Penning, B.; Penwell, J.; Perepelitsa, D. V.; Perez Cavalcanti, T.; Perez Codina, E.; Pérez García-Estañ, M. T.; Perez Reale, V.; Perini, L.; Pernegger, H.; Perrino, R.; Peschke, R.; Peshekhonov, V. D.; Peters, K.; Peters, R. F. Y.; Petersen, B. A.; Petersen, J.; Petersen, T. C.; Petit, E.; Petridis, A.; Petridou, C.; Petrolo, E.; Petrucci, F.; Petteni, M.; Pezoa, R.; Phillips, P. W.; Piacquadio, G.; Pianori, E.; Picazio, A.; Piccaro, E.; Piccinini, M.; Piec, S. M.; Piegaia, R.; Pignotti, D. T.; Pilcher, J. E.; Pilkington, A. D.; Pina, J.; Pinamonti, M.; Pinder, A.; Pinfold, J. L.; Pingel, A.; Pinto, B.; Pizio, C.; Pleier, M.-A.; Pleskot, V.; Plotnikova, E.; Plucinski, P.; Poddar, S.; Podlyski, F.; Poettgen, R.; Poggioli, L.; Pohl, D.; Pohl, M.; Polesello, G.; Policicchio, A.; Polifka, R.; Polini, A.; Pollard, C. S.; Polychronakos, V.; Pomeroy, D.; Pommès, K.; Pontecorvo, L.; Pope, B. G.; Popeneciu, G. A.; Popovic, D. S.; Poppleton, A.; Portell Bueso, X.; Pospelov, G. E.; Pospisil, S.; Potamianos, K.; Potrap, I. N.; Potter, C. J.; Potter, C. T.; Poulard, G.; Poveda, J.; Pozdnyakov, V.; Prabhu, R.; Pralavorio, P.; Pranko, A.; Prasad, S.; Pravahan, R.; Prell, S.; Price, D.; Price, J.; Price, L. 
E.; Prieur, D.; Primavera, M.; Proissl, M.; Prokofiev, K.; Prokoshin, F.; Protopapadaki, E.; Protopopescu, S.; Proudfoot, J.; Prudent, X.; Przybycien, M.; Przysiezniak, H.; Psoroulas, S.; Ptacek, E.; Pueschel, E.; Puldon, D.; Purohit, M.; Puzo, P.; Pylypchenko, Y.; Qian, J.; Quadt, A.; Quarrie, D. R.; Quayle, W. B.; Quilty, D.; Radeka, V.; Radescu, V.; Radhakrishnan, S. K.; Radloff, P.; Ragusa, F.; Rahal, G.; Rajagopalan, S.; Rammensee, M.; Rammes, M.; Randle-Conde, A. S.; Rangel-Smith, C.; Rao, K.; Rauscher, F.; Rave, T. C.; Ravenscroft, T.; Raymond, M.; Read, A. L.; Rebuzzi, D. M.; Redelbach, A.; Redlinger, G.; Reece, R.; Reeves, K.; Reinsch, A.; Reisin, H.; Reisinger, I.; Relich, M.; Rembser, C.; Ren, Z. L.; Renaud, A.; Rescigno, M.; Resconi, S.; Resende, B.; Reznicek, P.; Rezvani, R.; Richter, R.; Ridel, M.; Rieck, P.; Rijssenbeek, M.; Rimoldi, A.; Rinaldi, L.; Ritsch, E.; Riu, I.; Rivoltella, G.; Rizatdinova, F.; Rizvi, E.; Robertson, S. H.; Robichaud-Veronneau, A.; Robinson, D.; Robinson, J. E. M.; Robson, A.; Rocha de Lima, J. G.; Roda, C.; Roda Dos Santos, D.; Rodrigues, L.; Roe, S.; Røhne, O.; Rolli, S.; Romaniouk, A.; Romano, M.; Romeo, G.; Romero Adam, E.; Rompotis, N.; Roos, L.; Ros, E.; Rosati, S.; Rosbach, K.; Rose, A.; Rose, M.; Rosendahl, P. L.; Rosenthal, O.; Rossetti, V.; Rossi, E.; Rossi, L. P.; Rosten, R.; Rotaru, M.; Roth, I.; Rothberg, J.; Rousseau, D.; Royon, C. R.; Rozanov, A.; Rozen, Y.; Ruan, X.; Rubbo, F.; Rubinskiy, I.; Rud, V. I.; Rudolph, C.; Rudolph, M. S.; Rühr, F.; Ruiz-Martinez, A.; Rumyantsev, L.; Rurikova, Z.; Rusakovich, N. A.; Ruschke, A.; Rutherfoord, J. P.; Ruthmann, N.; Ruzicka, P.; Ryabov, Y. F.; Rybar, M.; Rybkin, G.; Ryder, N. C.; Saavedra, A. F.; Sacerdoti, S.; Saddique, A.; Sadeh, I.; Sadrozinski, H. F.-W.; Sadykov, R.; Safai Tehrani, F.; Sakamoto, H.; Sakurai, Y.; Salamanna, G.; Salamon, A.; Saleem, M.; Salek, D.; Sales De Bruin, P. H.; Salihagic, D.; Salnikov, A.; Salt, J.; Salvachua Ferrando, B. 
M.; Salvatore, D.; Salvatore, F.; Salvucci, A.; Salzburger, A.; Sampsonidis, D.; Sanchez, A.; Sánchez, J.; Sanchez Martinez, V.; Sandaker, H.; Sander, H. G.; Sanders, M. P.; Sandhoff, M.; Sandoval, T.; Sandoval, C.; Sandstroem, R.; Sankey, D. P. C.; Sansoni, A.; Santoni, C.; Santonico, R.; Santos, H.; Santoyo Castillo, I.; Sapp, K.; Sapronov, A.; Saraiva, J. G.; Sarkisyan-Grinbaum, E.; Sarrazin, B.; Sartisohn, G.; Sasaki, O.; Sasaki, Y.; Sasao, N.; Satsounkevitch, I.; Sauvage, G.; Sauvan, E.; Sauvan, J. B.; Savard, P.; Savinov, V.; Savu, D. O.; Sawyer, C.; Sawyer, L.; Saxon, D. H.; Saxon, J.; Sbarra, C.; Sbrizzi, A.; Scanlon, T.; Scannicchio, D. A.; Scarcella, M.; Schaarschmidt, J.; Schacht, P.; Schaefer, D.; Schaelicke, A.; Schaepe, S.; Schaetzel, S.; Schäfer, U.; Schaffer, A. C.; Schaile, D.; Schamberger, R. D.; Scharf, V.; Schegelsky, V. A.; Scheirich, D.; Schernau, M.; Scherzer, M. I.; Schiavi, C.; Schieck, J.; Schillo, C.; Schioppa, M.; Schlenker, S.; Schmidt, E.; Schmieden, K.; Schmitt, C.; Schmitt, C.; Schmitt, S.; Schneider, B.; Schnellbach, Y. J.; Schnoor, U.; Schoeffel, L.; Schoening, A.; Schoenrock, B. D.; Schorlemmer, A. L. S.; Schott, M.; Schouten, D.; Schovancova, J.; Schram, M.; Schramm, S.; Schreyer, M.; Schroeder, C.; Schroer, N.; Schuh, N.; Schultens, M. J.; Schultz-Coulon, H.-C.; Schulz, H.; Schumacher, M.; Schumm, B. A.; Schune, Ph.; Schwartzman, A.; Schwegler, Ph.; Schwemling, Ph.; Schwienhorst, R.; Schwindling, J.; Schwindt, T.; Schwoerer, M.; Sciacca, F. G.; Scifo, E.; Sciolla, G.; Scott, W. G.; Scutti, F.; Searcy, J.; Sedov, G.; Sedykh, E.; Seidel, S. C.; Seiden, A.; Seifert, F.; Seixas, J. M.; Sekhniaidze, G.; Sekula, S. J.; Selbach, K. E.; Seliverstov, D. M.; Sellers, G.; Seman, M.; Semprini-Cesari, N.; Serfon, C.; Serin, L.; Serkin, L.; Serre, T.; Seuster, R.; Severini, H.; Sforza, F.; Sfyrla, A.; Shabalina, E.; Shamim, M.; Shan, L. Y.; Shank, J. T.; Shao, Q. T.; Shapiro, M.; Shatalov, P. 
B.; Shaw, K.; Sherwood, P.; Shimizu, S.; Shimojima, M.; Shin, T.; Shiyakova, M.; Shmeleva, A.; Shochet, M. J.; Short, D.; Shrestha, S.; Shulga, E.; Shupe, M. A.; Shushkevich, S.; Sicho, P.; Sidorov, D.; Sidoti, A.; Siegert, F.; Sijacki, Dj.; Silbert, O.; Silva, J.; Silver, Y.; Silverstein, D.; Silverstein, S. B.; Simak, V.; Simard, O.; Simic, Lj.; Simion, S.; Simioni, E.; Simmons, B.; Simoniello, R.; Simonyan, M.; Sinervo, P.; Sinev, N. B.; Sipica, V.; Siragusa, G.; Sircar, A.; Sisakyan, A. N.; Sivoklokov, S. Yu.; Sjölin, J.; Sjursen, T. B.; Skinnari, L. A.; Skottowe, H. P.; Skovpen, K. Yu.; Skubic, P.; Slater, M.; Slavicek, T.; Sliwa, K.; Smakhtin, V.; Smart, B. H.; Smestad, L.; Smirnov, S. Yu.; Smirnov, Y.; Smirnova, L. N.; Smirnova, O.; Smith, K. M.; Smizanska, M.; Smolek, K.; Snesarev, A. A.; Snidero, G.; Snow, J.; Snyder, S.; Sobie, R.; Socher, F.; Sodomka, J.; Soffer, A.; Soh, D. A.; Solans, C. A.; Solar, M.; Solc, J.; Soldatov, E. Yu.; Soldevila, U.; Solfaroli Camillocci, E.; Solodkov, A. A.; Solovyanov, O. V.; Solovyev, V.; Soni, N.; Sood, A.; Sopko, V.; Sopko, B.; Sosebee, M.; Soualah, R.; Soueid, P.; Soukharev, A. M.; South, D.; Spagnolo, S.; Spanò, F.; Spearman, W. R.; Spighi, R.; Spigo, G.; Spousta, M.; Spreitzer, T.; Spurlock, B.; St. Denis, R. D.; Stahlman, J.; Stamen, R.; Stanecka, E.; Stanek, R. W.; Stanescu, C.; Stanescu-Bellu, M.; Stanitzki, M. M.; Stapnes, S.; Starchenko, E. A.; Stark, J.; Staroba, P.; Starovoitov, P.; Staszewski, R.; Stavina, P.; Steele, G.; Steinbach, P.; Steinberg, P.; Stekl, I.; Stelzer, B.; Stelzer, H. J.; Stelzer-Chilton, O.; Stenzel, H.; Stern, S.; Stewart, G. A.; Stillings, J. A.; Stockton, M. C.; Stoebe, M.; Stoerig, K.; Stoicea, G.; Stonjek, S.; Stradling, A. R.; Straessner, A.; Strandberg, J.; Strandberg, S.; Strandlie, A.; Strauss, E.; Strauss, M.; Strizenec, P.; Ströhmer, R.; Strom, D. M.; Stroynowski, R.; Stucci, S. A.; Stugu, B.; Stumer, I.; Stupak, J.; Sturm, P.; Styles, N. 
A.; Su, D.; Su, J.; Subramania, HS.; Subramaniam, R.; Succurro, A.; Sugaya, Y.; Suhr, C.; Suk, M.; Sulin, V. V.; Sultansoy, S.; Sumida, T.; Sun, X.; Sundermann, J. E.; Suruliz, K.; Susinno, G.; Sutton, M. R.; Suzuki, Y.; Svatos, M.; Swedish, S.; Swiatlowski, M.; Sykora, I.; Sykora, T.; Ta, D.; Tackmann, K.; Taenzer, J.; Taffard, A.; Tafirout, R.; Taiblum, N.; Takahashi, Y.; Takai, H.; Takashima, R.; Takeda, H.; Takeshita, T.; Takubo, Y.; Talby, M.; Talyshev, A. A.; Tam, J. Y. C.; Tamsett, M. C.; Tan, K. G.; Tanaka, J.; Tanaka, R.; Tanaka, S.; Tanaka, S.; Tanasijczuk, A. J.; Tani, K.; Tannoury, N.; Tapprogge, S.; Tarem, S.; Tarrade, F.; Tartarelli, G. F.; Tas, P.; Tasevsky, M.; Tashiro, T.; Tassi, E.; Tavares Delgado, A.; Tayalati, Y.; Taylor, C.; Taylor, F. E.; Taylor, G. N.; Taylor, W.; Teischinger, F. A.; Teixeira Dias Castanheira, M.; Teixeira-Dias, P.; Temming, K. K.; Ten Kate, H.; Teng, P. K.; Terada, S.; Terashi, K.; Terron, J.; Terzo, S.; Testa, M.; Teuscher, R. J.; Therhaag, J.; Theveneaux-Pelzer, T.; Thoma, S.; Thomas, J. P.; Thompson, E. N.; Thompson, P. D.; Thompson, P. D.; Thompson, A. S.; Thomsen, L. A.; Thomson, E.; Thomson, M.; Thong, W. M.; Thun, R. P.; Tian, F.; Tibbetts, M. J.; Tic, T.; Tikhomirov, V. O.; Tikhonov, Yu. A.; Timoshenko, S.; Tiouchichine, E.; Tipton, P.; Tisserant, S.; Todorov, T.; Todorova-Nova, S.; Toggerson, B.; Tojo, J.; Tokár, S.; Tokushuku, K.; Tollefson, K.; Tomlinson, L.; Tomoto, M.; Tompkins, L.; Toms, K.; Topilin, N. D.; Torrence, E.; Torres, H.; Torró Pastor, E.; Toth, J.; Touchard, F.; Tovey, D. R.; Tran, H. L.; Trefzger, T.; Tremblet, L.; Tricoli, A.; Trigger, I. M.; Trincaz-Duvoid, S.; Tripiana, M. F.; Triplett, N.; Trischuk, W.; Trocmé, B.; Troncon, C.; Trottier-McDonald, M.; Trovatelli, M.; True, P.; Trzebinski, M.; Trzupek, A.; Tsarouchas, C.; Tseng, J. C.-L.; Tsiareshka, P. V.; Tsionou, D.; Tsipolitis, G.; Tsirintanis, N.; Tsiskaridze, S.; Tsiskaridze, V.; Tskhadadze, E. G.; Tsukerman, I. 
I.; Tsulaia, V.; Tsung, J.-W.; Tsuno, S.; Tsybychev, D.; Tua, A.; Tudorache, A.; Tudorache, V.; Tuggle, J. M.; Tuna, A. N.; Tupputi, S. A.; Turchikhin, S.; Turecek, D.; Turk Cakir, I.; Turra, R.; Tuts, P. M.; Tykhonov, A.; Tylmad, M.; Tyndel, M.; Uchida, K.; Ueda, I.; Ueno, R.; Ughetto, M.; Ugland, M.; Uhlenbrock, M.; Ukegawa, F.; Unal, G.; Undrus, A.; Unel, G.; Ungaro, F. C.; Unno, Y.; Urbaniec, D.; Urquijo, P.; Usai, G.; Usanova, A.; Vacavant, L.; Vacek, V.; Vachon, B.; Valencic, N.; Valentinetti, S.; Valero, A.; Valery, L.; Valkar, S.; Valladolid Gallego, E.; Vallecorsa, S.; Valls Ferrer, J. A.; Van Berg, R.; Van Der Deijl, P. C.; van der Geer, R.; van der Graaf, H.; Van Der Leeuw, R.; van der Ster, D.; van Eldik, N.; van Gemmeren, P.; Van Nieuwkoop, J.; van Vulpen, I.; van Woerden, M. C.; Vanadia, M.; Vandelli, W.; Vaniachine, A.; Vankov, P.; Vannucci, F.; Vardanyan, G.; Vari, R.; Varnes, E. W.; Varol, T.; Varouchas, D.; Vartapetian, A.; Varvell, K. E.; Vassilakopoulos, V. I.; Vazeille, F.; Vazquez Schroeder, T.; Veatch, J.; Veloso, F.; Veneziano, S.; Ventura, A.; Ventura, D.; Venturi, M.; Venturi, N.; Venturini, A.; Vercesi, V.; Verducci, M.; Verkerke, W.; Vermeulen, J. C.; Vest, A.; Vetterli, M. C.; Viazlo, O.; Vichou, I.; Vickey, T.; Vickey Boeriu, O. E.; Viehhauser, G. H. A.; Viel, S.; Vigne, R.; Villa, M.; Villaplana Perez, M.; Vilucchi, E.; Vincter, M. G.; Vinogradov, V. B.; Virzi, J.; Vitells, O.; Viti, M.; Vivarelli, I.; Vives Vaque, F.; Vlachos, S.; Vladoiu, D.; Vlasak, M.; Vogel, A.; Vokac, P.; Volpi, G.; Volpi, M.; Volpini, G.; von der Schmitt, H.; von Radziewski, H.; von Toerne, E.; Vorobel, V.; Vos, M.; Voss, R.; Vossebeld, J. 
H.; Vranjes, N.; Vranjes Milosavljevic, M.; Vrba, V.; Vreeswijk, M.; Vu Anh, T.; Vuillermet, R.; Vukotic, I.; Vykydal, Z.; Wagner, W.; Wagner, P.; Wahrmund, S.; Wakabayashi, J.; Walch, S.; Walder, J.; Walker, R.; Walkowiak, W.; Wall, R.; Waller, P.; Walsh, B.; Wang, C.; Wang, H.; Wang, H.; Wang, J.; Wang, J.; Wang, K.; Wang, R.; Wang, S. M.; Wang, T.; Wang, X.; Warburton, A.; Ward, C. P.; Wardrope, D. R.; Warsinsky, M.; Washbrook, A.; Wasicki, C.; Watanabe, I.; Watkins, P. M.; Watson, A. T.; Watson, I. J.; Watson, M. F.; Watts, G.; Watts, S.; Waugh, A. T.; Waugh, B. M.; Webb, S.; Weber, M. S.; Weber, S. W.; Webster, J. S.; Weidberg, A. R.; Weigell, P.; Weingarten, J.; Weiser, C.; Weits, H.; Wells, P. S.; Wenaus, T.; Wendland, D.; Weng, Z.; Wengler, T.; Wenig, S.; Wermes, N.; Werner, M.; Werner, P.; Wessels, M.; Wetter, J.; Whalen, K.; White, A.; White, M. J.; White, R.; White, S.; Whiteson, D.; Whittington, D.; Wicke, D.; Wickens, F. J.; Wiedenmann, W.; Wielers, M.; Wienemann, P.; Wiglesworth, C.; Wiik-Fuchs, L. A. M.; Wijeratne, P. A.; Wildauer, A.; Wildt, M. A.; Wilhelm, I.; Wilkens, H. G.; Will, J. Z.; Williams, H. H.; Williams, S.; Willis, W.; Willocq, S.; Wilson, J. A.; Wilson, A.; Wingerter-Seez, I.; Winkelmann, S.; Winklmeier, F.; Wittgen, M.; Wittig, T.; Wittkowski, J.; Wollstadt, S. J.; Wolter, M. W.; Wolters, H.; Wong, W. C.; Wosiek, B. K.; Wotschack, J.; Woudstra, M. J.; Wozniak, K. W.; Wraight, K.; Wright, M.; Wu, S. L.; Wu, X.; Wu, Y.; Wulf, E.; Wyatt, T. R.; Wynne, B. M.; Xella, S.; Xiao, M.; Xu, C.; Xu, D.; Xu, L.; Yabsley, B.; Yacoob, S.; Yamada, M.; Yamaguchi, H.; Yamaguchi, Y.; Yamamoto, A.; Yamamoto, K.; Yamamoto, S.; Yamamura, T.; Yamanaka, T.; Yamauchi, K.; Yamazaki, Y.; Yan, Z.; Yang, H.; Yang, H.; Yang, U. K.; Yang, Y.; Yanush, S.; Yao, L.; Yasu, Y.; Yatsenko, E.; Yau Wong, K. H.; Ye, J.; Ye, S.; Yen, A. L.; Yildirim, E.; Yilmaz, M.; Yoosoofmiya, R.; Yorita, K.; Yoshida, R.; Yoshihara, K.; Young, C.; Young, C. J. S.; Youssef, S.; Yu, D. 
R.; Yu, J.; Yu, J.; Yuan, L.; Yurkewicz, A.; Zabinski, B.; Zaidan, R.; Zaitsev, A. M.; Zaman, A.; Zambito, S.; Zanello, L.; Zanzi, D.; Zaytsev, A.; Zeitnitz, C.; Zeman, M.; Zemla, A.; Zengel, K.; Zenin, O.; Ženiš, T.; Zerwas, D.; Zevi della Porta, G.; Zhang, D.; Zhang, H.; Zhang, J.; Zhang, L.; Zhang, X.; Zhang, Z.; Zhao, Z.; Zhemchugov, A.; Zhong, J.; Zhou, B.; Zhou, L.; Zhou, N.; Zhu, C. G.; Zhu, H.; Zhu, J.; Zhu, Y.; Zhuang, X.; Zibell, A.; Zieminska, D.; Zimine, N. I.; Zimmermann, C.; Zimmermann, R.; Zimmermann, S.; Zimmermann, S.; Zinonos, Z.; Ziolkowski, M.; Zitoun, R.; Zobernig, G.; Zoccoli, A.; zur Nedden, M.; Zurzolo, G.; Zutshi, V.; Zwalinski, L.

    2015-01-01

The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of TeV corresponding to an integrated luminosity of . Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-kt algorithm with distance parameters or , and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a boson, for and pseudorapidities . The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region () for jets with . For central jets at lower , the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for TeV. The calibration of forward jets is derived from dijet balance measurements. The resulting uncertainty reaches its largest value of 6 % for low- jets at . Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5-3 %.

  13. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    NASA Astrophysics Data System (ADS)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially for their large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and their uncertainties, in the range of tens of µm for metre-long assemblies, have to be accurately stated and traceable to international standards. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty, applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model with available data from ISO-standardised tests (the 10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual task-specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and the important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m long respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensation, such as thermal drift due to variation in machine operation heat-load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.
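As a rough sketch of this kind of Monte Carlo uncertainty budgeting (the error sources, distributions, and magnitudes below are invented for illustration, not the PACMAN values), one can draw each geometric error from the PDF implied by its calibration data and propagate it through the measurement model:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical error sources for one alignment measurement; each is drawn
# from the PDF implied by its ISO 10360-style calibration data.
scale_err   = rng.normal(0.0, 4.0, N)    # µm, measuring-machine scale error
probe_err   = rng.normal(0.0, 3.0, N)    # µm, probing error
thermal_err = rng.uniform(-5.0, 5.0, N)  # µm, uncorrected thermal drift

# Toy additive measurement model; the real method propagates these
# through a simulation-by-constraints calibration of the instrument.
total = scale_err + probe_err + thermal_err
u68 = np.std(total)  # standard uncertainty ~ 68% confidence half-width

print(f"task-specific alignment uncertainty: {u68:.1f} µm (68% CL)")
print("within 12 µm budget:", u68 < 12.0)
```

The same draws can be reused to rank sensitivity factors by each source's share of the output variance.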

  14. Modelling of the X,Y,Z positioning errors and uncertainty evaluation for the LNE’s mAFM using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ceria, Paul; Ducourtieux, Sebastien; Boukellal, Younes; Allard, Alexandre; Fischer, Nicolas; Feltin, Nicolas

    2017-03-01

    In order to evaluate the uncertainty budget of the LNE's mAFM, a reference instrument dedicated to the calibration of nanoscale dimensional standards, a numerical model has been developed to evaluate the measurement uncertainty of the metrology loop involved in the XYZ positioning of the tip relative to the sample. The objective of this model is to overcome difficulties experienced when trying to evaluate uncertainty components that cannot be determined experimentally, most notably the one linked to the geometry of the metrology loop. The model is based on object-oriented programming and developed under Matlab. It integrates one hundred parameters that allow control of the geometry of the metrology loop without using analytical formulae. The created objects, mainly the reference and mobile prisms and their mirrors, and the interferometers and their laser beams, can be moved and deformed freely to take several error sources into account. The Monte Carlo method is then used to determine the positioning uncertainty of the instrument by randomly drawing the parameters according to their associated tolerances and probability density functions (PDFs). The whole process follows Supplement 2 to the 'Guide to the Expression of Uncertainty in Measurement' (GUM). Advanced statistical tools, such as Morris screening and Sobol indices, are also used to provide a sensitivity analysis, identifying the most influential parameters and quantifying their contributions to the XYZ positioning uncertainty. The approach validated in the paper shows that the actual positioning uncertainty is about 6 nm. As the final objective is to reach 1 nm, we discuss the most effective ways to reduce this uncertainty.
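A minimal sketch of GUM Supplement 2-style Monte Carlo propagation, assuming an invented three-term metrology-loop model (the parameter names, PDFs, and magnitudes are illustrative, not those of the LNE mAFM):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000

# Illustrative inputs of the metrology-loop model, each drawn from its
# assigned PDF and tolerance, per GUM Supplement 2.
mirror_flatness = rng.normal(0.0, 2.0e-9, N)      # m
interf_noise    = rng.normal(0.0, 1.5e-9, N)      # m
abbe_offset     = rng.uniform(-50e-6, 50e-6, N)   # m
stage_tilt      = rng.normal(0.0, 1.0e-7, N)      # rad

# Toy measurement model: errors add, with Abbe error = offset * tilt.
x_err = mirror_flatness + interf_noise + abbe_offset * stage_tilt

u = np.std(x_err)
print(f"positioning uncertainty u(x) = {u * 1e9:.2f} nm")

# Crude first-order sensitivity for independent inputs: each term's share
# of the output variance (full Sobol indices would handle interactions).
for name, contrib in [("mirror", mirror_flatness),
                      ("interferometer", interf_noise),
                      ("Abbe", abbe_offset * stage_tilt)]:
    print(f"  {name}: {np.var(contrib) / np.var(x_err):.1%} of variance")
```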

  15. Predictions of space radiation fatality risk for exploration missions

    NASA Astrophysics Data System (ADS)

    Cucinotta, Francis A.; To, Khiet; Cacao, Eliedonna

    2017-05-01

    In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model, focusing on updates to the probability distribution functions (PDFs) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy-ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions; here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared with other tumor types, and because of distinct liver cancer risk factors for astronauts compared with the U.S. population. The revised model is used to predict fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940-day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET, due to statistical uncertainties and differences across tissue types and mouse strains, are the dominant uncertainty. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are drawn: 1) reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and to understand tissue, sex, and genetic variations; 2) alternative model assumptions, such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases, could significantly increase risk estimates to several times the NASA limits.

  16. A parallel calibration utility for WRF-Hydro on high performance computers

    NASA Astrophysics Data System (ADS)

    Wang, J.; Wang, C.; Kotamarthi, V. R.

    2017-12-01

    Successful modeling of complex hydrological processes comprises establishing an integrated hydrological model that simulates the hydrological processes in each water regime, calibrating and validating model performance against observation data, and estimating the uncertainties from different sources, especially those associated with parameters. Such a model system requires large computing resources and often has to be run on High Performance Computers (HPCs). The recently developed WRF-Hydro modeling system provides a significant advance in the capability to simulate regional water cycles more completely. WRF-Hydro has a large number of parameters, such as those in the input table files (GENPARM.TBL, SOILPARM.TBL, and CHANPARM.TBL) and several distributed scaling factors such as OVROUGHRTFAC. These parameters affect the behavior and outputs of the model and thus may need to be calibrated against observations to obtain good model performance. A calibration tool built specifically for automated calibration and uncertainty estimation of WRF-Hydro can provide significant convenience for the modeling community. In this study, we developed a customized tool using the parallel version of PEST, the model-independent parameter estimation and uncertainty analysis tool, enabling it to run on HPCs under the PBS and SLURM workload managers and job schedulers. We also developed a series of PEST input file templates specifically for WRF-Hydro model calibration and uncertainty analysis. Here we present a case study of a flood that occurred in April 2013 over the Midwest. The sensitivities and uncertainties are analyzed using the customized PEST tool we developed.
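The calibration loop that PEST automates can be sketched as a model-independent search that only exchanges parameter values and an objective function value with the model. The toy runoff model and parameter ranges below are purely illustrative stand-ins for WRF-Hydro runs:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "hydrologic model": runoff = scale * rain ** exponent, a stand-in
# for a full WRF-Hydro run driven by a parameter set.
rain = np.linspace(1.0, 10.0, 30)
true_params = (0.8, 1.3)
obs = true_params[0] * rain ** true_params[1] + rng.normal(0, 0.2, rain.size)

def model(scale, expo):
    return scale * rain ** expo

# Model-independent calibration in the PEST spirit: the tool only sees
# parameters in, simulated series out, and a misfit measure.
candidates = [(s, e) for s in np.linspace(0.5, 1.2, 40)
                     for e in np.linspace(1.0, 1.6, 40)]
rmse = lambda p: np.sqrt(np.mean((model(*p) - obs) ** 2))
best = min(candidates, key=rmse)
print(f"calibrated scale={best[0]:.3f}, exponent={best[1]:.3f}, "
      f"RMSE={rmse(best):.3f}")
```

Each candidate run is independent of the others, which is what makes the parallel HPC version of such a tool effective.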

  17. A facility location model for municipal solid waste management system under uncertain environment.

    PubMed

    Yadav, Vinay; Bhurjee, A K; Karmakar, Subhankar; Dikshit, A K

    2017-12-15

    In a municipal solid waste management system, decision makers have to develop insight into the processes of waste generation, collection, transportation, processing, and disposal. Many parameters in this system (e.g., waste generation rate, operating costs of facilities, transportation cost, and revenues) are associated with uncertainties. Often, these parameter uncertainties must be modeled under data scarcity, where only information on extreme variations is available rather than the probability distribution functions or membership functions required by stochastic or fuzzy mathematical programming, respectively. Moreover, if uncertainties are ignored, problems such as insufficient capacities of waste management facilities or improper utilization of available funds may arise. To tackle these parameter uncertainties more efficiently, an algorithm based on interval analysis has been developed. The algorithm is applied to find optimal solutions for a facility location model, which is formulated to select the economically best locations of transfer stations in a hypothetical urban center. Transfer stations are an integral part of contemporary municipal solid waste management systems, and economic siting of transfer stations ensures the financial sustainability of the system. The model is written in the mathematical programming language AMPL with KNITRO as the solver. The developed model selects the five economically best locations out of ten potential locations, with an optimum overall cost of approximately [394,836, 757,440] Rs./day ([5906, 11,331] USD/day). Further, the need for uncertainty modeling is demonstrated by the results of a sensitivity analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
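Interval analysis of this kind can be sketched with a minimal interval type whose arithmetic keeps lower and upper bounds together, so an uncertain cost comes out as an interval rather than a point estimate. The cost figures below are invented and are not the paper's model:

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, o):
        # Interval addition: bounds add component-wise.
        return Interval(self.lo + o.lo, self.hi + o.hi)
    def __mul__(self, k):
        # Scaling by a nonnegative scalar (e.g. tonnes/day handled).
        return Interval(self.lo * k, self.hi * k)

# Illustrative daily cost terms for one candidate transfer station; the
# bounds encode parameters known only through their extreme variations.
transport = Interval(120.0, 180.0) * 50.0  # Rs./tonne times 50 tonnes/day
operation = Interval(3000.0, 5000.0)       # Rs./day functioning cost
revenue   = Interval(-1500.0, -800.0)      # recyclables sold (negative cost)

total = transport + operation + revenue
print(f"overall cost: [{total.lo:.0f}, {total.hi:.0f}] Rs./day")
```

In the full model such interval costs feed a mathematical program that picks the station sites whose worst- and best-case totals are jointly most economical.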

  18. SCALE Code System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  19. SCALE Code System 6.2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely-used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor and lattice physics, radiation shielding, spent fuel and radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including three deterministic and three Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE’s graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results.

  20. A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS

    EPA Science Inventory

    The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...

  1. Design for cyclic loading endurance of composites

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Murthy, Pappu L. N.; Chamis, Christos C.; Liaw, Leslie D. G.

    1993-01-01

    The application of the computer code IPACS (Integrated Probabilistic Assessment of Composite Structures) to aircraft wing-type structures is described. The code performs a complete probabilistic analysis for composites, taking into account uncertainties in geometry, boundary conditions, material properties, laminate lay-ups, and loads. Results of the analysis are presented in terms of the cumulative distribution function (CDF) and probability density function (PDF) of the fatigue life of a wing-type composite structure under different hygrothermal environments and subjected to random pressure. The sensitivity of the fatigue life to a number of critical structural and material variables is also computed from the analysis.
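A hedged sketch of how such a probabilistic fatigue-life distribution is produced; the Basquin-type life model, distributions, and values below are assumptions for illustration, not IPACS internals:

```python
import numpy as np

rng = np.random.default_rng(7)
N = 50_000

# Illustrative probabilistic inputs for a composite fatigue-life model.
strength = rng.normal(1.0, 0.08, N)             # normalized static strength
load     = rng.lognormal(np.log(0.5), 0.1, N)   # random pressure amplitude
b = 0.1                                          # assumed S-N curve slope

# Basquin-type life law: N_f = (strength / load) ** (1 / b) cycles.
life = (strength / load) ** (1.0 / b)

# Empirical distribution of fatigue life, reported as percentiles of the
# CDF; a histogram of log_life would give the corresponding PDF.
log_life = np.log10(life)
p10, p50 = np.percentile(log_life, [10, 50])
print(f"median life: 10^{p50:.2f} cycles; 10th percentile: 10^{p10:.2f}")
```

Sensitivities then follow from how much each input's variance contributes to the spread of `log_life`.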

  2. Systems and methods for analyzing building operations sensor data

    DOEpatents

    Mezic, Igor; Eisenhower, Bryan A.

    2015-05-26

    Systems and methods are disclosed for analyzing building sensor information and decomposing the information therein into a more manageable and more useful form. Certain embodiments integrate energy-based and spectral-based analysis methods with parameter sampling and uncertainty/sensitivity analysis to achieve a more comprehensive perspective of building behavior. The results of this analysis may be presented to a user via a plurality of visualizations and/or used to automatically adjust certain building operations. In certain embodiments, advanced spectral techniques, including Koopman-based operations, are employed to discern features from the collected building sensor data.

  3. Probabilistic Aeroelastic Analysis Developed for Turbomachinery Components

    NASA Technical Reports Server (NTRS)

    Reddy, T. S. R.; Mital, Subodh K.; Stefko, George L.; Pai, Shantaram S.

    2003-01-01

    Aeroelastic analyses for advanced turbomachines are being developed for use at the NASA Glenn Research Center and in industry. At present, however, these analyses are used for turbomachinery design with uncertainties accounted for by safety factors. This approach may lead to overly conservative designs, reducing the potential for higher-efficiency engines. Integrating deterministic aeroelastic analysis methods with probabilistic analysis methods offers the potential to design efficient engines with fewer aeroelastic problems and to make a quantum leap toward designing safe, reliable engines. In this research, probabilistic analysis is integrated with aeroelastic analysis: (1) to determine the parameters that most affect the aeroelastic characteristics (forced response and stability) of a turbomachine component such as a fan, compressor, or turbine, and (2) to give the acceptable standard deviation on the design parameters for an aeroelastically stable system. The approach taken is to combine the aeroelastic analysis of the MISER (MIStuned Engine Response) code with the FPI (fast probability integration) code. The role of MISER is to provide the functional relationships that tie the structural and aerodynamic parameters (the primitive variables) to the forced response amplitudes and stability eigenvalues (the response properties). The role of FPI is to perform probabilistic analyses using the response properties generated by MISER. The results are probability density functions for the response properties; the probabilistic sensitivities of the response variables to uncertainty in the primitive variables are obtained as a byproduct of the FPI technique. The combined aeroelastic and probabilistic analysis is applied to a 12-bladed cascade vibrating in bending and torsion in subsonic flow at Mach 0.7. Of the 11 design parameters, 6 are considered to have probabilistic variation: space-to-chord ratio (SBYC), stagger angle (GAMA), elastic axis (ELAXS), Mach number (MACH), mass ratio (MASSR), and frequency ratio (WHWB). The results of the probabilistic aeroelastic analysis are the probability density functions of the predicted aerodynamic damping and flutter frequency, and of the forced-response amplitudes.
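The probabilistic step can be sketched as plain Monte Carlo over a few of the probabilistic design parameters. The surrogate damping function and all distributions below are invented for illustration (the real functional relationships come from the aeroelastic code, and FPI is a faster alternative to brute-force sampling); only three of the six parameters are used for brevity:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

# Three of the probabilistic cascade parameters, with invented PDFs.
mach  = rng.normal(0.70, 0.02, N)   # MACH
massr = rng.normal(60.0, 3.0, N)    # MASSR
whwb  = rng.normal(0.50, 0.03, N)   # WHWB, bending/torsion frequency ratio

# Toy linear surrogate for aerodynamic damping (NOT the real response):
# damping shrinks as Mach rises and as the frequency ratio approaches 1.
damping = (0.05
           - 0.12 * (mach - 0.70)
           - 0.04 * (whwb - 0.50)
           + 0.0005 * (massr - 60.0))

p_flutter = np.mean(damping < 0.0)  # probability of negative damping
print(f"mean damping {damping.mean():.4f}, P(flutter) = {p_flutter:.4f}")
```

Replacing each input's standard deviation in turn and rerunning gives the probabilistic sensitivities that, in the actual method, FPI produces as a byproduct.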

  4. iSCHRUNK--In Silico Approach to Characterization and Reduction of Uncertainty in the Kinetic Models of Genome-scale Metabolic Networks.

    PubMed

    Andreozzi, Stefano; Miskovic, Ljubisa; Hatzimanikatis, Vassily

    2016-01-01

    Accurate determination of the physiological states of cellular metabolism requires detailed information about metabolic fluxes, metabolite concentrations, and the distribution of enzyme states. Integration of fluxomics and metabolomics data, together with thermodynamics-based metabolic flux analysis, contributes to an improved understanding of the steady-state properties of metabolism. However, knowledge about kinetics and enzyme activities, though essential for a quantitative understanding of metabolic dynamics, remains scarce and uncertain. Here, we present a computational methodology that allows us to determine and quantify the kinetic parameters that correspond to a certain physiology, as described by a given metabolic flux profile and a given metabolite concentration vector. Though the initially determined kinetic parameters involve a high degree of uncertainty, by using kinetic modeling and machine learning principles we are able to obtain more accurate ranges of kinetic parameters, and hence to reduce the uncertainty in the model analysis. We computed the distribution of kinetic parameters for glucose-fed E. coli producing 1,4-butanediol and discovered that the observed physiological state corresponds to a narrow range of kinetic parameters of only a few enzymes, whereas the kinetic parameters of other enzymes can vary widely. Furthermore, this analysis suggests which enzymes should be manipulated in order to engineer the reference state of the cell in a desired way. The proposed approach also lays the foundation for a novel class of approaches for efficient, non-asymptotic, uniform sampling of solution spaces. Copyright © 2015 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pritychenko, B.; Mughabghab, S.F.

    We present calculations of neutron thermal cross sections, Westcott factors, resonance integrals, Maxwellian-averaged cross sections and astrophysical reaction rates for 843 ENDF materials using data from the major evaluated nuclear libraries and European activation file. Extensive analysis of newly-evaluated neutron reaction cross sections, neutron covariances, and improvements in data processing techniques motivated us to calculate nuclear industry and neutron physics quantities, produce s-process Maxwellian-averaged cross sections and astrophysical reaction rates, systematically calculate uncertainties, and provide additional insights on currently available neutron-induced reaction data. Nuclear reaction calculations are discussed and new results are presented. Due to space limitations, the present paper contains only calculated Maxwellian-averaged cross sections and their uncertainties. The complete data sets for all results are published in the Brookhaven National Laboratory report.

  6. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    PubMed

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment reports only the model fit and the variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and index data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationships. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to integrate across the results and produce a single assessment that accounts for the multiple sources of uncertainty.
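The final model-averaging step might look like the following sketch, where the stock estimates, CVs, and equal weights are hypothetical; the key point is that between-model spread is added to within-model variance so the reported uncertainty reflects model selection as well as fitting:

```python
import numpy as np

# Hypothetical spawning-stock-biomass estimates (kt) from four assessment
# models with different natural-mortality/growth assumptions, each with
# its own CV from the fitting process.
estimates = np.array([25.0, 31.0, 28.5, 23.0])
cvs       = np.array([0.15, 0.20, 0.18, 0.25])

# Simple (equal-weight) model averaging: the multi-model mean, with the
# between-model variance added to the mean within-model variance.
mean = estimates.mean()
within  = np.mean((cvs * estimates) ** 2)   # average estimation variance
between = estimates.var()                   # spread across models
total_sd = np.sqrt(within + between)
print(f"averaged SSB = {mean:.1f} kt +/- {total_sd:.1f} kt")
```

Unequal weights (e.g. by model plausibility or fit quality) slot in by replacing the plain means with weighted ones.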

  7. A Nuclear Waste Management Cost Model for Policy Analysis

    NASA Astrophysics Data System (ADS)

    Barron, R. W.; Hill, M. C.

    2017-12-01

    Although integrated assessments of climate change policy have frequently identified nuclear energy as a promising alternative to fossil fuels, these studies have often treated nuclear waste disposal very simply. Simple assumptions about nuclear waste are problematic because they may not be adequate to capture relevant costs and uncertainties, which could result in suboptimal policy choices. Modeling nuclear waste management costs is a cross-disciplinary, multi-scale problem that involves economic, geologic and environmental processes that operate at vastly different temporal scales. Similarly, the climate-related costs and benefits of nuclear energy depend on environmental sensitivity to CO2 emissions and radiation, nuclear energy's ability to offset carbon emissions, and the risk of nuclear accidents, factors which are all deeply uncertain. Alternative value systems further complicate the problem by suggesting different approaches to valuing intergenerational impacts. Effective policy assessment of nuclear energy requires an integrated approach to modeling nuclear waste management that (1) bridges disciplinary and temporal gaps, (2) supports an iterative, adaptive process that responds to evolving understandings of uncertainties, and (3) supports a broad range of value systems. This work develops the Nuclear Waste Management Cost Model (NWMCM). NWMCM provides a flexible framework for evaluating the cost of nuclear waste management across a range of technology pathways and value systems. We illustrate how NWMCM can support policy analysis by estimating how different nuclear waste disposal scenarios developed using the NWMCM framework affect the results of a recent integrated assessment study of alternative energy futures and their effects on the cost of achieving carbon abatement targets. Results suggest that the optimism reflected in previous works is fragile: plausible nuclear waste management costs and discount rates appropriate for intergenerational cost-benefit analysis produce many scenarios where nuclear energy is economically unattractive.

  8. An integrative cross-design synthesis approach to estimate the cost of illness: an applied case to the cost of depression in Catalonia.

    PubMed

    Bendeck, Murielle; Serrano-Blanco, Antoni; García-Alonso, Carlos; Bonet, Pere; Jordà, Esther; Sabes-Figuera, Ramon; Salvador-Carulla, Luis

    2013-04-01

    Cost of illness (COI) studies are carried out under conditions of uncertainty and with incomplete information. There are concerns regarding their generalisability, accuracy and usability in evidence-informed care. A hybrid methodology is used to estimate the regional costs of depression in Catalonia (Spain) following an integrative approach. The cross-design synthesis included nominal groups and quantitative analysis of both top-down and bottom-up studies, and incorporated primary and secondary data from different sources of information in Catalonia. Sensitivity analysis used probabilistic Monte Carlo simulation modelling. A dissemination strategy was planned, including a standard form adapted from cost-effectiveness studies to summarise methods and results. The method used allows for a comprehensive estimate of the cost of depression in Catalonia. Health officers and decision-makers concluded that this methodology provided useful information and knowledge for evidence-informed planning in mental health. The mix of methods, combined with a simulation model, contributed to a reduction in data gaps and, in conditions of uncertainty, supplied more complete information on the costs of depression in Catalonia. This approach to COI should be differentiated from other COI designs to allow like-with-like comparisons. A consensus on COI typology, procedures and dissemination is needed.
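The probabilistic Monte Carlo simulation used in such a sensitivity analysis can be illustrated with a minimal sketch. All numbers below (population size, prevalence and cost distributions) are hypothetical placeholders, not the Catalonia estimates:

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative probabilistic sensitivity analysis for a cost-of-illness
# estimate (all inputs are hypothetical, not the study's results).
# COI = population x prevalence x mean annual cost per case.
n = 20_000
population = 7.5e6                                     # assumed fixed
prevalence = rng.beta(40, 960, n)                      # ~4 %, with sampling uncertainty
cost_per_case = rng.lognormal(np.log(1800.0), 0.3, n)  # EUR/case/year, right-skewed

coi = population * prevalence * cost_per_case
lo, hi = np.percentile(coi, [2.5, 97.5])
print(f"COI: median {np.median(coi)/1e6:.0f} M EUR, "
      f"95% interval [{lo/1e6:.0f}, {hi/1e6:.0f}] M EUR")
```

Propagating distributions rather than point estimates is what lets a COI study report an uncertainty interval instead of a single figure.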

  9. Integrating uncertainty into public energy research and development decisions

    NASA Astrophysics Data System (ADS)

    Anadón, Laura Díaz; Baker, Erin; Bosetti, Valentina

    2017-05-01

    Public energy research and development (R&D) is recognized as a key policy tool for transforming the world's energy system in a cost-effective way. However, managing the uncertainty surrounding technological change is a critical challenge for designing robust and cost-effective energy policies. The design of such policies is particularly important if countries are going to both meet the ambitious greenhouse-gas emissions reductions goals set by the Paris Agreement and achieve the required harmonization with the broader set of objectives dictated by the Sustainable Development Goals. The complexity of informing energy technology policy requires, and is producing, a growing collaboration between different academic disciplines and practitioners. Three analytical components have emerged to support the integration of technological uncertainty into energy policy: expert elicitations, integrated assessment models, and decision frameworks. Here we review efforts to incorporate all three approaches to facilitate public energy R&D decision-making under uncertainty. We highlight emerging insights that are robust across elicitations, models, and frameworks, relating to the allocation of public R&D investments, and identify gaps and challenges that remain.

  10. Application of an Integrated HPC Reliability Prediction Framework to HMMWV Suspension System

    DTIC Science & Technology

    2010-09-13

    model number M966 (TOW Missile Carrier, Basic Armor without weapons), since they were available. Tires used for all simulations were the bias-type...vehicle fleet, including consideration of all kinds of uncertainty, especially including model uncertainty. The end result will be a tool to use...building an adequate vehicle reliability prediction framework for military vehicles is the accurate modeling of the integration of various types of

  11. Nuclear Data Needs for the Neutronic Design of MYRRHA Fast Spectrum Research Reactor

    NASA Astrophysics Data System (ADS)

    Stankovskiy, A.; Malambu, E.; Van den Eynde, G.; Díez, C. J.

    2014-04-01

    A global sensitivity analysis of the effective neutron multiplication factor to the change of nuclear data library has been performed. It revealed that the test version of the JEFF-3.2 neutron-induced evaluated data library produces results closer to ENDF/B-VII.1 than JEFF-3.1.2 does. The analysis of the contributions of individual evaluations to the keff sensitivity resulted in a priority list of nuclides whose cross-section and fission neutron multiplicity uncertainties have to be improved by setting up dedicated differential and integral experiments.

  12. Analysis and Synthesis of Load Forecasting Data for Renewable Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steckler, N.; Florita, A.; Zhang, J.

    2013-11-01

    As renewable energy constitutes greater portions of the generation fleet, the importance of modeling uncertainty as part of integration studies also increases. In pursuit of optimal system operations, it is important to capture not only the definitive behavior of power plants, but also the risks associated with systemwide interactions. This research examines the dependence of load forecast errors on external predictor variables such as temperature, day type, and time of day. The analysis was utilized to create statistically relevant instances of sequential load forecasts with only a time series of historic, measured load available. The creation of such load forecasts relies on Bayesian techniques for informing and updating the model, thus providing a basis for networked and adaptive load forecast models in future operational applications.

  13. Leaf area index uncertainty estimates for model-data fusion applications

    Treesearch

    Andrew D. Richardson; D. Bryan Dail; D.Y. Hollinger

    2011-01-01

    Estimates of data uncertainties are required to integrate different observational data streams as model constraints using model-data fusion. We describe an approach with which random and systematic uncertainties in optical measurements of leaf area index [LAI] can be quantified. We use data from a measurement campaign at the spruce-dominated Howland Forest AmeriFlux...

  14. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Valdez, Lucas M.

    2012-07-26

    This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis is stated in the same methodology as that of the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analyses are used in a general and a specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
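The GUM-style combination of Type A and Type B components can be sketched as follows; the component list and its values are illustrative assumptions, not the budget from this paper:

```python
import math

# Hypothetical uncertainty budget for a dimensional measurement (values
# are illustrative, not from the paper). Each entry: (source, standard
# uncertainty in micrometres). Type A components come from statistics on
# repeated measurements; Type B components from other knowledge (calibration
# certificates, assumed distributions).
budget = [
    ("repeatability (Type A)",          0.8),
    ("scale calibration (Type B)",      0.5),
    ("thermal expansion (Type B)",      0.3),
    ("probe qualification (Type B)",    0.4),
]

# Per the GUM, uncorrelated standard uncertainties combine in quadrature
# (root sum of squares); the expanded uncertainty applies a coverage factor
# k (k = 2 gives roughly 95 % coverage for a normal distribution).
u_combined = math.sqrt(sum(u ** 2 for _, u in budget))
U_expanded = 2.0 * u_combined

print(f"combined standard uncertainty: {u_combined:.3f} um")
print(f"expanded uncertainty (k=2):    {U_expanded:.3f} um")
```

Once Type A and Type B components are expressed as standard uncertainties, they are treated identically in the combination, which is the point of the GUM methodology.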

  15. Advanced Stirling Convertor Heater Head Durability and Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Krause, David L.; Shah, Ashwin R.; Korovaichuk, Igor; Kalluri, Sreeramesh

    2008-01-01

    The National Aeronautics and Space Administration (NASA) has identified the high-efficiency Advanced Stirling Radioisotope Generator (ASRG) as a candidate power source for long-duration science missions, such as lunar applications, Mars rovers, and deep space missions, that require reliable design lifetimes of up to 17 years. Resistance to creep deformation of the MarM-247 heater head (HH), a structurally critical component of the ASRG Advanced Stirling Convertor (ASC), under high temperatures (up to 850 °C) is a key design driver for durability. Inherent uncertainties in the creep behavior of the thin-walled HH and the variations in the wall thickness, control temperature, and working gas pressure need to be accounted for in the life and reliability prediction. Due to the availability of very limited test data, assuring life and reliability of the HH is a challenging task. The NASA Glenn Research Center (GRC) has adopted an integrated approach combining available uniaxial MarM-247 material behavior testing, HH benchmark testing and advanced analysis in order to demonstrate the integrity, life and reliability of the HH under expected mission conditions. This paper describes analytical aspects of the deterministic and probabilistic approaches and results. The deterministic approach involves development of a creep constitutive model for MarM-247 (akin to the Oak Ridge National Laboratory master curve model used previously for Inconel 718 (Special Metals Corporation)) and nonlinear finite element analysis to predict the mean life. The probabilistic approach includes evaluation of the effect of design variable uncertainties in material creep behavior, geometry and operating conditions on life and reliability for the expected life. The sensitivity of the uncertainties in the design variables on the HH reliability is also quantified, and guidelines to improve reliability are discussed.

  16. Phylogenetic analysis of molecular and morphological data highlights uncertainty in the relationships of fossil and living species of Elopomorpha (Actinopterygii: Teleostei).

    PubMed

    Dornburg, Alex; Friedman, Matt; Near, Thomas J

    2015-08-01

    Elopomorpha is one of the three main clades of living teleost fishes and includes a range of disparate lineages including eels, tarpons, bonefishes, and halosaurs. Elopomorphs were among the first groups of fishes investigated using Hennigian phylogenetic methods and continue to be the object of intense phylogenetic scrutiny due to their economic significance, diversity, and crucial evolutionary status as the sister group of all other teleosts. While portions of the phylogenetic backbone for Elopomorpha are consistent between studies, the relationships among Albula, Pterothrissus, Notacanthiformes, and Anguilliformes remain contentious and difficult to evaluate. This lack of phylogenetic resolution is problematic as fossil lineages are often described and placed taxonomically based on an assumed sister group relationship between Albula and Pterothrissus. In addition, phylogenetic studies using morphological data that sample elopomorph fossil lineages often do not include notacanthiform or anguilliform lineages, potentially introducing a bias toward interpreting fossils as members of the common stem of Pterothrissus and Albula. Here we provide a phylogenetic analysis of DNA sequences sampled from multiple nuclear genes that include representative taxa from Albula, Pterothrissus, Notacanthiformes and Anguilliformes. We integrate our molecular dataset with a morphological character matrix that spans both living and fossil elopomorph lineages. Our results reveal substantial uncertainty in the placement of Pterothrissus as well as all sampled fossil lineages, questioning the stability of the taxonomy of fossil Elopomorpha. However, despite topological uncertainty, our integration of fossil lineages into a Bayesian time calibrated framework provides divergence time estimates for the clade that are consistent with previously published age estimates based on the elopomorph fossil record and molecular estimates resulting from traditional node-dating methods. 
Copyright © 2015 Elsevier Inc. All rights reserved.

  17. Characterizing permafrost active layer dynamics and sensitivity to landscape spatial heterogeneity in Alaska

    NASA Astrophysics Data System (ADS)

    Yi, Yonghong; Kimball, John S.; Chen, Richard H.; Moghaddam, Mahta; Reichle, Rolf H.; Mishra, Umakant; Zona, Donatella; Oechel, Walter C.

    2018-01-01

    An important feature of the Arctic is large spatial heterogeneity in active layer conditions, which is generally poorly represented by global models and can lead to large uncertainties in predicting regional ecosystem responses and climate feedbacks. In this study, we developed a spatially integrated modeling and analysis framework combining field observations, local-scale (~50 m resolution) active layer thickness (ALT) and soil moisture maps derived from low-frequency (L + P-band) airborne radar measurements, and global satellite environmental observations to investigate the ALT sensitivity to recent climate trends and landscape heterogeneity in Alaska. Modeled ALT results show good correspondence with in situ measurements in higher-permafrost-probability (PP ≥ 70 %) areas (n = 33; R = 0.60; mean bias = 1.58 cm; RMSE = 20.32 cm), but with larger uncertainty in sporadic and discontinuous permafrost areas. The model results also reveal widespread ALT deepening since 2001, with smaller ALT increases in northern Alaska (mean trend = 0.32 ± 1.18 cm yr⁻¹) and much larger increases (> 3 cm yr⁻¹) across interior and southern Alaska. The positive ALT trend coincides with regional warming and a longer snow-free season (R = 0.60 ± 0.32). A spatially integrated analysis of the radar retrievals and model sensitivity simulations demonstrated that uncertainty in the spatial and vertical distribution of soil organic carbon (SOC) was the largest factor affecting modeled ALT accuracy, while soil moisture played a secondary role. Potential improvements in characterizing SOC heterogeneity, including better spatial sampling of soil conditions and advances in remote sensing of SOC and soil moisture, will enable more accurate predictions of active layer conditions and refinement of the modeling framework across a larger domain.

  18. Joint Peru/United States report on Peru/United States cooperative energy assessment. Volume 1. Executive summary, main report and appendices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-08-01

    In 1978, the US and Peru conducted a comprehensive assessment of Peru's energy resources, needs, and uses and developed several alternative energy strategies that utilize the available resources to meet Peru's energy requirements. This Volume I reports the findings of the assessment and contains the executive summary, the main report, and five appendices of information that support the integrated energy supply and demand analysis. The following chapters are included: The Energy Situation in Peru (economic context and background, energy resources and production, energy consumption patterns); Reference Supply and Demand Projection (approach, procedures, and assumptions; economic projections; energy demand and supply projections; supply/demand integration; uncertainties); and The Development of Strategies and Options (the analysis of options; strategies; increased use of renewables, hydropower, coal; increased energy efficiency; and financial analysis of strategies).

  19. Uncertainties in Integrated Climate Change Impact Assessments by Sub-setting GCMs Based on Annual as well as Crop Growing Period under Rice Based Farming System of Indo-Gangetic Plains of India

    NASA Astrophysics Data System (ADS)

    Pillai, S. N.; Singh, H.; Panwar, A. S.; Meena, M. S.; Singh, S. V.; Singh, B.; Paudel, G. P.; Baigorria, G. A.; Ruane, A. C.; McDermid, S.; Boote, K. J.; Porter, C.; Valdivia, R. O.

    2016-12-01

    Integrated assessment of climate change impacts on agricultural productivity is a challenge to the scientific community due to uncertainties in the input data, particularly the climate, soil, crop calibration and socio-economic datasets. However, the selection of GCMs is the major source of uncertainty, due to the complex underlying processes involved in the initial and boundary conditions dealt with in solving air-sea interactions. Under the Agricultural Model Intercomparison and Improvement Project (AgMIP), the Indo-Gangetic Plains Regional Research Team investigated the uncertainties caused by the selection of GCMs through sub-setting based on annual as well as crop-growing-period climate of rice-wheat systems in the AgMIP Integrated Assessment methodology. The AgMIP Phase II protocols were used to link climate, crop and economic models for two study sites, Meerut and Karnal, to analyse the sensitivity of current production systems to climate change. Climate change projections were made using 29 CMIP5 GCMs under RCP4.5 and RCP8.5 for the mid-century period (2040-2069). Two crop models (APSIM and DSSAT) were used, and the TOA-MD economic model was used for integrated assessment. Based on RAPs (Representative Agricultural Pathways), parameters that cannot be obtained through modeling were derived from the literature and from interactions with stakeholders, and were incorporated into the TOA-MD model for integrated assessment.

  20. Attributing runoff changes to climate variability and human activities: uncertainty analysis using four monthly water balance models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Shuai; Xiong, Lihua; Li, Hong-Yi

    2015-05-26

    Hydrological simulations to delineate the impacts of climate variability and human activities are subject to uncertainties related to both the parameters and the structure of the hydrological models. To analyze the impact of these uncertainties on model performance and to yield more reliable simulation results, a global calibration and multimodel combination method was proposed that integrates the Shuffled Complex Evolution Metropolis (SCEM) algorithm and Bayesian Model Averaging (BMA) of four monthly water balance models. The method was applied to the Weihe River Basin (WRB), the largest tributary of the Yellow River, to determine the contribution of climate variability and human activities to runoff changes. The change point, which was used to determine the baseline period (1956-1990) and the human-impacted period (1991-2009), was derived using both a cumulative curve and Pettitt's test. Results show that the combination method from SCEM provides more skillful deterministic predictions than the best calibrated individual model, resulting in the smallest uncertainty interval of runoff changes attributed to climate variability and human activities. This combination methodology provides a practical and flexible tool for attributing runoff changes to climate variability and human activities with hydrological models.
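The BMA combination step (separate from the SCEM calibration) can be sketched as below. The four "models" are synthetic stand-ins with Gaussian errors, and weighting by a Gaussian likelihood of the residuals is one common way to approximate the posterior model probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: observed monthly runoff and predictions from four
# hypothetical water balance models (stand-ins for real model output).
obs = rng.normal(100.0, 20.0, size=120)
preds = np.stack([obs + rng.normal(0.0, s, size=120) for s in (5, 10, 15, 25)])

def bma_weights(preds, obs):
    """Approximate BMA weights from each model's Gaussian likelihood of the
    residuals, assuming equal prior model probabilities."""
    sig2 = ((preds - obs) ** 2).mean(axis=1)   # per-model error variance
    loglik = -0.5 * len(obs) * np.log(sig2)    # Gaussian log-likelihood (up to a constant)
    w = np.exp(loglik - loglik.max())          # stabilize before normalizing
    return w / w.sum()

w = bma_weights(preds, obs)
combined = w @ preds                           # weighted-average prediction
print("weights:", np.round(w, 3))
```

Because the weights sum to one and concentrate on the best-performing model, the combined prediction is at least as skillful as the weighted models while retaining a spread that reflects structural uncertainty.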

  1. Modeling sustainability in renewable energy supply chain systems

    NASA Astrophysics Data System (ADS)

    Xie, Fei

    This dissertation aims to model the sustainability of renewable fuel supply chain systems against emerging challenges. In particular, it focuses on biofuel supply chain system design and develops advanced modeling frameworks and corresponding solution methods for the challenges in sustaining biofuel supply chain systems. These challenges include: (1) integrating "environmental thinking" into long-term biofuel supply chain planning; (2) adopting multimodal transportation to mitigate seasonality in biofuel supply chain operations; (3) providing strategies for hedging against uncertainty in conversion technology; and (4) developing methodologies for long-term sequential planning of the biofuel supply chain under uncertainties. All models are mixed integer programs, which also involve multi-objective programming and two-stage/multistage stochastic programming methods. In particular, for the long-term sequential planning under uncertainties, to reduce the computational challenges due to the exponential expansion of the scenario tree, I also developed an efficient ND-Max method that is more efficient than CPLEX and the Nested Decomposition method. Through result analysis of four independent studies, it is found that the proposed modeling frameworks can effectively improve economic performance, enhance environmental benefits and reduce risks due to system uncertainties for biofuel supply chain systems.

  2. Probabilistic cost estimates for climate change mitigation.

    PubMed

    Rogelj, Joeri; McCollum, David L; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-01-03

    For more than a decade, the target of keeping global warming below 2 °C has been a key focus of the international climate debate. In response, the scientific community has published a number of scenario studies that estimate the costs of achieving such a target. Producing these estimates remains a challenge, particularly because of relatively well known, but poorly quantified, uncertainties, and owing to limited integration of scientific knowledge across disciplines. The integrated assessment community, on the one hand, has extensively assessed the influence of technological and socio-economic uncertainties on low-carbon scenarios and associated costs. The climate modelling community, on the other hand, has spent years improving its understanding of the geophysical response of the Earth system to emissions of greenhouse gases. This geophysical response remains a key uncertainty in the cost of mitigation scenarios but has been integrated with assessments of other uncertainties in only a rudimentary manner, that is, for equilibrium conditions. Here we bridge this gap between the two research communities by generating distributions of the costs associated with limiting transient global temperature increase to below specific values, taking into account uncertainties in four factors: geophysical, technological, social and political. We find that political choices that delay mitigation have the largest effect on the cost-risk distribution, followed by geophysical uncertainties, social factors influencing future energy demand and, lastly, technological uncertainties surrounding the availability of greenhouse gas mitigation options. 
Our information on temperature risk and mitigation costs provides crucial information for policy-making, because it clarifies the relative importance of mitigation costs, energy demand and the timing of global action in reducing the risk of exceeding a global temperature increase of 2 °C, or other limits such as 3 °C or 1.5 °C, across a wide range of scenarios.

  3. Forecasting Responses of a Northern Peatland Carbon Cycle to Elevated CO2 and a Gradient of Experimental Warming

    NASA Astrophysics Data System (ADS)

    Jiang, Jiang; Huang, Yuanyuan; Ma, Shuang; Stacy, Mark; Shi, Zheng; Ricciuto, Daniel M.; Hanson, Paul J.; Luo, Yiqi

    2018-03-01

    The ability to forecast ecological carbon cycling is imperative to land management in a world where past carbon fluxes are no longer a clear guide in the Anthropocene. However, carbon-flux forecasting has not been practiced routinely like numerical weather prediction. This study explored (1) the relative contributions of model forcing data and parameters to uncertainty in forecasting flux- versus pool-based carbon cycle variables and (2) the time points when temperature and CO2 treatments may cause statistically detectable differences in those variables. We developed an online forecasting workflow (Ecological Platform for Assimilation of Data (EcoPAD)), which facilitates iterative data-model integration. EcoPAD automates data transfer from sensor networks, data assimilation, and ecological forecasting. We used the Spruce and Peatland Responses Under Changing Environments (SPRUCE) experiment data collected from 2011 to 2014 to constrain the parameters in the Terrestrial Ecosystem Model, forecast carbon cycle responses to elevated CO2 and a gradient of warming from 2015 to 2024, and specify uncertainties in the model output. Our results showed that data assimilation substantially reduces forecasting uncertainties. Interestingly, we found that the stochasticity of future external forcing contributed more than model parameters to the uncertainty in forecasting the future dynamics of C flux-related variables, whereas parameter uncertainty is the primary contributor to the uncertainty in forecasting C pool-related response variables. Given the uncertainties in forecasting carbon fluxes and pools, our analysis showed that statistically different responses to the various CO2 and warming treatments were observed sooner for fast-turnover pools than for slow-turnover pools. Our study has identified the sources of uncertainty in model prediction and thus will help improve ecological carbon cycling forecasts in the future.

  4. DGSA: A Matlab toolbox for distance-based generalized sensitivity analysis of geoscientific computer experiments

    NASA Astrophysics Data System (ADS)

    Park, Jihoon; Yang, Guang; Satija, Addy; Scheidt, Céline; Caers, Jef

    2016-12-01

    Sensitivity analysis plays an important role in geoscientific computer experiments, whether for forecasting, data assimilation or model calibration. In this paper we focus on an extension of a method of regionalized sensitivity analysis (RSA) to applications typical in the Earth Sciences. Such applications involve the building of large complex spatial models, the application of computationally extensive forward modeling codes and the integration of heterogeneous sources of model uncertainty. The aim of this paper is to be practical: 1) provide a Matlab code, 2) provide novel visualization methods to aid users in getting a better understanding of the sensitivity, 3) provide a method based on kernel principal component analysis (KPCA) and self-organizing maps (SOM) to account for spatial uncertainty typical in Earth Science applications, and 4) provide an illustration on a real field case where the above-mentioned complexities present themselves. We present methods that extend the original RSA method in several ways. First we present the calculation of conditional effects, defined as the sensitivity of a parameter given a level of another parameter. Second, we show how this conditional effect can be used to choose nominal values or ranges to fix insensitive parameters, aiming to minimally affect uncertainty in the response. Third, we develop a method based on KPCA and SOM to assign a rank to spatial models in order to calculate the sensitivity to spatial variability in the models. A large oil/gas reservoir case is used as an illustration of these ideas.

  5. IMPACT2C: Quantifying projected impacts under 2°C warming

    NASA Astrophysics Data System (ADS)

    Jacob, D.; Kotova, L.; Impact2C Team

    2012-04-01

    Political discussions on the European goal to limit global warming to 2°C demand that information on projected impacts and possible benefits be provided to society by the best available science. The new project IMPACT2C is supported by the European Commission's 7th Framework Programme as a 4-year large-scale integrating project. IMPACT2C is coordinated by the Climate Service Center, Helmholtz-Zentrum Geesthacht. IMPACT2C enhances knowledge, quantifies climate change impacts, and adopts a clear and logical structure, with climate and impacts modelling, vulnerabilities, risks and economic costs, as well as potential responses, within a pan-European sector-based analysis. The project utilises a range of models within a multi-disciplinary international expert team and assesses effects on water, energy, infrastructure, coasts, tourism, forestry, agriculture, ecosystems services, and health and air quality-climate interactions. IMPACT2C introduces key innovations. First, harmonised socio-economic assumptions/scenarios will be used, to ensure that both individual and cross-sector assessments are aligned to the 2°C (1.5°C) scenario for both impacts and adaptation, e.g. in relation to land-use pressures between agriculture and forestry. Second, it has a core theme of uncertainty, and will develop a methodological framework integrating the uncertainties within and across the different sectors in a consistent way. In so doing, analysis of adaptation responses under uncertainty will be enhanced. Finally, a cross-sectoral perspective is adopted to complement the sector analysis. A number of case studies will be developed for particularly vulnerable areas, subject to multiple impacts (e.g. the Mediterranean), with the focus being on cross-sectoral interactions (e.g. land use competition) and cross-cutting themes (e.g. cities).
The project also assesses climate change impacts in some of the world's most vulnerable regions: Bangladesh, Africa (Nile and Niger basins), and the Maldives. An overview about the scientific goals and the structure of IMPACT2C will be presented.

  6. Representation of analysis results involving aleatory and epistemic uncertainty.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis

    2008-08-01

    Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for the representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.
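A family of CDFs of the kind described arises naturally from a double-loop (nested) sampling scheme: the outer loop samples the epistemic quantity, the inner loop samples the aleatory randomness given that value. The distributions below are illustrative assumptions, not from the report:

```python
import numpy as np

rng = np.random.default_rng(1)

# Epistemic uncertainty: a fixed-but-poorly-known mean, represented here by
# an interval sampled in the outer loop. Aleatory uncertainty: inherent
# randomness, sampled in the inner loop. Each epistemic sample yields one
# empirical CDF, so the result is a *family* of CDFs, not a single one.
n_epistemic, n_aleatory = 20, 1000
family = []
for _ in range(n_epistemic):
    mu = rng.uniform(8.0, 12.0)                  # epistemic: poorly known mean
    samples = rng.normal(mu, 1.0, n_aleatory)    # aleatory: randomness given mu
    family.append(np.sort(samples))              # sorted samples define an empirical CDF

family = np.array(family)                        # shape: (n_epistemic, n_aleatory)
# Pointwise envelope of the family (cf. probability-box style summaries):
lower, upper = family.min(axis=0), family.max(axis=0)
print(family.shape)
```

Plotting all rows of `family` as step functions gives the graphical format the report discusses: a band of CDFs whose spread visualizes epistemic uncertainty, while each individual curve visualizes aleatory uncertainty.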

  7. A climate robust integrated modelling framework for regional impact assessment of climate change

    NASA Astrophysics Data System (ADS)

    Janssen, Gijs; Bakker, Alexander; van Ek, Remco; Groot, Annemarie; Kroes, Joop; Kuiper, Marijn; Schipper, Peter; van Walsum, Paul; Wamelink, Wieger; Mol, Janet

    2013-04-01

    Decision making towards climate-proofing the water management of regional catchments can benefit greatly from the availability of a climate robust integrated modelling framework, capable of a consistent assessment of climate change impacts on the various interests present in the catchments. In the Netherlands, much effort has been devoted to developing state-of-the-art regional dynamic groundwater models with a very high spatial resolution (25×25 m²). Still, these models are not completely satisfactory to decision makers because the modelling concepts do not take into account feedbacks between meteorology, vegetation/crop growth, and hydrology. This introduces uncertainties in forecasting the effects of climate change on groundwater, surface water, agricultural yields, and the development of groundwater-dependent terrestrial ecosystems. These uncertainties add to the uncertainties about the predictions on climate change itself. In order to create an integrated, climate robust modelling framework, we coupled existing model codes on hydrology, agriculture and nature that are currently in use at the different research institutes in the Netherlands. The modelling framework consists of the model codes MODFLOW (groundwater flow), MetaSWAP (vadose zone), WOFOST (crop growth), SMART2-SUMO2 (soil-vegetation) and NTM3 (nature valuation). MODFLOW, MetaSWAP and WOFOST are coupled online (i.e. exchange information on a time-step basis). Thus, changes in meteorology and CO2 concentrations affect crop growth, and feedbacks between crop growth, vadose zone water movement and groundwater recharge are accounted for. The model chain WOFOST-MetaSWAP-MODFLOW generates hydrological input for the ecological prediction model combination SMART2-SUMO2-NTM3. The modelling framework was used to support the regional water management decision-making process in the 267 km² Baakse Beek-Veengoot catchment in the east of the Netherlands.
    Computations were performed for regionalized 30-year climate change scenarios developed by KNMI for precipitation and reference evapotranspiration according to Penman-Monteith. A special focus in the project was on the role of uncertainty. How valid is the information generated by this modelling framework? What are the most important uncertainties in the input data, how do they affect the results of the model chain, and how can the uncertainties of the data, results, and model concepts be quantified and communicated? Besides these technical issues, an important part of the study was devoted to the perception of stakeholders. Stakeholder analysis and additional working sessions yielded insight into how the models, their results and the uncertainties are perceived, how the modelling framework and results connect to the stakeholders' information demands, and what kind of additional information is needed for adequate decision support.

  8. Uncertainty quantification of surface-water/groundwater exchange estimates in large wetland systems using Python

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; Metz, P. A.

    2014-12-01

    Most watershed studies include observation-based water budget analyses to develop first-order estimates of significant flow terms. Surface-water/groundwater (SWGW) exchange is typically assumed to be equal to the residual of the sum of inflows and outflows in a watershed. These estimates of SWGW exchange, however, are highly uncertain as a result of the propagation of uncertainty inherent in the calculation or processing of the other terms of the water budget, such as stage-area-volume relations, and uncertainties associated with land-cover based evapotranspiration (ET) rate estimates. Furthermore, the uncertainty of estimated SWGW exchanges can be magnified in large wetland systems that transition from dry to wet during wet periods. Although it is well understood that observation-based estimates of SWGW exchange are uncertain, it is uncommon for the uncertainty of these estimates to be directly quantified. High-level programming languages like Python can greatly reduce the effort required to (1) quantify the uncertainty of estimated SWGW exchange in large wetland systems and (2) evaluate how different approaches for partitioning land-cover data in a watershed may affect the water-budget uncertainty. We have used Python with the NumPy, scipy.stats, and pyDOE packages to implement an unconstrained Monte Carlo approach with Latin Hypercube sampling to quantify the uncertainty of monthly estimates of SWGW exchange in the Floral City watershed of the Tsala Apopka wetland system in west-central Florida, USA. Possible sources of uncertainty in the water budget analysis include rainfall, ET, canal discharge, and land/bathymetric surface elevations. Each of these input variables was assigned a probability distribution based on observation error or spanning the range of probable values. The Monte Carlo integration process exposes the uncertainties in land-cover based ET rate estimates as the dominant contributor to the uncertainty in SWGW exchange estimates.
We will discuss the uncertainty of SWGW exchange estimates using an ET model that partitions the watershed into open water and wetland land-cover types. We will also discuss the uncertainty of SWGW exchange estimates calculated using ET models partitioned into additional land-cover types.
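    The Monte Carlo approach with Latin Hypercube sampling described in this record can be sketched in a few lines of NumPy. The budget terms, ranges, and variable names below are hypothetical stand-ins (not the Floral City values), and the pyDOE sampler is replaced by a minimal hand-rolled Latin hypercube so the sketch is self-contained:

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_vars, rng):
    """Minimal Latin hypercube sampler on the unit hypercube."""
    # One randomly placed point per stratum, shuffled independently per variable.
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])
    return u

# Hypothetical monthly water-budget terms (mm), each spanning a probable range:
# rainfall P, evapotranspiration ET, canal discharge Q, storage change dS.
lo = np.array([108.0, 54.0, 11.0, -1.0])
hi = np.array([132.0, 126.0, 19.0, 11.0])   # ET given the widest range

n = 10_000
u = latin_hypercube(n, 4, rng)
samples = lo + u * (hi - lo)   # map the strata onto the uniform ranges

P, ET, Q, dS = samples.T
swgw = P - ET - Q - dS   # SWGW exchange estimated as the budget residual

print(f"SWGW exchange: mean {swgw.mean():.1f} mm, sd {swgw.std():.1f} mm")
```

    Because ET is given the widest range, it dominates the spread of the residual, mirroring the record's finding that ET uncertainty is the dominant contributor.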

  9. Measurement uncertainty analysis techniques applied to PV performance measurements

    NASA Astrophysics Data System (ADS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
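    The core arithmetic of such an analysis is the root-sum-square combination of independent uncertainty components into a combined standard uncertainty and an expanded uncertainty interval. The elemental values below are hypothetical, chosen only to illustrate the mechanics for a PV power measurement:

```python
import math

# Hypothetical elemental uncertainties for a PV module power measurement,
# expressed as relative standard uncertainties (fractions of reading).
systematic = [0.010, 0.005, 0.008]   # e.g. irradiance sensor, temperature, calibration
random_terms = [0.004, 0.003]        # e.g. repeatability, electronic noise

def rss(terms):
    """Root-sum-square combination of independent uncertainty components."""
    return math.sqrt(sum(t * t for t in terms))

b = rss(systematic)      # combined systematic (bias) uncertainty
s = rss(random_terms)    # combined random uncertainty
u_c = rss([b, s])        # combined standard uncertainty
U = 2.0 * u_c            # expanded uncertainty, coverage factor k=2 (~95%)

power = 250.0  # W, hypothetical measured maximum power
print(f"combined standard uncertainty {100*u_c:.2f}% -> {power:.0f} W ± {U*power:.1f} W")
```

    A pre-test analysis uses exactly this calculation with estimated component values to check whether the planned measurement can meet its stated objective.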

  10. Integrated Research on the Development of Global Climate Risk Management Strategies - Framework and Initial Results of the Research Project ICA-RUS

    NASA Astrophysics Data System (ADS)

    Emori, Seita; Takahashi, Kiyoshi; Yamagata, Yoshiki; Oki, Taikan; Mori, Shunsuke; Fujigaki, Yuko

    2013-04-01

    With the aim of proposing strategies of global climate risk management, we have launched a five-year research project called ICA-RUS (Integrated Climate Assessment - Risks, Uncertainties and Society). In this project with the phrase "risk management" in its title, we strive for a comprehensive assessment of climate change risks, explicit consideration of uncertainties, utilization of the best available information, and consideration of all possible conditions and options. We also regard the problem as one of decision-making at the human level, which involves social value judgments and adapts to future changes in circumstances. The ICA-RUS project consists of the following five themes: 1) Synthesis of global climate risk management strategies, 2) Optimization of land, water and ecosystem uses for climate risk management, 3) Identification and analysis of critical climate risks, 4) Evaluation of climate risk management options under technological, social and economic uncertainties and 5) Interactions between scientific and social rationalities in climate risk management (see also: http://www.nies.go.jp/ica-rus/en/). For the integration of quantitative knowledge of climate change risks and responses, we apply a tool named AIM/Impact [Policy], which consists of an energy-economic model, a simplified climate model and impact projection modules. At the same time, in order to make use of qualitative knowledge as well, we hold monthly project meetings for the discussion of risk management strategies and publish annual reports based on the quantitative and qualitative information. To enhance the comprehensiveness of the analyses, we maintain an inventory of risks and risk management options. The inventory is revised iteratively through interactive meetings with stakeholders such as policymakers, government officials and industrial representatives.

  11. Line-of-sight extrapolation noise in dust polarization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poh, Jason; Dodelson, Scott

    The B-modes of polarization at frequencies ranging from 50 to 1000 GHz are produced by Galactic dust, lensing of primordial E-modes in the cosmic microwave background (CMB) by intervening large scale structure, and possibly by primordial B-modes in the CMB imprinted by gravitational waves produced during inflation. The conventional method used to separate the dust component of the signal is to assume that the signal at high frequencies (e.g., 350 GHz) is due solely to dust and then extrapolate the signal down to lower frequency (e.g., 150 GHz) using the measured scaling of the polarized dust signal amplitude with frequency. For typical Galactic thermal dust temperatures of about 20 K, these frequencies are not fully in the Rayleigh-Jeans limit. Therefore, deviations in the dust cloud temperatures from cloud to cloud will lead to different scaling factors for clouds of different temperatures. Hence, when multiple clouds of different temperatures and polarization angles contribute to the integrated line-of-sight polarization signal, the relative contribution of individual clouds to the integrated signal can change between frequencies. This can cause the integrated signal to be decorrelated in both amplitude and direction when extrapolating in frequency. Here we carry out a Monte Carlo analysis on the impact of this line-of-sight extrapolation noise, enabling us to quantify its effect. Using results from the Planck experiment, we find that this effect is small, more than an order of magnitude smaller than the current uncertainties. However, line-of-sight extrapolation noise may be a significant source of uncertainty in future low-noise primordial B-mode experiments. Scaling from Planck results, we find that accounting for this uncertainty becomes potentially important when experiments are sensitive to primordial B-mode signals with amplitude r < 0.0015.
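    The temperature dependence of the extrapolation factor can be seen directly from a modified-blackbody dust spectrum. The sketch below computes the 350→150 GHz scaling factor for a few cloud temperatures; the emissivity index β = 1.6 is an assumed illustrative value, not a number from this record:

```python
import math

h = 6.62607015e-34  # Planck constant, J s
k = 1.380649e-23    # Boltzmann constant, J/K

def dust_sed(nu_ghz, T, beta=1.6):
    """Modified-blackbody dust intensity (arbitrary units): nu^beta * B_nu(T)."""
    nu = nu_ghz * 1e9
    x = h * nu / (k * T)          # ~0.84 at 350 GHz and 20 K: not Rayleigh-Jeans
    return nu ** (beta + 3.0) / math.expm1(x)

def scaling_factor(T, nu_lo=150.0, nu_hi=350.0, beta=1.6):
    """Factor used to extrapolate the dust signal from nu_hi down to nu_lo."""
    return dust_sed(nu_lo, T, beta) / dust_sed(nu_hi, T, beta)

# Clouds at different temperatures scale differently between 350 and 150 GHz;
# summed along one line of sight, this mismatch is the extrapolation noise.
for T in (15.0, 20.0, 25.0):
    print(f"T = {T:4.1f} K: 350 -> 150 GHz factor = {scaling_factor(T):.4f}")
```

    Because the factor varies with temperature, two clouds on the same line of sight contribute in different proportions at 150 GHz than at 350 GHz, decorrelating the summed signal.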

  12. Towards a Multi-Resolution Model of Seismic Risk in Central Asia. Challenge and perspectives

    NASA Astrophysics Data System (ADS)

    Pittore, M.; Wieland, M.; Bindi, D.; Parolai, S.

    2011-12-01

    Assessing seismic risk, defined as the probability of occurrence of economic and social losses as a consequence of an earthquake, both at regional and at local scale, is a challenging, multi-disciplinary task. In order to provide a reliable estimate, diverse information must be gathered by seismologists, geologists, engineers and civil authorities, and carefully integrated, taking into account the different levels of uncertainty. Research towards an integrated methodology able to seamlessly describe seismic risk at different spatial scales is challenging, but it opens new application perspectives, particularly in those countries which face significant seismic hazard but lack the resources for a standard assessment. Central Asian countries in particular, which exhibit some of the highest seismic hazard in the world, are experiencing steady demographic growth, often accompanied by informal settlements and urban sprawl. A reliable evaluation of how these factors affect the seismic risk, together with a realistic assessment of the assets exposed to seismic hazard and their structural vulnerability, is of particular importance in order to undertake proper mitigation actions and to promptly and efficiently react to a catastrophic event. New strategies are needed to efficiently cope with systematic lack of information and uncertainties. An original approach is presented to assess seismic risk based on the integration of information coming from remote sensing and ground-based panoramic imaging, in situ measurements, expert knowledge and already available data. Efficient sampling strategies based on freely available medium-resolution multi-spectral satellite images are adopted to optimize data collection and validation, in a multi-scale approach. Panoramic imaging is also considered as a valuable ground-based visual data collection technique, suitable both for manual and automatic analysis.
    A fully probabilistic framework based on Bayesian networks is proposed to integrate the available information, taking into account both aleatory and epistemic uncertainties. An improved risk model for Bishkek, the capital of the Kyrgyz Republic, has been developed following this approach and tested against different earthquake scenarios. Preliminary results will be presented and discussed.
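    The Bayesian-network integration step can be illustrated with a minimal discrete example. The nodes, conditional probability table and numbers below are hypothetical, chosen only to show how hazard and vulnerability information combine and how an observation updates an uncertain input:

```python
# Minimal discrete Bayesian-network sketch (hypothetical probabilities):
# a hazard node, a vulnerability node, and a damage node conditioned on both.
p_gm = {"weak": 0.7, "strong": 0.3}    # ground-motion (hazard) node
p_vuln = {"low": 0.6, "high": 0.4}     # structural vulnerability node (epistemic)
p_damage = {                           # CPT: P(damage = yes | gm, vuln)
    ("weak", "low"): 0.02, ("weak", "high"): 0.10,
    ("strong", "low"): 0.25, ("strong", "high"): 0.70,
}

# Marginal probability of damage: sum over the parent nodes.
p = sum(p_gm[g] * p_vuln[v] * p_damage[(g, v)]
        for g in p_gm for v in p_vuln)
print(f"P(damage) = {p:.4f}")

# Posterior on vulnerability given observed damage (Bayes' rule): one way
# field observations can sharpen an epistemically uncertain input.
post_high = sum(p_gm[g] * p_vuln["high"] * p_damage[(g, "high")] for g in p_gm) / p
print(f"P(vuln = high | damage) = {post_high:.4f}")
```

    A real risk model would have many more nodes (exposure, building type, losses), but the marginalization and updating mechanics are exactly these.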

  13. Bayesian models for comparative analysis integrating phylogenetic uncertainty.

    PubMed

    de Villemereuil, Pierre; Wells, Jessie A; Edwards, Robert D; Blomberg, Simon P

    2012-06-28

    Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. 
We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language.
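    The central idea, integrating over a posterior set of trees rather than conditioning on one consensus tree, can be sketched without BUGS or JAGS. The NumPy stand-in below fits one phylogenetic GLS regression per "tree" and pools the slopes; the synthetic covariance matrices and data are hypothetical placeholders for real posterior trees and trait values:

```python
import numpy as np

rng = np.random.default_rng(0)
n_species = 8

# Hypothetical stand-in for a Bayesian posterior tree set: each "tree"
# contributes a slightly different phylogenetic covariance matrix C.
def random_phylo_cov(rng, n):
    # Shared-branch structure approximated by a random positive-definite matrix.
    A = rng.random((n, n)) * 0.3
    return A @ A.T + np.eye(n)

trees = [random_phylo_cov(rng, n_species) for _ in range(200)]

x = rng.normal(size=n_species)
y = 2.0 * x + rng.normal(scale=0.5, size=n_species)  # simulated data, true slope = 2

def gls_slope(x, y, C):
    """Generalized least squares slope under phylogenetic covariance C."""
    Ci = np.linalg.inv(C)
    X = np.column_stack([np.ones_like(x), x])
    beta = np.linalg.solve(X.T @ Ci @ X, X.T @ Ci @ y)
    return beta[1]

# Integrate over phylogenetic uncertainty: one GLS fit per posterior tree,
# then summarize the pooled distribution of slope estimates.
slopes = np.array([gls_slope(x, y, C) for C in trees])
print(f"slope {slopes.mean():.2f} ± {slopes.std():.2f} (over {len(trees)} trees)")
```

    The spread of `slopes` reflects phylogenetic uncertainty that a single-consensus-tree analysis would hide, which is why the pooled interval is the more realistic one.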

  14. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    PubMed Central

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. 
We show that BUGS is a useful, flexible general purpose tool for phylogenetic comparative analyses, particularly for modelling in the face of phylogenetic uncertainty and accounting for measurement error or individual variation in explanatory variables. Code for all models is provided in the BUGS model description language. PMID:22741602

  15. Algorithms and Object-Oriented Software for Distributed Physics-Based Modeling

    NASA Technical Reports Server (NTRS)

    Kenton, Marc A.

    2001-01-01

    The project seeks to develop methods to more efficiently simulate aerospace vehicles. The goals are to reduce model development time, increase accuracy (e.g., by allowing the integration of multidisciplinary models), facilitate collaboration by geographically distributed groups of engineers, support uncertainty analysis and optimization, reduce hardware costs, and increase execution speeds. These problems are the subject of considerable contemporary research (e.g., Biedron et al. 1999; Heath and Dick, 2000).

  16. Measuring the Performance and Intelligence of Systems: Proceedings of the 2001 PerMIS Workshop

    DTIC Science & Technology

    2001-09-04

    1.1 Interval Mathematics for Analysis of Multiresolutional Systems V. Kreinovich, Univ. of Texas, R. Alo, Univ. of Houston-Downtown...the possible combinations. In non-deterministic real-time systems, the problem is compounded by the uncertainty in the execution times of various...multiresolutional, multiscale) in their essence because of the multiresolutional character of the meaning of words [Rieger, 01]. In integrating systems, the presence of a

  17. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    PubMed

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can reduce uncertainty and increase precision. Five source group classifications were used: three formed using k-means cluster analysis with 2, 3 and 4 clusters, and two a priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance.
This new software, by integrating recent methodological developments in tracer data processing, guides users through key steps. Critically, by applying multiple model configurations and uncertainty assessment, it delivers more robust solutions for informing catchment management of the sediment problem than many previously used approaches. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
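    The virtual-mixture test mentioned above can be sketched in a few lines: mix known source tracer signatures in known proportions, then check that the unmixing step recovers them. The tracer values and source groups below are hypothetical, and a brute-force search over the proportion simplex stands in for SIFT's actual solver:

```python
import numpy as np

# Hypothetical tracer signatures (rows: tracers, columns: source groups).
sources = np.array([
    [10.0, 40.0, 25.0],   # tracer 1 mean concentration per source
    [ 5.0,  8.0, 20.0],   # tracer 2
    [30.0, 12.0, 18.0],   # tracer 3
])
true_p = np.array([0.5, 0.3, 0.2])
mixture = sources @ true_p  # a virtual mixture with known proportions

def unmix(sources, mixture, step=0.01):
    """Brute-force search over the proportion simplex for the best-fitting mix."""
    best, best_err = None, np.inf
    for p1 in np.arange(0.0, 1.0 + 1e-9, step):
        for p2 in np.arange(0.0, 1.0 - p1 + 1e-9, step):
            p = np.array([p1, p2, 1.0 - p1 - p2])
            err = np.sum((sources @ p - mixture) ** 2)
            if err < best_err:
                best, best_err = p, err
    return best

est = unmix(sources, mixture)
print("estimated proportions:", np.round(est, 2))
```

    A model configuration whose tracers cannot recover the known proportions of such mixtures is exactly the kind that the uncertainty testing rejects.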

  18. Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk

    Treesearch

    Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith

    2009-01-01

    Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model’s numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...

  19. Detailed Analysis of the Asteroid Pair (6070) Rheinland and (54827) 2001 NQ8

    NASA Astrophysics Data System (ADS)

    Vokrouhlický, David; Pravec, Petr; Ďurech, Josef; Hornoch, Kamil; Kušnirák, Peter; Galád, Adrián; Vraštil, Jan; Kučáková, Hana; Pollock, Joseph T.; Ortiz, Jose Luis; Morales, Nicolas; Gaftonyuk, Ninel M.; Pray, Donald P.; Krugly, Yurij N.; Inasaridze, Raguli Ya.; Ayvazian, Vova R.; Molotov, Igor E.; Colazo, Carlos A.

    2017-06-01

    The existence of asteroid pairs, two bodies on similar heliocentric orbits, reveals an ongoing process of rotational fission among asteroids. This newly found class of objects has not yet been studied in detail. Here we choose asteroids (6070) Rheinland and (54827) 2001 NQ8, the most suitable pair for an in-depth analysis. First, we use available optical photometry to determine their rotational states and convex shapes. The rotational pole of Rheinland lies very near the south ecliptic pole, with a latitude uncertainty of about 10°. There are two equivalent solutions for the pole of 2001 NQ8, either (72°, -49°) or (242°, -46°) (ecliptic longitude and latitude). In both cases, the longitude values have about 10° uncertainty and the latitude values have about 15° uncertainty (both 3σ uncertainties). The sidereal rotation period of 2001 NQ8 is 5.877186 ± 0.000002 hr. Second, we construct a precise numerical integrator to determine the past state vectors of the pair's components, namely their heliocentric positions and velocities, and the orientation of their spin vectors. Using this new tool, we investigate the origin of the (6070) Rheinland and (54827) 2001 NQ8 pair. We find a formal age solution of 16.34 ± 0.04 kyr. This includes effects of the most massive objects in the asteroid belt (Ceres, Pallas, and Vesta), but unaccounted gravitational perturbations from other asteroids may imply that the realistic age uncertainty is slightly larger than its formal value. Analyzing results from our numerical simulations extending 250 kyr into the past, we argue against the possibility of an older age for this pair. The initial spin vectors of the two asteroids, at the moment of their separation, were not collinear, but tilted by 38° ± 12°.
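    The backward integration of past state vectors can be illustrated with a toy two-body sketch. This is not the authors' integrator, which includes planetary and massive-asteroid perturbations; it only shows the reversibility property that makes recovering past positions and velocities possible. The orbit parameters are illustrative (a circular orbit near Rheinland's ~2.39 AU semimajor axis):

```python
import math

# Toy heliocentric two-body integration (units: AU, yr, GM_sun = 4*pi^2).
GM = 4.0 * math.pi ** 2

def accel(r):
    d = math.hypot(r[0], r[1])
    a = -GM / d ** 3
    return (a * r[0], a * r[1])

def leapfrog(r, v, dt, steps):
    """Symplectic kick-drift-kick leapfrog; a negative dt integrates backward."""
    ax, ay = accel(r)
    for _ in range(steps):
        vx = v[0] + 0.5 * dt * ax
        vy = v[1] + 0.5 * dt * ay
        r = (r[0] + dt * vx, r[1] + dt * vy)
        ax, ay = accel(r)
        v = (vx + 0.5 * dt * ax, vy + 0.5 * dt * ay)
    return r, v

# Circular orbit at 2.39 AU (roughly Rheinland's semimajor axis).
a0 = 2.39
r0 = (a0, 0.0)
v0 = (0.0, math.sqrt(GM / a0))
r1, v1 = leapfrog(r0, v0, dt=0.001, steps=5000)    # forward 5 yr
r2, v2 = leapfrog(r1, v1, dt=-0.001, steps=5000)   # backward to the start
err = math.hypot(r2[0] - r0[0], r2[1] - r0[1])
print(f"position recovered to {err:.2e} AU")
```

    In the real problem, the dominant age uncertainty comes not from the integrator but from imperfectly known perturbing masses, which is why the realistic error bar exceeds the formal 0.04 kyr.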

  20. Integral-moment analysis of the BATSE gamma-ray burst intensity distribution

    NASA Technical Reports Server (NTRS)

    Horack, John M.; Emslie, A. Gordon

    1994-01-01

    We have applied the technique of integral-moment analysis to the intensity distribution of the first 260 gamma-ray bursts observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. This technique provides direct measurement of properties such as the mean, variance, and skewness of the convolved luminosity-number density distribution, as well as associated uncertainties. Using this method, one obtains insight into the nature of the source distributions unavailable through computation of traditional single parameters such as V/V(sub max). If the luminosity function of the gamma-ray bursts is strongly peaked, giving bursts only a narrow range of luminosities, these results are then direct probes of the radial distribution of sources, regardless of whether the bursts are a local phenomenon, are distributed in a galactic halo, or are at cosmological distances. Accordingly, an integral-moment analysis of the intensity distribution of the gamma-ray bursts provides the most complete analytic description of the source distribution available from the data, and offers the most comprehensive test of the compatibility of a given hypothesized distribution with observation.
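    The moments in question are just the low-order moments of the observed intensity sample. The sketch below computes them for hypothetical peak intensities drawn from a homogeneous Euclidean distribution, N(>P) ∝ P^(-3/2) above a detection threshold; none of the numbers are BATSE data:

```python
import random

# Hypothetical burst peak intensities from a homogeneous (Euclidean)
# distribution, N(>P) ∝ P^(-3/2), truncated at threshold p_min.
random.seed(1)
p_min = 1.0
# Inverse-CDF sampling: P = p_min * u^(-2/3) for u uniform on (0, 1).
intensities = [p_min * random.random() ** (-2.0 / 3.0) for _ in range(260)]

def moments(xs):
    """Sample mean, variance, and skewness (the integral moments)."""
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    skew = sum((x - mean) ** 3 for x in xs) / n / var ** 1.5
    return mean, var, skew

mean, var, skew = moments(intensities)
print(f"mean {mean:.2f}, variance {var:.2f}, skewness {skew:.2f}")
```

    Comparing such sample moments (with their uncertainties) against the moments predicted by a hypothesized radial distribution is the compatibility test the record describes; the heavy power-law tail makes the higher moments strongly positive and sensitive to the assumed distribution.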

  1. Flat-plate solar array project. Volume 8: Project analysis and integration

    NASA Technical Reports Server (NTRS)

    Mcguire, P.; Henry, P.

    1986-01-01

    Project Analysis and Integration (PA&I) performed planning and integration activities to support management of the various Flat-Plate Solar Array (FSA) Project R&D activities. Technical and economic goals were established by PA&I for each R&D task within the project to coordinate the thrust toward the National Photovoltaic Program goals. A sophisticated computer modeling capability was developed to assess technical progress toward meeting the economic goals. These models included a manufacturing facility simulation, a photovoltaic power station simulation and a decision aid model incorporating uncertainty. This family of analysis tools was used to track the progress of the technology and to explore the effects of alternative technical paths. Numerous studies conducted by PA&I signaled the achievement of milestones or were the foundation of major FSA project and national program decisions. The most important PA&I activities during the project history are summarized. The PA&I planning function and its relation to project direction are discussed, and important analytical models developed by PA&I for its assessment activities are reviewed.

  2. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    NASA Astrophysics Data System (ADS)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

    A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe, groundwater quantity, and in particular quality, have come under severe degradation, and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European bloc of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated, and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.

  3. Evaluating Snow Data Assimilation Framework for Streamflow Forecasting Applications Using Hindcast Verification

    NASA Astrophysics Data System (ADS)

    Barik, M. G.; Hogue, T. S.; Franz, K. J.; He, M.

    2012-12-01

    Snow water equivalent (SWE) estimation is a key factor in producing reliable streamflow simulations and forecasts in snow-dominated areas. However, measuring or predicting SWE has significant uncertainty. Sequential data assimilation, which updates states using both observed and modeled data based on error estimation, has been shown to reduce streamflow simulation errors but has had limited testing for forecasting applications. In the current study, a snow data assimilation framework integrated with the National Weather Service River Forecast System (NWSRFS) is evaluated for use in ensemble streamflow prediction (ESP). Seasonal water supply ESP hindcasts are generated for the North Fork of the American River Basin (NFARB) in northern California. Parameter sets from the California Nevada River Forecast Center (CNRFC), the Differential Evolution Adaptive Metropolis (DREAM) algorithm and the Multistep Automated Calibration Scheme (MACS) are tested both with and without sequential data assimilation. The traditional ESP method considers uncertainty in future climate conditions using historical temperature and precipitation time series to generate future streamflow scenarios conditioned on the current basin state. We include data uncertainty analysis in the forecasting framework through the DREAM-based parameter set, which is part of a recently developed Integrated Uncertainty and Ensemble-based data Assimilation framework (ICEA). Extensive verification of all tested approaches is undertaken using traditional forecast verification measures, including root mean square error (RMSE), Nash-Sutcliffe efficiency coefficient (NSE), volumetric bias, joint distribution, rank probability score (RPS), and discrimination and reliability plots. In comparison to the RFC parameters, the DREAM and MACS sets show significant improvement in volumetric bias in flow.
Use of assimilation improves hindcasts of higher flows but does not significantly improve performance in the mid flow and low flow categories.
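    Three of the verification measures named in this record (RMSE, NSE, volumetric bias) reduce to short formulas. The observed and hindcast flow volumes below are hypothetical, used only to show the computations:

```python
import math

# Hypothetical observed and hindcast seasonal flow volumes (arbitrary units).
obs = [120.0, 95.0, 150.0, 80.0, 110.0, 130.0]
sim = [110.0, 100.0, 160.0, 70.0, 115.0, 125.0]

def rmse(obs, sim):
    """Root mean square error of the hindcasts."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, 0 is no better than the obs mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def volumetric_bias(obs, sim):
    """Relative bias in total volume (positive means over-forecast)."""
    return (sum(sim) - sum(obs)) / sum(obs)

print(f"RMSE {rmse(obs, sim):.1f}, NSE {nse(obs, sim):.2f}, "
      f"bias {100 * volumetric_bias(obs, sim):+.1f}%")
```

    Scores like RPS and the discrimination/reliability diagnostics additionally require the full forecast ensemble rather than a single trace, which is why hindcast verification is run over the whole ESP ensemble.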

  4. Investigating the Nexus of Climate, Energy, Water, and Land at Decision-Relevant Scales: The Platform for Regional Integrated Modeling and Analysis (PRIMA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraucunas, Ian P.; Clarke, Leon E.; Dirks, James A.

    2015-04-01

    The Platform for Regional Integrated Modeling and Analysis (PRIMA) is an innovative modeling system developed at Pacific Northwest National Laboratory (PNNL) to simulate interactions among natural and human systems at scales relevant to regional decision making. PRIMA brings together state-of-the-art models of regional climate, hydrology, agriculture, socioeconomics, and energy systems using a flexible coupling approach. The platform can be customized to inform a variety of complex questions and decisions, such as the integrated evaluation of mitigation and adaptation options across a range of sectors. Research into stakeholder decision support needs underpins the platform's application to regional issues, including uncertainty characterization. Ongoing numerical experiments are yielding new insights into the interactions among human and natural systems on regional scales, with an initial focus on the energy-land-water nexus in the upper U.S. Midwest. This paper focuses on PRIMA's functional capabilities and describes some lessons learned to date about integrated regional modeling.

  5. Reducing the Risk of Human Space Missions with INTEGRITY

    NASA Technical Reports Server (NTRS)

    Jones, Harry W.; Dillon-Merrill, Robin L.; Tri, Terry O.; Henninger, Donald L.

    2003-01-01

    The INTEGRITY Program will design and operate a test bed facility to help prepare for future beyond-LEO missions. The purpose of INTEGRITY is to enable future missions by developing, testing, and demonstrating advanced human space systems. INTEGRITY will also implement and validate advanced management techniques including risk analysis and mitigation. One important way INTEGRITY will help enable future missions is by reducing their risk. A risk analysis of human space missions is important in defining the steps that INTEGRITY should take to mitigate risk. This paper describes how a Probabilistic Risk Assessment (PRA) of human space missions will help support the planning and development of INTEGRITY to maximize its benefits to future missions. PRA is a systematic methodology that decomposes the system into subsystems and components and quantifies the failure risk as a function of the design elements and their corresponding probabilities of failure. PRA provides a quantitative estimate of the probability of failure of the system, including an assessment and display of the degree of uncertainty surrounding the probability. PRA provides a basis for understanding the impacts of decisions that affect safety, reliability, performance, and cost. Risks with both high probability and high impact are identified as top priorities. The PRA of human missions beyond Earth orbit will help indicate how the risk of future human space missions can be reduced by integrating and testing systems in INTEGRITY.

  6. Flood risk analysis and adaptive strategy in context of uncertainties: a case study of Nhieu Loc Thi Nghe Basin, Ho Chi Minh City

    NASA Astrophysics Data System (ADS)

    Ho, Long-Phi; Chau, Nguyen-Xuan-Quang; Nguyen, Hong-Quan

    2013-04-01

    The Nhieu Loc - Thi Nghe basin is the most important administrative and business area of Ho Chi Minh City. Owing to the complexity of the basin system, including the increasing trend of rainfall intensity, (tidal) water levels and land subsidence, simulations of hydrological and hydraulic variables for flood prediction have proven inadequate in practical projects. The basin remains highly vulnerable despite multi-million-USD investments in urban drainage improvement projects over the last decade. In this paper, an integrated system analysis in both spatial and temporal aspects, based on statistical, GIS and modelling approaches, has been conducted in order to: (1) analyse risks before and after the projects, (2) foresee water-related risk under uncertainties in unfavourable driving factors and (3) develop a sustainable flood risk management strategy for the basin. The results show that, within the proposed framework of risk analysis and adaptive strategy, certain urban development plans in the basin must be carefully revised and/or checked in order to reduce highly unexpected losses in the future.

  7. Evaluation of UK Integrated Care Pilots: research protocol

    PubMed Central

    Ling, Tom; Bardsley, Martin; Adams, John; Lewis, Richard; Roland, Martin

    2010-01-01

    Background In response to concerns that the needs of the aging population for well-integrated care were increasing, the English National Health Service (NHS) appointed 16 Integrated Care Pilots following a national competition. The pilots have a range of aims including development of new organisational structures to support integration, changes in staff roles, reducing unscheduled emergency hospital admissions, reduced length of hospital stay, increasing patient satisfaction, and reducing cost. This paper describes the evaluation of the initiative which has been commissioned. Study design and data collection methods A mixed methods approach has been adopted including interviews with staff and patients, non-participant observation of meetings, structured written feedback from sites, questionnaires to patients and staff, and analysis of routinely collected hospital utilisation data for patients/service users. The qualitative analysis aims to identify the approaches taken to integration by the sites, the benefits which result, the context in which benefits have resulted, and the mechanisms by which they occur. Methods of analysis The quantitative analysis adopts a ‘difference in differences’ approach comparing health care utilisation before and after the intervention with risk-matched controls. The qualitative data analysis adopts a ‘theory of change’ approach in which we triangulate data from the quantitative analysis with qualitative data in order to describe causal effects (what happens when an independent variable changes) and causal mechanisms (what connects causes to their effects). An economic analysis will identify what incremental resources are required to make integration succeed and how they can be combined efficiently to produce better outcomes for patients. 
Conclusion This evaluation will produce a portfolio of evidence aimed at strengthening the evidence base for integrated care, and in particular identifying the context in which interventions are likely to be effective. These data will support a series of evaluation judgements aimed at reducing uncertainties about the role of integrated care in improving the efficient and effective delivery of healthcare. PMID:20922068

  8. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; and (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique based on coupling the Particle Swarm and Levenberg-Marquardt optimization methods, which has proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection.
The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
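The Latin-Hypercube sampling that underlies the Monte Carlo mode described above follows the usual stratified scheme; a generic sketch (not MADS code, with hypothetical parameter bounds) is:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Stratified LHS: exactly one sample per equal-probability stratum
    in each dimension, with strata independently permuted per dimension."""
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, float)              # shape (n_dims, 2): [low, high]
    n_dims = bounds.shape[0]
    # jittered points, one per stratum, in [0, 1)
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):                         # decorrelate the dimensions
        u[:, j] = u[rng.permutation(n_samples), j]
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

# Hypothetical parameters: log10 hydraulic conductivity and porosity.
samples = latin_hypercube(100, bounds=[[-2.0, 2.0], [0.05, 0.35]], rng=42)
```

Compared with plain random sampling, each marginal is guaranteed to be covered evenly, which is why LHS variants are favored for expensive transport simulations.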

  9. Modeling Sea-Level Change using Errors-in-Variables Integrated Gaussian Processes

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    We perform Bayesian inference on historical and late Holocene (last 2000 years) rates of sea-level change. The data that form the input to our model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. To accurately estimate rates of sea-level change and reliably compare tide-gauge compilations with proxy reconstructions it is necessary to account for the uncertainties that characterize each dataset. Many previous studies used simple linear regression models (most commonly polynomial regression) resulting in overly precise rate estimates. The model we propose uses an integrated Gaussian process approach, where a Gaussian process prior is placed on the rate of sea-level change and the data itself is modeled as the integral of this rate process. The non-parametric Gaussian process model is known to be well suited to modeling time series data. The advantage of using an integrated Gaussian process is that it allows for the direct estimation of the derivative of a one dimensional curve. The derivative at a particular time point will be representative of the rate of sea level change at that time point. The tide gauge and proxy data are complicated by multiple sources of uncertainty, some of which arise as part of the data collection exercise. Most notably, the proxy reconstructions include temporal uncertainty from dating of the sediment core using techniques such as radiocarbon. As a result of this, the integrated Gaussian process model is set in an errors-in-variables (EIV) framework so as to take account of this temporal uncertainty. The data must be corrected for land-level change known as glacio-isostatic adjustment (GIA) as it is important to isolate the climate-related sea-level signal. The correction for GIA introduces covariance between individual age and sea level observations into the model. 
The proposed integrated Gaussian process model allows for the estimation of instantaneous rates of sea-level change and accounts for all available sources of uncertainty in tide-gauge and proxy-reconstruction data. Our response variable is sea level after correction for GIA. By embedding the integrated process in an errors-in-variables (EIV) framework, and removing the estimate of GIA, we can quantify rates with better estimates of uncertainty than previously possible. The model provides a flexible fit and enables us to estimate rates of change at any given time point, thus observing how rates have been evolving from the past to present day.
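As an illustration of the integrated-GP idea only (a linear-Gaussian sketch on synthetic data, without the EIV temporal-uncertainty and GIA components of the actual model), one can place a GP prior on the rate on a time grid, treat sea level as its trapezoid-rule integral, and condition analytically:

```python
import numpy as np

# Time grid (kyr) on which the rate process is represented.
t = np.linspace(0.0, 2.0, 101)
dt = t[1] - t[0]

# Squared-exponential GP prior on the rate r(t).
ell, sig = 0.5, 1.0
K = sig**2 * np.exp(-0.5 * (t[:, None] - t[None, :])**2 / ell**2)

# Sea level is the integral of the rate: y = A r + noise,
# with A a lower-triangular trapezoid-rule integration matrix.
A = np.tril(np.full((len(t), len(t)), dt))
A[:, 0] = dt / 2
np.fill_diagonal(A, dt / 2)
A[0, :] = 0.0                               # integral is zero at t[0]

rng = np.random.default_rng(0)
r_true = np.sin(np.pi * t)                  # synthetic "true" rate
y = A @ r_true + rng.normal(0, 0.05, len(t))  # noisy integrated observations

# Posterior mean of the rate under the linear-Gaussian model.
noise = 0.05**2 * np.eye(len(t))
r_post = K @ A.T @ np.linalg.solve(A @ K @ A.T + noise, y)
```

Because the observation operator A is linear, the posterior over the rate stays Gaussian, so instantaneous rates come with full uncertainty for free; the EIV extension in the paper handles the case where the time stamps themselves are uncertain.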

  10. Optimal integrated abundances for chemical tagging of extragalactic globular clusters

    NASA Astrophysics Data System (ADS)

    Sakari, Charli M.; Venn, Kim; Shetrone, Matthew; Dotter, Aaron; Mackey, Dougal

    2014-09-01

    High-resolution integrated light (IL) spectroscopy provides detailed abundances of distant globular clusters whose stars cannot be resolved. Abundance comparisons with other systems (e.g. for chemical tagging) require understanding the systematic offsets that can occur between clusters, such as those due to uncertainties in the underlying stellar population. This paper analyses high-resolution IL spectra of the Galactic globular clusters 47 Tuc, M3, M13, NGC 7006, and M15 to (1) quantify potential systematic uncertainties in Fe, Ca, Ti, Ni, Ba, and Eu and (2) identify the most stable abundance ratios that will be useful in future analyses of unresolved targets. When stellar populations are well modelled, uncertainties are ˜0.1-0.2 dex based on sensitivities to the atmospheric parameters alone; in the worst-case scenarios, uncertainties can rise to 0.2-0.4 dex. The [Ca I/Fe I] ratio is identified as the optimal integrated [α/Fe] indicator (with offsets ≲ 0.1 dex), while [Ni I/Fe I] is also extremely stable to within ≲ 0.1 dex. The [Ba II/Eu II] ratios are also stable when the underlying populations are well modelled and may also be useful for chemical tagging.

  11. Hard and Soft Constraints in Reliability-Based Design Optimization

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2006-01-01

    This paper proposes a framework for the analysis and design optimization of models subject to parametric uncertainty where design requirements in the form of inequality constraints are present. Emphasis is given to uncertainty models prescribed by norm bounded perturbations from a nominal parameter value and by sets of componentwise bounded uncertain variables. These models, which often arise in engineering problems, allow for a sharp mathematical manipulation. Constraints can be implemented in the hard sense, i.e., constraints must be satisfied for all parameter realizations in the uncertainty model, and in the soft sense, i.e., constraints can be violated by some realizations of the uncertain parameter. In regard to hard constraints, this methodology allows (i) to determine if a hard constraint can be satisfied for a given uncertainty model and constraint structure, (ii) to generate conclusive, formally verifiable reliability assessments that allow for unprejudiced comparisons of competing design alternatives and (iii) to identify the critical combination of uncertain parameters leading to constraint violations. In regard to soft constraints, the methodology allows the designer (i) to use probabilistic uncertainty models, (ii) to calculate upper bounds to the probability of constraint violation, and (iii) to efficiently estimate failure probabilities via a hybrid method. This method integrates the upper bounds, for which closed form expressions are derived, along with conditional sampling. In addition, an l(sub infinity) formulation for the efficient manipulation of hyper-rectangular sets is also proposed.
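To make the hard/soft distinction concrete, a small sketch (with a hypothetical linear requirement and a hyper-rectangular uncertainty set, not the paper's formulation) checks the worst-case vertex for the hard sense and Monte Carlo-estimates the violation probability for the soft sense:

```python
import numpy as np

# Hypothetical linear requirement g(p) = a @ p - b <= 0 over a
# hyper-rectangular uncertainty set p_i in [lo_i, hi_i].
a = np.array([2.0, -1.0, 0.5])
b = 1.5
lo = np.array([-0.5, -0.5, -0.5])
hi = np.array([0.5, 0.5, 0.5])

# Hard constraint: the worst case of a linear function over a box is at
# a vertex, picked coordinate-wise by the sign of each coefficient.
worst = a @ np.where(a > 0, hi, lo) - b
hard_ok = worst <= 0.0

# Soft constraint: Monte Carlo estimate of the violation probability
# under a uniform distribution on the same box.
rng = np.random.default_rng(1)
p = rng.uniform(lo, hi, size=(100_000, 3))
viol_prob = float(np.mean(p @ a - b > 0.0))
```

Here the hard requirement fails (the worst vertex violates the constraint) while the soft view shows the violation set is a tiny corner of the box, which is exactly the trade-off the paper's hybrid bounds-plus-conditional-sampling method exploits.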

  12. A Bayesian Network Based Global Sensitivity Analysis Method for Identifying Dominant Processes in a Multi-physics Model

    NASA Astrophysics Data System (ADS)

    Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.

    2016-12-01

    Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using a Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was applied to a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components.
The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
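The variance-based first-order (Sobol) index that this framework extends can be sketched with the standard pick-freeze estimator (a generic illustration on a toy additive model, not the paper's Bayesian-network implementation):

```python
import numpy as np

def first_order_sobol(model, n_dims, n_samples=2**14, rng=None):
    """First-order Sobol indices S_i = V(E[Y|X_i]) / V(Y) via the
    pick-freeze estimator with independent uniform(0,1) inputs."""
    rng = np.random.default_rng(rng)
    A = rng.random((n_samples, n_dims))
    B = rng.random((n_samples, n_dims))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))
    S = np.empty(n_dims)
    for i in range(n_dims):
        ABi = A.copy()
        ABi[:, i] = B[:, i]          # replace only column i ("freeze" the rest)
        yABi = model(ABi)
        S[i] = np.mean(yB * (yABi - yA)) / var_y
    return S

# Toy model Y = X1 + 2*X2 (X3 inert): exact indices are 0.2, 0.8, 0.0.
model = lambda x: x[:, 0] + 2.0 * x[:, 1]
S = first_order_sobol(model, n_dims=3, rng=0)
```

Grouping uncertainty components, as the paper does, amounts to replacing whole blocks of columns at once instead of a single column i.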

  13. Short-Term Load Forecasting Error Distributions and Implications for Renewable Integration Studies: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodge, B. M.; Lew, D.; Milligan, M.

    2013-01-01

    Load forecasting in the day-ahead timescale is a critical aspect of power system operations that is used in the unit commitment process. It is also an important factor in renewable energy integration studies, where the combination of load and wind or solar forecasting techniques creates the net load uncertainty that must be managed by the economic dispatch process or with suitable reserves. An understanding of the load forecasting errors that may be expected in this process can lead to better decisions about the amount of reserves necessary to compensate for these errors. In this work, we performed a statistical analysis of the day-ahead (and two-day-ahead) load forecasting errors observed in two independent system operators for a one-year period. Comparisons were made with the normal distribution commonly assumed in power system operation simulations used for renewable power integration studies. Further analysis identified time periods when the load is more likely to be under- or overforecast.

  14. Collaboration in Complex Medical Systems

    NASA Technical Reports Server (NTRS)

    Xiao, Yan; Mackenzie, Colin F.

    1998-01-01

    Improving our understanding of collaborative work in complex environments has the potential for developing effective supporting technologies, personnel training paradigms, and design principles for multi-crew workplaces. Using a sophisticated audio-video-data acquisition system and a corresponding analysis system, the researchers at the University of Maryland have been able to study in detail team performance during real trauma patient resuscitation. The first study reported here was on coordination mechanisms and on characteristics of coordination breakdowns. One of the key findings was that implicit communications were an important coordination mechanism (e.g., through the use of shared workspace and event space). The second study was on the sources of uncertainty during resuscitation. Although incoming trauma patients' status is inherently uncertain, the findings suggest that much of the uncertainty felt by care providers was related to communication and coordination. These two studies demonstrate the value of and need for creating a real-life laboratory for studying team performance with the use of comprehensive and integrated data acquisition and analysis tools.

  15. Collective Thomson scattering data analysis for Wendelstein 7-X

    NASA Astrophysics Data System (ADS)

    Abramovic, I.; Pavone, A.; Svensson, J.; Moseev, D.; Salewski, M.; Laqua, H. P.; Lopes Cardozo, N. J.; Wolf, R. C.

    2017-08-01

    A collective Thomson scattering (CTS) diagnostic is being installed on the Wendelstein 7-X stellarator to measure the bulk ion temperature in the upcoming experimental campaign. In order to prepare for the data analysis, a forward model of the diagnostic (eCTS) has been developed and integrated into the Bayesian data analysis framework Minerva. Synthetic spectra have been calculated with the forward model and inverted using Minerva in order to demonstrate the feasibility of measuring the ion temperature in the presence of nuisance parameters that also influence CTS spectra. In this paper we report on the results of this analysis and discuss the main sources of uncertainty in the CTS data analysis.

  16. ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.

    2011-04-20

    While considerable advances have been made in accounting for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have generally been ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can be applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.
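The PCA summarization step can be sketched generically (with synthetic "effective-area" curves, not actual Chandra calibration products): decompose an ensemble of plausible curves by SVD, keep the components explaining most of the variance, and draw new plausible curves cheaply from the truncated basis.

```python
import numpy as np

# Hypothetical ensemble of plausible effective-area curves (one per row)
# on a common energy grid: a coherent amplitude mode plus white noise.
rng = np.random.default_rng(2)
energy = np.linspace(0.3, 8.0, 200)          # keV
nominal = np.exp(-0.5 * (np.log(energy) - 0.5)**2)
samples = nominal + 0.05 * rng.standard_normal((500, 1)) * nominal \
          + 0.02 * rng.standard_normal((500, 200))

# PCA by SVD of the mean-subtracted ensemble.
mean = samples.mean(axis=0)
U, s, Vt = np.linalg.svd(samples - mean, full_matrices=False)
explained = (s**2) / np.sum(s**2)
n_keep = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1  # 95% of variance

# Draw a new plausible curve from the truncated representation.
coeff = rng.standard_normal(n_keep) * s[:n_keep] / np.sqrt(len(samples) - 1)
new_curve = mean + coeff @ Vt[:n_keep]
```

In a spectral fit, such draws would replace the single nominal calibration file, so the calibration uncertainty propagates into the parameter posteriors.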

  17. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentations, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.

  18. Optimal design of groundwater remediation system using a probabilistic multi-objective fast harmony search algorithm under uncertainty

    NASA Astrophysics Data System (ADS)

    Luo, Qiankun; Wu, Jianfeng; Yang, Yun; Qian, Jiazhong; Wu, Jichun

    2014-11-01

    This study develops a new probabilistic multi-objective fast harmony search algorithm (PMOFHS) for optimal design of groundwater remediation systems under uncertainty associated with the hydraulic conductivity (K) of aquifers. The PMOFHS integrates the previously developed deterministic multi-objective optimization method, namely multi-objective fast harmony search algorithm (MOFHS) with a probabilistic sorting technique to search for Pareto-optimal solutions to multi-objective optimization problems in a noisy hydrogeological environment arising from insufficient K data. The PMOFHS is then coupled with the commonly used flow and transport codes, MODFLOW and MT3DMS, to identify the optimal design of groundwater remediation systems for a two-dimensional hypothetical test problem and a three-dimensional Indiana field application involving two objectives: (i) minimization of the total remediation cost through the engineering planning horizon, and (ii) minimization of the mass remaining in the aquifer at the end of the operational period, whereby the pump-and-treat (PAT) technology is used to clean up contaminated groundwater. Also, Monte Carlo (MC) analysis is employed to evaluate the effectiveness of the proposed methodology. Comprehensive analysis indicates that the proposed PMOFHS can find Pareto-optimal solutions with low variability and high reliability and is a potentially effective tool for optimizing multi-objective groundwater remediation problems under uncertainty.

  19. Bayesian focalization: quantifying source localization with environmental uncertainty.

    PubMed

    Dosso, Stan E; Wilmut, Michael J

    2007-05-01

    This paper applies a Bayesian formulation to study ocean acoustic source localization as a function of uncertainty in environmental properties (water column and seabed) and of data information content [signal-to-noise ratio (SNR) and number of frequencies]. The approach follows that of the optimum uncertain field processor [A. M. Richardson and L. W. Nolte, J. Acoust. Soc. Am. 89, 2280-2284 (1991)], in that localization uncertainty is quantified by joint marginal probability distributions for source range and depth integrated over uncertain environmental properties. The integration is carried out here using Metropolis Gibbs' sampling for environmental parameters and heat-bath Gibbs' sampling for source location to provide efficient sampling over complicated parameter spaces. The approach is applied to acoustic data from a shallow-water site in the Mediterranean Sea where previous geoacoustic studies have been carried out. It is found that reliable localization requires a sufficient combination of prior (environmental) information and data information. For example, sources can be localized reliably for single-frequency data at low SNR (-3 dB) only with small environmental uncertainties, whereas successful localization with large environmental uncertainties requires higher SNR and/or multifrequency data.

  20. A Novel Robust H∞ Filter Based on Krein Space Theory in the SINS/CNS Attitude Reference System.

    PubMed

    Yu, Fei; Lv, Chongyang; Dong, Qianhui

    2016-03-18

    Owing to their numerous merits, such as compactness, autonomy and independence, the strapdown inertial navigation system (SINS) and the celestial navigation system (CNS) can be used in marine applications. Moreover, because complementary navigation information is obtained from two different kinds of sensors, the accuracy of the SINS/CNS integrated navigation system can be enhanced effectively. Thus, the SINS/CNS system is widely used in the marine navigation field. However, the CNS is easily interfered with by its surroundings, which leads to discontinuous output. The resulting uncertainty caused by lost measurements reduces the system accuracy. In this paper, a robust H∞ filter based on Krein space theory is proposed. Krein space theory is introduced first, and then the linear state and observation models of the SINS/CNS integrated navigation system are established. Taking the measurement-loss uncertainty into account, a new robust H∞ filter is proposed to improve the robustness of the integrated system. Finally, this new robust filter based on Krein space theory is evaluated by numerical simulations and actual experiments. The simulation and experiment results show that the proposed robust filter effectively reduces attitude errors when measurements are intermittently missing. Compared to the traditional Kalman filter (KF) method, the accuracy of the SINS/CNS integrated system is improved, verifying the robustness and availability of the proposed robust H∞ filter.

  1. Determination of Uncertainties for the New SSME Model

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Hawk, Clark W.

    1996-01-01

    This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models, including the uncertainty analysis for the conservation of mass and energy balance relations. A new methodology for the assessment of the uncertainty associated with linear regressions is presented.

  2. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    PubMed

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
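The OR/AND gate arithmetic combined with Monte Carlo sampling over uncertain event probabilities can be sketched as follows (a toy two-gate tree with hypothetical Beta distributions, not the Swedish system model):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

# Hypothetical basic-event probabilities with Beta uncertainty:
# quantity failure if the source fails OR both treatment lines fail.
p_source = rng.beta(2, 98, n)    # raw-water source unavailable
p_line_a = rng.beta(1, 49, n)    # treatment line A down
p_line_b = rng.beta(1, 49, n)    # treatment line B down

# AND gate: product of probabilities; OR gate: 1 - prod(1 - p),
# both assuming independent basic events.
p_treatment = p_line_a * p_line_b
p_top = 1.0 - (1.0 - p_source) * (1.0 - p_treatment)

mean_risk = float(p_top.mean())
p95_risk = float(np.quantile(p_top, 0.95))
```

The distribution of `p_top`, rather than a single point value, is what supports comparison against acceptable-risk targets; converting it to Customer Minutes Lost additionally requires downtime and customer-count estimates per failure mode.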

  3. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    NASA Astrophysics Data System (ADS)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project "Towards a Good Ecological Status in the river Zenne (GESZ)" was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties on the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and on the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, first, we identified independent rainfall periods, based on the daily precipitation and stream flow observations and using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter for each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, first, we screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique. 
Subsequently, we considered only the most sensitive parameters for parameter optimization and UA. To explicitly account for stream flow uncertainty, we assumed that the stream flow measurement error increases linearly with the stream flow value. To assess the uncertainty and infer posterior distributions of the parameters, we used a Markov chain Monte Carlo (MCMC) sampler, the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, which samples from an archive of past states to generate candidate points in each individual chain. It is shown that the marginal posterior distributions of the rainfall multipliers vary widely between individual events, as a consequence of rainfall measurement errors and the spatial variability of the rain. Only a few of the rainfall events are well defined. The marginal posterior distributions of the SWAT model parameter values are well defined and identified by DREAM within their prior ranges. The posterior distributions of the output uncertainty parameter values also show that the stream flow data are highly uncertain. The approach of using rainfall multipliers to treat rainfall uncertainty for a complex model has an impact on the marginal posterior distributions of the model parameters and on the model results.
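    As a sketch of the multiplier-as-latent-parameter idea, the toy Python below replaces SWAT with a one-parameter linear runoff model and DREAM with a plain random-walk Metropolis sampler; the data, priors, and heteroscedastic error model are all invented for illustration:

    ```python
    import math, random

    random.seed(1)

    # Toy "model": simulated runoff is linear in the (corrected) rainfall.
    # The multiplier m corrupts the rainfall input multiplicatively.
    P_obs = [10.0, 20.0, 15.0]   # observed rainfall for one independent event
    Q_obs = [6.2, 12.1, 9.3]     # observed stream flow

    def log_post(k, m):
        # Uniform priors: k in (0, 2), m in (0.5, 1.5)
        if not (0.0 < k < 2.0 and 0.5 < m < 1.5):
            return -math.inf
        lp = 0.0
        for p, q in zip(P_obs, Q_obs):
            mu = k * m * p
            # Measurement error grows linearly with flow, as assumed above
            sigma = 0.1 * q + 0.1
            lp += -0.5 * ((q - mu) / sigma) ** 2 - math.log(sigma)
        return lp

    def metropolis(n=5000, k=1.0, m=1.0):
        samples, lp = [], log_post(k, m)
        for _ in range(n):
            k2, m2 = k + random.gauss(0, 0.05), m + random.gauss(0, 0.05)
            lp2 = log_post(k2, m2)
            if math.log(random.random()) < lp2 - lp:
                k, m, lp = k2, m2, lp2
            samples.append((k, m))
        return samples

    samples = metropolis()
    k_mean = sum(s[0] for s in samples) / len(samples)
    m_mean = sum(s[1] for s in samples) / len(samples)
    ```

    Because the multiplier m and the runoff coefficient k enter only through their product, the sampler pins down k·m well while leaving the individual multiplier poorly defined, mirroring the weakly constrained multipliers reported above.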

  4. A Survey of Xenon Ion Sputter Yield Data and Fits Relevant to Electric Propulsion Spacecraft Integration

    NASA Technical Reports Server (NTRS)

    Yim, John T.

    2017-01-01

    A survey of low energy xenon ion impact sputter yields was conducted to provide a more coherent baseline set of sputter yield data and accompanying fits for electric propulsion integration. Data uncertainties are discussed and different available curve fit formulas are assessed for their general suitability. A Bayesian parameter fitting approach is used with a Markov chain Monte Carlo method to provide estimates for the fitting parameters while characterizing the uncertainties for the resulting yield curves.
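    The Bayesian fitting idea can be illustrated without MCMC by tabulating a brute-force grid posterior; the yield data, threshold energy, and single-parameter fit form below are hypothetical stand-ins for the survey's data and fit formulas:

    ```python
    import math

    # Hypothetical low-energy xenon yield data (atoms/ion); illustrative only.
    E = [50.0, 100.0, 200.0, 300.0]   # ion energy, eV
    Y = [0.05, 0.18, 0.42, 0.60]      # measured sputter yield
    dY = [0.02, 0.04, 0.06, 0.08]     # reported 1-sigma uncertainties

    E_TH = 20.0  # assumed threshold energy, eV

    def model(q, e):
        # Simple threshold form Y = q*(sqrt(E) - sqrt(E_th)), standing in
        # for the curve-fit formulas compared in the survey.
        return q * (math.sqrt(e) - math.sqrt(E_TH)) if e > E_TH else 0.0

    # Brute-force posterior over q (flat prior, Gaussian likelihood)
    qs = [0.001 * i for i in range(1, 200)]
    logw = [sum(-0.5 * ((y - model(q, e)) / s) ** 2
                for e, y, s in zip(E, Y, dY)) for q in qs]
    mx = max(logw)
    w = [math.exp(l - mx) for l in logw]
    Z = sum(w)
    q_mean = sum(q * wi for q, wi in zip(qs, w)) / Z
    q_var = sum((q - q_mean) ** 2 * wi for q, wi in zip(qs, w)) / Z
    ```

    The posterior mean and variance of the fit parameter directly characterize the uncertainty band on the resulting yield curve.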

  5. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    NASA Astrophysics Data System (ADS)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    There are many technical methods for integrating the various factors used in flood hazard mapping. The purpose of this study is to propose a methodology for integrated flood hazard mapping using MCDM (Multi-Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to flood risk assessment, maximum flood depth, maximum velocity, and maximum travel time are taken as the criteria, and each element unit is treated as an alternative. A scheme that finds the alternative closest to an ideal value is an appropriate way to assess the flood risk of a large number of element units (alternatives) based on various flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty because simulation results vary with the flood scenario and topographical conditions. This ambiguity in the indices can propagate into the flood hazard map. To account for the ambiguity and uncertainty of the criteria, fuzzy logic, which can handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas with the highest hazard grade in the resulting integrated flood hazard map and compared them with the areas indicated in existing flood risk maps. We also expect that applying the proposed methodology to the production of current flood risk maps would yield new flood hazard maps that account for the priorities of hazard areas and carry more varied and important information than before. 
Keywords: flood hazard map; levee breach analysis; 2D analysis; MCDM; fuzzy TOPSIS. Acknowledgement: This research was supported by a grant (17AWMP-B079625-04) from the Water Management Research Program funded by the Ministry of Land, Infrastructure and Transport of the Korean government.
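    A minimal crisp (non-fuzzy) TOPSIS pass over three hypothetical grid cells illustrates the ranking mechanics; fuzzy TOPSIS would replace the crisp criterion values with triangular fuzzy numbers but keeps the same ideal/anti-ideal structure:

    ```python
    import math

    # Hypothetical element units scored on the three criteria above:
    # (max depth [m], max velocity [m/s], max travel time [h]); depth and
    # velocity are "larger = more hazardous", travel time "longer = safer".
    cells = {
        "A": (2.1, 1.5, 0.5),
        "B": (0.4, 0.3, 3.0),
        "C": (1.2, 0.8, 1.2),
    }
    weights = (0.4, 0.3, 0.3)

    # Vector-normalize each criterion column, then apply the weights
    cols = list(zip(*cells.values()))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    scored = {k: [w * v / n for v, w, n in zip(vals, weights, norms)]
              for k, vals in cells.items()}

    # "Ideal" = most hazardous reference (max depth, max velocity, min time),
    # "anti-ideal" = least hazardous, so closeness ranks hazard directly
    ideal = [max(s[0] for s in scored.values()),
             max(s[1] for s in scored.values()),
             min(s[2] for s in scored.values())]
    anti = [min(s[0] for s in scored.values()),
            min(s[1] for s in scored.values()),
            max(s[2] for s in scored.values())]

    def closeness(v):
        d_pos = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, ideal)))
        d_neg = math.sqrt(sum((a - b) ** 2 for a, b in zip(v, anti)))
        return d_neg / (d_pos + d_neg)  # 1 = most hazardous, 0 = least

    hazard = {k: closeness(v) for k, v in scored.items()}
    ```

    Cell A (deep, fast, little warning time) scores highest; B scores lowest, matching intuition about the criteria.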

  6. Double-pulse 2-μm integrated path differential absorption lidar airborne validation for atmospheric carbon dioxide measurement.

    PubMed

    Refaat, Tamer F; Singh, Upendra N; Yu, Jirong; Petros, Mulugeta; Remus, Ruben; Ismail, Syed

    2016-05-20

    Field experiments were conducted to test and evaluate the initial atmospheric carbon dioxide (CO2) measurement capability of an airborne, high-energy, double-pulsed, 2-μm integrated path differential absorption (IPDA) lidar. This IPDA was designed, integrated, and operated at the NASA Langley Research Center on board the NASA B-200 aircraft. The IPDA was tuned to the CO2 strong absorption line at 2050.9670 nm, which is optimal for lower-tropospheric weighted column measurements. Flights were conducted over land and ocean under different conditions. The first validation experiments of the IPDA for atmospheric CO2 remote sensing, focusing on low-reflectivity oceanic surface returns during full daytime background conditions, are presented. In these experiments, the IPDA measurements were validated by comparison to airborne flask air-sampling measurements conducted by the NOAA Earth System Research Laboratory. IPDA performance modeling was conducted to evaluate measurement sensitivity and bias errors. The IPDA signals and their variation with altitude compare well with predicted model results. In addition, off-off-line testing was conducted, with fixed instrument settings, to evaluate the IPDA systematic and random errors. Analysis shows an altitude-independent differential optical depth offset of 0.0769. Optical depth measurement uncertainty of 0.0918 compares well with the predicted value of 0.0761. The IPDA CO2 column measurement compares well with model-driven, near-simultaneous air-sampling measurements from the NOAA aircraft at different altitudes. With a 10-s shot average, a CO2 differential optical depth measurement of 1.0054±0.0103 was retrieved from a 6-km altitude with a 4-GHz on-line operation. As compared to the CO2 weighted-average column dry-air volume mixing ratio of 404.08 ppm derived from air sampling, the IPDA measurement resulted in a value of 405.22±4.15 ppm, with 1.02% uncertainty and 0.28% additional bias. 
Sensitivity analysis of environmental systematic errors correlates the additional bias to water vapor. IPDA ranging resulted in a measurement uncertainty of <3  m.
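    The core IPDA retrieval step can be sketched in a few lines: the differential absorption optical depth follows from the ratio of the energy-normalized on-line and off-line returns, and division by an integrated weighting function (IWF) gives the column-average mixing ratio. The signal values and the constant IWF below are invented to reproduce numbers of the same order as those reported above; a real retrieval computes the IWF from meteorological profiles and spectroscopy.

    ```python
    import math

    # Illustrative energy-normalized return signals (arbitrary units)
    P_on, P_off = 0.30, 2.24   # received on-line / off-line signals
    E_on, E_off = 1.00, 1.00   # transmitted pulse energies (normalization)

    # One-way differential absorption optical depth from the two-wavelength ratio
    daod = 0.5 * math.log((P_off / E_off) / (P_on / E_on))

    # Column-average dry-air mixing ratio: DAOD divided by the integrated
    # weighting function (assumed constant here, hypothetical value for ~6 km)
    IWF = 2.48e-3  # per ppm
    xco2_ppm = daod / IWF
    ```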

  7. Applied Mathematics in EM Studies with Special Emphasis on an Uncertainty Quantification and 3-D Integral Equation Modelling

    NASA Astrophysics Data System (ADS)

    Pankratov, Oleg; Kuvshinov, Alexey

    2016-01-01

    Despite impressive progress in the development and application of electromagnetic (EM) deterministic inverse schemes to map the 3-D distribution of electrical conductivity within the Earth, one question remains poorly addressed: uncertainty quantification of the recovered conductivity models. Apparently, only an inversion based on a statistical approach provides a systematic framework to quantify such uncertainties. The Metropolis-Hastings (M-H) algorithm is the most popular technique for sampling the posterior probability distribution that describes the solution of the statistical inverse problem. However, all statistical inverse schemes require an enormous number of forward simulations and thus appear extremely demanding computationally, if not prohibitive, when a 3-D setup is invoked. This urges the development of fast and scalable 3-D modelling codes which can run large-scale 3-D models of practical interest in fractions of a second on high-performance multi-core platforms. But even with these codes, the challenge for M-H methods is to construct proposal functions that simultaneously provide a good approximation of the target density function while being inexpensive to sample. In this paper we address both of these issues. First we introduce a variant of the M-H method which uses information about the local gradient and Hessian of the penalty function. This, in particular, allows us to exploit adjoint-based machinery that has been instrumental for the fast solution of deterministic inverse problems. We explain why this modification of M-H significantly accelerates sampling of the posterior probability distribution. In addition we show how Hessian handling (inverse, square root) can be made practicable by a low-rank approximation using the Lanczos algorithm. Ultimately we discuss uncertainty analysis based on stochastic inversion results. In addition, we demonstrate how this analysis can be performed within a deterministic approach. 
In the second part, we summarize modern trends in the development of efficient 3-D EM forward modelling schemes with special emphasis on recent advances in the integral equation approach.
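    The gradient-informed flavor of M-H can be illustrated with the Metropolis-adjusted Langevin algorithm (MALA) on a 1-D Gaussian target; the target, step size, and gradient here are invented stand-ins for the adjoint-supplied penalty-function gradient, and the Hessian handling described above is omitted:

    ```python
    import math, random

    random.seed(0)

    # Target: 1-D Gaussian posterior with an analytic log-density gradient,
    # standing in for the gradient an adjoint code would supply.
    MU, SIG = 3.0, 0.5

    def logp(x):
        return -0.5 * ((x - MU) / SIG) ** 2

    def grad_logp(x):
        return -(x - MU) / SIG ** 2

    def mala(n=20000, x=0.0, eps=0.05):
        # Propose a gradient-drifted step, then accept/reject so that the
        # target density is sampled exactly despite the biased proposal.
        out = []
        for _ in range(n):
            mean_fwd = x + 0.5 * eps * grad_logp(x)
            y = mean_fwd + math.sqrt(eps) * random.gauss(0, 1)
            mean_bwd = y + 0.5 * eps * grad_logp(y)
            log_q_fwd = -((y - mean_fwd) ** 2) / (2 * eps)
            log_q_bwd = -((x - mean_bwd) ** 2) / (2 * eps)
            if math.log(random.random()) < logp(y) - logp(x) + log_q_bwd - log_q_fwd:
                x = y
            out.append(x)
        return out

    chain = mala()
    mean_x = sum(chain) / len(chain)
    ```

    The drift term pushes proposals toward high-probability regions, which is why gradient information accelerates posterior sampling relative to a blind random walk.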

  8. Assessment of SFR Wire Wrap Simulation Uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delchini, Marc-Olivier G.; Popov, Emilian L.; Pointer, William David

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model's input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis of three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. 
Sections 5, 6 and 7 present numerical results for the 3-D pipe, the single pin THORS mesh, and the 7-pin bundle mesh, respectively.
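    A Dakota-style parametric study reduces to sampling the uncertain inputs and propagating each sample through the simulation; the sketch below uses a Latin hypercube sample and a cheap algebraic surrogate in place of the CFD run, with invented parameter ranges:

    ```python
    import random

    random.seed(42)

    # Latin hypercube sample of two uncertain inputs for a pipe-flow study;
    # ranges and the surrogate model are illustrative, not THORS values.
    N = 100

    def lhs(lo, hi):
        # One stratified sample per equal-probability interval, then shuffled
        pts = [lo + (hi - lo) * (i + random.random()) / N for i in range(N)]
        random.shuffle(pts)
        return pts

    rough = lhs(1e-5, 1e-4)   # wall roughness, m
    vel = lhs(0.5, 2.0)       # inlet velocity, m/s

    def pressure_drop(r, v):
        # Cheap surrogate standing in for the CFD run Dakota would launch
        return 1000.0 * v ** 2 * (0.02 + 50.0 * r)

    dp = [pressure_drop(r, v) for r, v in zip(rough, vel)]
    mean_dp = sum(dp) / N
    var_dp = sum((x - mean_dp) ** 2 for x in dp) / (N - 1)
    ```

    The output mean and variance are the propagated uncertainty estimates; in the actual workflow each `pressure_drop` call is a Nek5000 or STAR-CCM+ job driven by Dakota's interface scripts.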

  9. Predictions of space radiation fatality risk for exploration missions.

    PubMed

    Cucinotta, Francis A; To, Khiet; Cacao, Eliedonna

    2017-05-01

    In this paper we describe revisions to the NASA Space Cancer Risk (NSCR) model focusing on updates to the probability distribution functions (PDF) representing the uncertainties in the radiation quality factor (QF) model parameters and the dose and dose-rate reduction effectiveness factor (DDREF). We integrate recent heavy ion data on liver, colorectal, intestinal, lung, and Harderian gland tumors with other data from fission neutron experiments into the model analysis. In an earlier work we introduced distinct QFs for leukemia and solid cancer risk predictions, and here we consider liver cancer risks separately because of the higher RBEs reported in mouse experiments compared to other tumor types, and because of distinct liver cancer risk factors for astronauts compared to the U.S. population. The revised model is used to make predictions of fatal cancer and circulatory disease risks for 1-year deep space and International Space Station (ISS) missions, and a 940-day Mars mission. We analyzed the contribution of the various model parameter uncertainties to the overall uncertainty, which shows that the uncertainties in relative biological effectiveness (RBE) factors at high LET, due to statistical uncertainties and differences across tissue types and mouse strains, are the dominant uncertainty. NASA's exposure limits are approached or exceeded for each mission scenario considered. Two main conclusions are made: 1) Reducing the current estimate of about a 3-fold uncertainty to a 2-fold or lower uncertainty will require much more expansive animal carcinogenesis studies in order to reduce statistical uncertainties and understand tissue, sex and genetic variations. 2) Alternative model assumptions such as non-targeted effects, increased tumor lethality and decreased latency at high LET, and non-cancer mortality risks from circulatory diseases could significantly increase risk estimates to several times higher than the NASA limits. Copyright © 2017 The Committee on Space Research (COSPAR). 
Published by Elsevier Ltd. All rights reserved.

  10. First Reprocessing of Southern Hemisphere ADditional OZonesondes Profile Records: 3. Uncertainty in Ozone Profile and Total Column

    NASA Astrophysics Data System (ADS)

    Witte, Jacquelyn C.; Thompson, Anne M.; Smit, Herman G. J.; Vömel, Holger; Posny, Françoise; Stübi, Rene

    2018-03-01

    Reprocessed ozonesonde data from eight SHADOZ (Southern Hemisphere ADditional OZonesondes) sites have been used to derive the first analysis of uncertainty estimates for both profile and total column ozone (TCO). The ozone uncertainty is a composite of the uncertainties of the individual terms in the ozone partial pressure (PO3) equation, those being the ozone sensor current, background current, internal pump temperature, pump efficiency factors, conversion efficiency, and flow rate. Overall, PO3 uncertainties (ΔPO3) are within 15% and peak around the tropopause (15 ± 3 km) where ozone is a minimum and ΔPO3 approaches the measured signal. The uncertainty in the background and sensor currents dominates the overall ΔPO3 in the troposphere including the tropopause region, while the uncertainties in the conversion efficiency and flow rate dominate in the stratosphere. Seasonally, ΔPO3 is generally a maximum in March-May, with the exception of SHADOZ sites in Asia, for which the highest ΔPO3 occurs in September-February. As a first approach, we calculate sonde TCO uncertainty (ΔTCO) by integrating the profile ΔPO3 and adding the ozone residual uncertainty, derived from the McPeters and Labow (2012, doi:10.1029/2011JD017006) 1σ ozone mixing ratios. Overall, ΔTCO values are within ±15 Dobson units (DU), representing 5-6% of the TCO. Total Ozone Mapping Spectrometer and Ozone Monitoring Instrument (TOMS and OMI) satellite overpasses are generally within the sonde ΔTCO. However, there is a discontinuity between TOMS v8.6 (1998 to September 2004) and OMI (October 2004-2016) TCO on the order of 10 DU that accounts for the significant 16 DU overall difference observed between sonde and TOMS. By comparison, the sonde-OMI absolute difference for the eight stations is only 4 DU.
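    The composite-uncertainty construction can be sketched by adding the per-term relative uncertainties in quadrature at a level, then integrating a toy profile to a column uncertainty; only the list of terms comes from the abstract, all numerical values are illustrative:

    ```python
    import math

    # Per-term relative (1-sigma) uncertainties at one pressure level; the
    # terms are those in the PO3 equation above, the values are made up.
    terms = {
        "sensor_current": 0.03,
        "background_current": 0.05,
        "pump_temperature": 0.005,
        "pump_efficiency": 0.02,
        "conversion_efficiency": 0.05,
        "flow_rate": 0.03,
    }

    # Composite relative uncertainty: independent terms add in quadrature
    rel_dpo3 = math.sqrt(sum(u * u for u in terms.values()))

    # Toy column integration: (layer column in DU, layer uncertainty in DU)
    layers = [(8.0, 0.4), (25.0, 1.0), (180.0, 7.0), (40.0, 2.5)]
    tco = sum(c for c, _ in layers)
    dtco_profile = math.sqrt(sum(u * u for _, u in layers))
    residual_du = 4.0  # assumed above-burst ozone residual uncertainty
    dtco = math.sqrt(dtco_profile ** 2 + residual_du ** 2)
    ```

    With these illustrative inputs the composite relative uncertainty is about 8.5% and the column uncertainty stays within the ±15 DU envelope quoted above.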

  11. Methods and Issues for the Combined Use of Integral Experiments and Covariance Data: Results of a NEA International Collaborative Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palmiotti, Giuseppe; Salvatores, Massimo

    2014-04-01

    The Working Party on International Nuclear Data Evaluation Cooperation (WPEC) of the Nuclear Science Committee under the Nuclear Energy Agency (NEA/OECD) established a Subgroup (called "Subgroup 33") in 2009 on "Methods and issues for the combined use of integral experiments and covariance data." The first stage was devoted to producing the description of different adjustment methodologies and assessing their merits. A detailed document related to this first stage has been issued. Nine leading organizations (often with a long and recognized expertise in the field) have contributed: ANL, CEA, INL, IPPE, JAEA, JSI, NRG, IRSN and ORNL. In the second stage, a practical benchmark exercise was defined in order to test the reliability of the nuclear data adjustment methodology. A comparison of the results obtained by the participants and major lessons learned in the exercise are discussed in the present paper, which summarizes individual contributions that often include several original developments not reported separately. The paper provides the analysis of the most important results of the adjustment of the main nuclear data of 11 major isotopes in a 33-group energy structure. This benchmark exercise was based on a set of 20 well-defined integral parameters from 7 fast assembly experiments. The exercise showed that, using a common shared set of integral experiments but different starting evaluated libraries and/or different covariance matrices, there is a good convergence of trends for adjustments. Moreover, a significant reduction of the original uncertainties is often observed. Using the a posteriori covariance data, there is a strong reduction of the uncertainties of integral parameters for reference reactor designs, mainly due to the new correlations in the a posteriori covariance matrix. Furthermore, criteria have been proposed and applied to verify the consistency of differential and integral data used in the adjustment. 
Finally, recommendations are given for an appropriate use of sensitivity analysis methods and indications for future work are provided.
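    The adjustment machinery itself is a generalized-least-squares (GLS) update of the nuclear data given integral experiments. A two-parameter, two-experiment toy version (all numbers invented) shows both the central-value adjustment and the a posteriori uncertainty reduction:

    ```python
    # Toy GLS adjustment: 2 nuclear-data parameters, 2 integral experiments.
    M = [[0.04, 0.0], [0.0, 0.09]]   # prior covariance of the data (relative)
    S = [[1.0, 0.5], [0.2, 1.0]]     # sensitivities dC/dp of integral parameters
    V = [[0.01, 0.0], [0.0, 0.01]]   # covariance of the integral experiments
    d = [0.03, -0.02]                # E - C: measured minus calculated

    def mm(A, B):  # matrix product
        return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
                for row in A]

    def inv2(A):  # inverse of a 2x2 matrix
        (a, b), (c, e) = A
        det = a * e - b * c
        return [[e / det, -b / det], [-c / det, a / det]]

    St = [list(r) for r in zip(*S)]
    SM = mm(S, M)
    G = inv2([[x + y for x, y in zip(r1, r2)]
              for r1, r2 in zip(mm(SM, St), V)])
    K = mm(mm(M, St), G)                                  # gain matrix
    dp = [sum(K[i][j] * d[j] for j in range(2)) for i in range(2)]  # adjustment
    KSM = mm(K, SM)
    # Posterior covariance M' = M - K S M: the uncertainty reduction
    M_post = [[M[i][j] - KSM[i][j] for j in range(2)] for i in range(2)]
    ```

    The shrinking of the diagonal of `M_post` relative to `M`, and the new off-diagonal correlations it acquires, are exactly the "reduction of the original uncertainties" and "new correlations in the a posteriori covariance matrix" described above.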

  12. Managing uncertainty in flood protection planning with climate projections

    NASA Astrophysics Data System (ADS)

    Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel

    2018-04-01

    Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers, as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise a methodology to account for the uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. 
The recommended planning is robust to moderate changes in uncertainty as well as in trend. In contrast, planning without consideration of bias and dependencies in and between uncertainty components leads to strongly suboptimal planning recommendations.

  13. Evaluation of UK Integrated Care Pilots: research protocol.

    PubMed

    Ling, Tom; Bardsley, Martin; Adams, John; Lewis, Richard; Roland, Martin

    2010-09-27

    In response to concerns that the needs of the aging population for well-integrated care were increasing, the English National Health Service (NHS) appointed 16 Integrated Care Pilots following a national competition. The pilots have a range of aims, including the development of new organisational structures to support integration, changes in staff roles, reduced unscheduled emergency hospital admissions, reduced length of hospital stay, increased patient satisfaction, and reduced cost. This paper describes the commissioned evaluation of the initiative. A mixed-methods approach has been adopted, including interviews with staff and patients, non-participant observation of meetings, structured written feedback from sites, questionnaires to patients and staff, and analysis of routinely collected hospital utilisation data for patients/service users. The qualitative analysis aims to identify the approaches taken to integration by the sites, the benefits which result, the context in which benefits have resulted, and the mechanisms by which they occur. The quantitative analysis adopts a 'difference in differences' approach comparing health care utilisation before and after the intervention with risk-matched controls. The qualitative data analysis adopts a 'theory of change' approach in which we triangulate data from the quantitative analysis with qualitative data in order to describe causal effects (what happens when an independent variable changes) and causal mechanisms (what connects causes to their effects). An economic analysis will identify what incremental resources are required to make integration succeed and how they can be combined efficiently to produce better outcomes for patients. This evaluation will produce a portfolio of evidence aimed at strengthening the evidence base for integrated care, and in particular at identifying the contexts in which interventions are likely to be effective. 
These data will support a series of evaluation judgements aimed at reducing uncertainties about the role of integrated care in improving the efficient and effective delivery of healthcare.
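    The difference-in-differences estimator at the core of the quantitative analysis reduces to one line; the admission rates below are hypothetical:

    ```python
    # Hypothetical admissions per 1000 patients, before/after the pilot,
    # for the intervention group and its risk-matched control group.
    pilot_before, pilot_after = 120.0, 105.0
    ctrl_before, ctrl_after = 118.0, 112.0

    # Difference-in-differences: the change in the pilot group minus the
    # change the matched controls suggest would have happened anyway.
    did = (pilot_after - pilot_before) - (ctrl_after - ctrl_before)
    ```

    A negative estimate (here a reduction of 9 admissions per 1000 beyond the control trend) is the kind of effect the risk-matched comparison is designed to isolate.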

  14. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    NASA Astrophysics Data System (ADS)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis considers only parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate the reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is intertwined with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of the sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
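    The averaging idea can be sketched with a brute-force first-order sensitivity index computed under two alternative models and two scenarios, then weighted by model and scenario probabilities; the models, scenarios, and weights below are invented for illustration:

    ```python
    # Two alternative process models and two scenarios (e.g. soil temperature),
    # each with a prior weight; inputs x1, x2 ~ U(0, 1).
    def f_a(x1, x2, s):
        return (2.0 if s else 1.0) * x1 + 0.5 * x2   # model A

    def f_b(x1, x2, s):
        return 0.8 * x1 + 0.8 * x2                   # model B

    models = [(f_a, 0.6), (f_b, 0.4)]                # model weights
    scenarios = [(0, 0.5), (1, 0.5)]                 # scenario weights

    def first_order_s1(f, s, n=200):
        # Brute force on a grid: S1 = Var_x1( E_x2[f] ) / Var(f)
        xs = [(i + 0.5) / n for i in range(n)]
        cond = [sum(f(x1, x2, s) for x2 in xs) / n for x1 in xs]
        allv = [f(x1, x2, s) for x1 in xs for x2 in xs]
        mean = sum(allv) / len(allv)
        var_tot = sum((v - mean) ** 2 for v in allv) / len(allv)
        mc = sum(cond) / n
        var_cond = sum((c - mc) ** 2 for c in cond) / n
        return var_cond / var_tot

    # Average the index over model and scenario weights
    s1_avg = sum(wm * ws * first_order_s1(f, s)
                 for f, wm in models for s, ws in scenarios)
    ```

    For these toy functions the index of x1 ranges from 0.5 (model B) to about 0.94 (model A, scenario 1), so ranking parameters under a single model/scenario could mislead; the weighted average (about 0.72) is the model- and scenario-averaged importance.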

  15. Urban water supply infrastructure planning under predictive groundwater uncertainty: Bayesian updating and flexible design

    NASA Astrophysics Data System (ADS)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many urban water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply, driven by short-term climate variability and long-term climate change. These uncertainties are often exacerbated in groundwater-dependent water systems due to the extra difficulty in measuring groundwater storage, recharge, and sustainable yield. Groundwater models are typically under-parameterized due to the high data requirements for calibration and limited data availability, leading to uncertainty in the models' predictions. We develop an integrated approach to urban water supply planning that combines predictive groundwater uncertainty analysis with adaptive water supply planning using multi-stage decision analysis. This allows us to compare the value of collecting additional groundwater data and reducing predictive uncertainty with the value of using water infrastructure planning that is flexible, modular, and can react quickly in response to unexpected changes in groundwater availability. We apply this approach to a case from Riyadh, Saudi Arabia. Riyadh relies on fossil groundwater aquifers and desalination for urban use. The main fossil aquifers incur minimal recharge and face depletion as a result of intense withdrawals for urban and agricultural use. As the water table declines and pumping becomes uneconomical, Riyadh will have to build new supply infrastructure, decrease demand, or increase the efficiency of its distribution system. However, poor groundwater characterization has led to severe uncertainty in aquifer parameters such as hydraulic conductivity, and therefore severe uncertainty in how the water table will respond to pumping over time and when these transitions will be necessary: the potential depletion time varies from approximately five years to 100 years. 
This case is an excellent candidate for flexible planning both because of its severity and the potential for learning: additional information can be collected over time and flexible options exercised in response. Stochastic dynamic programming is used to find optimal policies for using flexibility under different information scenarios. The performance of each strategy is then assessed using a simulation model of Riyadh's water system.
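    The flexible-planning comparison can be reduced to a toy stochastic dynamic program: at each stage the planner either waits or pays to build new capacity, while the water table drops stochastically. The states, costs, and probabilities below are invented, not Riyadh data:

    ```python
    # Minimal stochastic dynamic program for flexible infrastructure planning
    T = 3                    # decision stages (e.g. decades)
    P_DROP = 0.6             # chance the water table drops one level per stage
    BUILD_COST = 10.0        # cost of building a plant (takes effect at once,
                             # a simplification of real lead times)
    SHORTAGE_COST = 50.0     # per-stage penalty for depletion with no plant

    def value(t, level, built):
        """Minimum expected cost-to-go from stage t (level 0 = depleted)."""
        if t == T:
            return 0.0
        def expected(b, stage_cost):
            drop = max(level - 1, 0)
            return (stage_cost
                    + P_DROP * value(t + 1, drop, b)
                    + (1 - P_DROP) * value(t + 1, level, b))
        shortage = SHORTAGE_COST if (level == 0 and not built) else 0.0
        wait = expected(built, shortage)
        if built:
            return wait
        return min(wait, expected(True, BUILD_COST))

    best = value(0, 2, False)   # start with a high water table, no plant
    ```

    Here the optimal expected cost (3.6) is well below the cost of building immediately (10.0): the value of deferring the decision until the water-table information arrives is exactly what the multi-stage analysis quantifies.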

  16. Uncertainty and sensitivity assessments of an agricultural-hydrological model (RZWQM2) using the GLUE method

    NASA Astrophysics Data System (ADS)

    Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin

    2016-03-01

    Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models because of the many uncertainties in their inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model output responses and the sensitivity of 25 parameters related to soil properties, nutrient transport and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, a combined likelihood (CL) function integrating information on water, nitrogen, and crop production was introduced in the GLUE analysis for the predictions of the following four model output responses: the total soil water content (T-SWC) and nitrate nitrogen (T-NIT) within the 1-m soil profile, and the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). For the evaluation of RZWQM2, measurements and meteorological data were obtained from a field experiment involving a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that the RZWQM2 model can simulate crop growth and water-nitrogen migration and transformation in a wheat-maize rotation system. The GLUE-based uncertainty analysis showed that T-NIT was sensitive to parameters related to the nitrification coefficient, maize growth characteristics during the seedling period, the wheat vernalization period, and the wheat photoperiod. Parameters describing the soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in the crop yield components. The prediction errors for the RZWQM2 outputs with the CL function were lower and more uniform than with likelihood functions composed of a single calibration criterion. 
This new and successful application of the GLUE method for determining the uncertainty and sensitivity of the RZWQM2 could provide a reference for the optimization of model parameters with different emphases according to research interests.
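    A stripped-down GLUE loop illustrates the procedure: Latin-hypercube-sample the parameters, score each run with an informal likelihood, keep only "behavioural" runs above a cut-off, and weight predictions by likelihood. The one-parameter "model", inverse-error likelihood, and 20% behavioural cut-off are all illustrative choices:

    ```python
    import random

    random.seed(7)

    N = 500
    OBS = 2.0   # observed output (illustrative)

    def lhs(lo, hi, n):
        # One stratified sample per equal-probability interval, then shuffled
        pts = [lo + (hi - lo) * (i + random.random()) / n for i in range(n)]
        random.shuffle(pts)
        return pts

    params = lhs(0.0, 4.0, N)

    def model(k):
        return k  # trivially, the output equals the parameter

    def likelihood(sim):
        # Informal inverse-error likelihood, a common GLUE choice
        return 1.0 / (1e-6 + (sim - OBS) ** 2)

    runs = [(k, likelihood(model(k))) for k in params]
    cut = sorted(l for _, l in runs)[int(0.8 * N)]   # keep the best 20%
    behavioural = [(k, l) for k, l in runs if l >= cut]
    wsum = sum(l for _, l in behavioural)
    k_best = sum(k * l for k, l in behavioural) / wsum   # weighted estimate
    ```

    The likelihood-weighted spread of the behavioural runs is GLUE's uncertainty band; with the CL-style composite likelihood above, the score would combine water, nitrogen, and yield errors instead of a single criterion.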

  17. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  18. The worth of data to reduce predictive uncertainty of an integrated catchment model by multi-constraint calibration

    NASA Astrophysics Data System (ADS)

    Koch, J.; Jensen, K. H.; Stisen, S.

    2017-12-01

    Hydrological models that integrate numerical process descriptions across compartments of the water cycle are typically required to undergo thorough model calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets which reflect both the temporal and the spatial behavior of the hydrological response of a 1,000 km² catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five keystone hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture and land surface temperature. Results indicate that a balanced optimization can be achieved in which the errors on the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored to improving the spatial pattern performance; however, results suggest that the optimization is still more prone to improve the temporal dimension of model performance. This study also features a post-calibration linear uncertainty analysis, which quantifies parameter identifiability, i.e., the worth of a specific observational dataset for inferring model parameter values through calibration. Furthermore, the ability of an observation to reduce predictive uncertainty is assessed as well. Such findings have concrete implications for the design of model calibration frameworks and, in more general terms, the acquisition of data in hydrological observatories.
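    For a linearized model, the post-calibration variance of a parameter and the worth of a candidate observation follow from first-order (FOSM-style) algebra; the single-parameter sketch below uses invented sensitivities and variances:

    ```python
    # First-order post-calibration uncertainty for a single parameter: with
    # sensitivities j_i and observation error variance s2, the posterior
    # variance is 1 / (1/var_prior + sum(j_i^2)/s2).  Numbers are invented.
    var_prior = 1.0
    s2 = 0.25                 # observation error variance
    jac = [0.8, 0.3]          # sensitivities of the existing datasets

    def post_var(jacs):
        return 1.0 / (1.0 / var_prior + sum(j * j for j in jacs) / s2)

    pred_sens = 0.5           # sensitivity of a model prediction to the parameter

    before = pred_sens ** 2 * post_var(jac)          # predictive variance now
    after = pred_sens ** 2 * post_var(jac + [0.6])   # ...plus a new dataset
    worth = before - after                           # the new data's worth
    ```

    Datasets with larger sensitivities shrink the posterior variance more, which is the sense in which identifiability and data worth are read off the linear analysis.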

  19. Management strategies in hospitals: scenario planning.

    PubMed

    Ghanem, Mohamed; Schnoor, Jörg; Heyde, Christoph-Eckhard; Kuwatsch, Sandra; Bohn, Marco; Josten, Christoph

    2015-01-01

    Instead of waiting for challenges to confront hospital management, doctors and managers should act in advance to optimize and sustain value-based health. This work highlights the importance of scenario planning in hospitals, proposes an elaborated definition of the stakeholders of a hospital and defines the influence factors to which hospitals are exposed. Based on literature analysis as well as on personal interviews with stakeholders, we propose an elaborated definition of stakeholders and designed a questionnaire that integrated the following influence factors, which have relevant impact on hospital management: political/legal, economic, social, technological and environmental forces. These influence factors are examined to develop the so-called critical uncertainties. Thorough identification of uncertainties was based on a "Stakeholder Feedback". Two key uncertainties were identified and considered in this study: the development of the workload for the medical staff, and the profit-oriented performance of the medical staff. According to the developed scenarios, complementary education of the medical staff as well as of non-medical top executives and managers of hospitals was the recommended core strategy. Complementary scenario-specific strategic options should be considered whenever needed to optimize dealing with a specific future development of the health care environment. Strategic planning in hospitals is essential to ensure sustainable success. It considers multiple situations and integrates internal and external insights and perspectives, in addition to identifying weak signals and "blind spots". This flows into sound planning for multiple strategic options. It is a state-of-the-art tool that allows dealing with the increasing challenges facing hospital management.

  20. Integrated reservoir characterization for unconventional reservoirs using seismic, microseismic and well log data

    NASA Astrophysics Data System (ADS)

    Maity, Debotyam

    This study is aimed at an improved understanding of unconventional reservoirs, which include tight reservoirs (such as shale oil and gas plays), geothermal developments, etc. We provide a framework for improved fracture zone identification and mapping of the subsurface for a geothermal system by integrating data from different sources. The proposed ideas and methods were tested primarily on data obtained from the North Brawley geothermal field and the Geysers geothermal field, apart from synthetic datasets which were used to test new algorithms before actual application on the real datasets. The study has resulted in novel or improved algorithms for use at specific stages of data acquisition and analysis, including an improved phase detection technique for passive seismic (and teleseismic) data as well as optimization of passive seismic surveys for best possible processing results. The proposed workflow makes use of novel integration methods as a means of making best use of the available geophysical data for fracture characterization. The methodology incorporates soft computing tools such as hybrid neural networks (neuro-evolutionary algorithms) as well as geostatistical simulation techniques to improve the property estimates as well as the overall characterization efficacy. The basic elements of the proposed characterization workflow involve using seismic and microseismic data to characterize structural and geomechanical features within the subsurface. We use passive seismic data to model geomechanical properties which are combined with other properties evaluated from seismic and well logs to derive both qualitative and quantitative fracture zone identifiers. The study has resulted in a broad framework highlighting a new technique for utilizing geophysical data (seismic and microseismic) for unconventional reservoir characterization.
It provides an opportunity to optimally develop the resources in question by incorporating data from different sources and using their temporal and spatial variability as a means to better understand the reservoir behavior. As part of this study, we have developed the following elements which are discussed in the subsequent chapters: 1. An integrated characterization framework for unconventional settings with adaptable workflows for all stages of data processing, interpretation and analysis. 2. A novel autopicking workflow for noisy passive seismic data used for improved accuracy in event picking as well as for improved velocity model building. 3. Improved passive seismic survey design optimization framework for better data collection and improved property estimation. 4. Extensive post-stack seismic attribute studies incorporating robust schemes applicable in complex reservoir settings. 5. Uncertainty quantification and analysis to better quantify property estimates over and above the qualitative interpretations made and to validate observations independently with quantified uncertainties to prevent erroneous interpretations. 6. Property mapping from microseismic data including stress and anisotropic weakness estimates for integrated reservoir characterization and analysis. 7. Integration of results (seismic, microseismic and well logs) from analysis of individual data sets for integrated interpretation using predefined integration framework and soft computing tools.

  1. Probabilistic Multi-Scale, Multi-Level, Multi-Disciplinary Analysis and Optimization of Engine Structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Abumeri, Galib H.

    2000-01-01

    Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely, as well as engines for new aircraft, are progressively required to operate under more demanding technological and environmental requirements. Designs that effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multi-disciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to the integration of participating disciplines to describe a structural response at the scale of interest; multi-disciplinary refers to an open-ended set of existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as the Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP).
    The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission, coupled structural/thermal analysis, various composite property simulators, and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of this paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.

  2. Forward and backward uncertainty propagation: an oxidation ditch modelling example.

    PubMed

    Abusam, A; Keesman, K J; van Straten, G

    2003-01-01

    In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. In backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure of carrying out backward uncertainty propagation is illustrated in this technical note by a worked example for an oxidation ditch wastewater treatment plant. The results obtained demonstrate that essential information can be gained by carrying out backward uncertainty propagation analysis.
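    The forward direction described above can be sketched in a few lines of Monte Carlo: sample the assumed parameter subspace and summarize the induced output distribution. The two-parameter model below is a toy relationship, not the oxidation-ditch model itself.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model output as a function of two uncertain parameters
# (a hypothetical relationship, purely for illustration).
def model(k, theta):
    return 10.0 / (1.0 + k) + 0.5 * theta

# Forward propagation: sample the assumed parameter subspace ...
k = rng.uniform(0.5, 1.5, size=10_000)
theta = rng.normal(1.0, 0.2, size=10_000)
y = model(k, theta)

# ... and summarize the induced distribution of the output.
lo, hi = np.percentile(y, [2.5, 97.5])
print(f"95% output interval: [{lo:.2f}, {hi:.2f}]")
```

    Backward propagation inverts this mapping: it asks which parameter subspace is consistent with a required output distribution, which generally demands optimization or Bayesian inversion rather than plain sampling.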

  3. An imprecise probability approach for squeal instability analysis based on evidence theory

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-01-01

    An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. All these uncertain parameters usually involve imprecise data such as incomplete and conflicting information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and a surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples, and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to squeal problems with a modest number of investigated parameters. It can be considered as a potential method for squeal instability analysis, which will act as the first step to reduce squeal noise of uncertain brakes with imprecise information.
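    The belief and plausibility measures used in the approach can be illustrated with a toy evidence structure. The focal intervals, masses, and the instability indicator below are hypothetical placeholders, not values from the brake model.

```python
# Each focal element is an interval for a friction coefficient mu
# together with a basic probability mass (hypothetical numbers).
focal = [((0.30, 0.40), 0.5),
         ((0.35, 0.50), 0.3),
         ((0.45, 0.60), 0.2)]

def instability(mu):
    """Assumed monotone instability indicator (toy surrogate)."""
    return 100.0 * mu

threshold = 40.0

# Belief: total mass of focal elements whose entire output interval exceeds
# the threshold; plausibility: mass of those that can possibly exceed it.
bel = sum(m for (lo, hi), m in focal if instability(lo) > threshold)
pl = sum(m for (lo, hi), m in focal if instability(hi) > threshold)

print(f"Bel = {bel:.2f} <= Pl = {pl:.2f}")
```

    The [Bel, Pl] gap is exactly the imprecision carried by the evidence: with precise probabilities the two measures would coincide.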

  4. Pretest uncertainty analysis for chemical rocket engine tests

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.

    1987-01-01

    A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.

  5. Integrated situational awareness for cyber attack detection, analysis, and mitigation

    NASA Astrophysics Data System (ADS)

    Cheng, Yi; Sagduyu, Yalin; Deng, Julia; Li, Jason; Liu, Peng

    2012-06-01

    Real-time cyberspace situational awareness is critical for securing and protecting today's enterprise networks from various cyber threats. When a security incident occurs, network administrators and security analysts need to know what exactly has happened in the network, why it happened, and what actions or countermeasures should be taken to quickly mitigate the potential impacts. In this paper, we propose an integrated cyberspace situational awareness system for efficient cyber attack detection, analysis and mitigation in large-scale enterprise networks. Essentially, a cyberspace common operational picture will be developed, which is a multi-layer graphical model and can efficiently capture and represent the statuses, relationships, and interdependencies of various entities and elements within and among different levels of a network. Once shared among authorized users, this cyberspace common operational picture can provide an integrated view of the logical, physical, and cyber domains, and a unique visualization of disparate data sets to support decision makers. In addition, advanced analyses, such as Bayesian Network analysis, will be explored to address the information uncertainty, dynamic and complex cyber attack detection, and optimal impact mitigation issues. All the developed technologies will be further integrated into an automatic software toolkit to achieve near real-time cyberspace situational awareness and impact mitigation in large-scale computer networks.

  6. Benefits and Limitations of Real Options Analysis for the Practice of River Flood Risk Management

    NASA Astrophysics Data System (ADS)

    Kind, Jarl M.; Baayen, Jorn H.; Botzen, W. J. Wouter

    2018-04-01

    Decisions on long-lived flood risk management (FRM) investments are complex because the future is uncertain. Flexibility and robustness can be used to deal with future uncertainty. Real options analysis (ROA) provides a welfare-economics framework to design and evaluate robust and flexible FRM strategies under risk or uncertainty. Although its potential benefits are large, ROA is hardly used in today's FRM practice. In this paper, we investigate the benefits and limitations of ROA by applying it to a realistic FRM case study for an entire river branch. We illustrate how ROA identifies optimal short-term investments and values future options. We develop robust dike investment strategies and value the flexibility offered by additional room-for-the-river measures. We benchmark the results of ROA against those of a standard cost-benefit analysis and show ROA's potential policy implications. The ROA for a realistic case requires a high level of geographical detail, a large ensemble of scenarios, and the inclusion of stakeholders' preferences. We found several limitations of applying ROA. It is complex. In particular, relevant sources of uncertainty need to be recognized, quantified, integrated, and discretized in scenarios, requiring subjective choices and expert judgment. Decision trees have to be generated and stakeholders' preferences have to be translated into decision rules. On the basis of this study, we give general recommendations to use high discharge scenarios for the design of measures with high fixed costs and few alternatives. Lower scenarios may be used when alternatives offer future flexibility.

  7. Info-gap theory and robust design of surveillance for invasive species: the case study of Barrow Island.

    PubMed

    Davidovitch, Lior; Stoklosa, Richard; Majer, Jonathan; Nietrzeba, Alex; Whittle, Peter; Mengersen, Kerrie; Ben-Haim, Yakov

    2009-06-01

    Surveillance for invasive non-indigenous species (NIS) is an integral part of a quarantine system. Estimating the efficiency of a surveillance strategy relies on many uncertain parameters estimated by experts, such as the efficiency of its components in the face of the specific NIS, the ability of the NIS to inhabit different environments, and so on. Due to the importance of detecting an invasive NIS within a critical period of time, it is crucial that these uncertainties be accounted for in the design of the surveillance system. We formulate a detection model that takes into account, in addition to structured sampling for incursive NIS, incidental detection by untrained workers. We use info-gap theory for satisficing (not maximizing) the probability of detection, while at the same time maximizing the robustness to uncertainty. We demonstrate the trade-off between robustness to uncertainty and an increase in the required probability of detection. An empirical example based on the detection of Pheidole megacephala on Barrow Island demonstrates the use of info-gap analysis to select a surveillance strategy.
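    The robustness trade-off described above can be sketched as follows. The surveillance model, nominal detection probability, and sample count are illustrative assumptions, not values from the Barrow Island study.

```python
import numpy as np

# Per-sample detection probability is uncertain around a nominal estimate.
p0 = 0.12          # nominal per-sample detection probability (assumed)
n = 30             # samples per surveillance round (assumed)

def system_detection(p):
    """Probability of at least one detection in a round."""
    return 1.0 - (1.0 - p) ** n

# Info-gap robustness: the largest deviation alpha from p0 that still
# guarantees the required detection probability in the worst case
# (worst case here: p = p0 - alpha).
def robustness(p_required):
    alphas = np.linspace(0.0, p0, 1001)
    ok = [a for a in alphas if system_detection(p0 - a) >= p_required]
    return max(ok) if ok else 0.0

# Trade-off: demanding a higher detection probability leaves less
# robustness to uncertainty in the expert estimate.
print(robustness(0.80), robustness(0.95))
```

    Plotting robustness against the required probability gives the trade-off curve the paper describes: the curve falls monotonically, so "more demanded" always means "less robust".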

  8. Uncertainty evaluation of EnPIs in industrial applications as a key factor in setting improvement actions

    NASA Astrophysics Data System (ADS)

    D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.

    2015-11-01

    A methodology is proposed that assumes high-level Energy Performance Indicator (EnPI) uncertainty as a quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of an energy management policy and action plans; responsibility level for energy issues; employees' training and motivation with respect to energy problems; absence/presence of adequate infrastructures for monitoring and sharing energy information; level of standardization and integration of methods and procedures linked to the energy activities; economic and financial resources for the improvement of energy efficiency. A critical and comparative analysis of the obtained results is carried out. The methodology, experimentally validated, allows developing useful considerations for effective, realistic and economically feasible improvement plans, depending on the specific situation. Recursive application of the methodology yields a reliable, well-resolved assessment of the EMS status, also in dynamic industrial contexts.

  9. Data Analysis Recipes: Using Markov Chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Hogg, David W.; Foreman-Mackey, Daniel

    2018-05-01

    Markov Chain Monte Carlo (MCMC) methods for sampling probability density functions (combined with abundant computational resources) have transformed the sciences, especially in performing probabilistic inferences, or fitting models to data. In this primarily pedagogical contribution, we give a brief overview of the most basic MCMC method and some practical advice for the use of MCMC in real inference problems. We give advice on method choice, tuning for performance, methods for initialization, tests of convergence, troubleshooting, and use of the chain output to produce or report parameter estimates with associated uncertainties. We argue that autocorrelation time is the most important test for convergence, as it directly connects to the uncertainty on the sampling estimate of any quantity of interest. We emphasize that sampling is a method for doing integrals; this guides our thinking about how MCMC output is best used.
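    A minimal sketch of the basic method the authors describe: a Metropolis sampler for a one-dimensional target, with a crude integrated autocorrelation time used to inflate the error bar on the posterior-mean estimate (the target, proposal scale, and lag cutoff are all illustrative choices).

```python
import numpy as np

rng = np.random.default_rng(0)

# Log of an (unnormalized) 1-D target density: standard normal.
def log_post(x):
    return -0.5 * x * x

# Basic Metropolis sampler with a symmetric Gaussian proposal.
x, chain = 0.0, []
for _ in range(10_000):
    prop = x + rng.normal(0.0, 1.0)
    if np.log(rng.uniform()) < log_post(prop) - log_post(x):
        x = prop                            # accept; otherwise keep x
    chain.append(x)
chain = np.array(chain[1_000:])             # discard burn-in

# Crude integrated autocorrelation time tau: correlated samples carry
# less information, so tau inflates the error bar on the mean estimate.
c = chain - chain.mean()
acf = np.correlate(c, c, mode="full")[c.size - 1:] / (c.var() * c.size)
tau = 1.0 + 2.0 * np.sum(acf[1:200])
stderr = chain.std() * np.sqrt(tau / chain.size)

print(f"mean = {chain.mean():.3f} +/- {stderr:.3f} (tau = {tau:.1f})")
```

    The naive standard error, chain.std()/sqrt(N), would understate the uncertainty by a factor of roughly sqrt(tau); this is why the authors treat autocorrelation time as the central convergence diagnostic.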

  10. SUNPLIN: Simulation with Uncertainty for Phylogenetic Investigations

    PubMed Central

    2013-01-01

    Background Phylogenetic comparative analyses usually rely on a single consensus phylogenetic tree in order to study evolutionary processes. However, most phylogenetic trees are incomplete with regard to species sampling, which may critically compromise analyses. Some approaches have been proposed to integrate non-molecular phylogenetic information into incomplete molecular phylogenies. An expanded tree approach consists of adding missing species to random locations within their clade. The information contained in the topology of the resulting expanded trees can be captured by the pairwise phylogenetic distance between species and stored in a matrix for further statistical analysis. Thus, the random expansion and processing of multiple phylogenetic trees can be used to estimate the phylogenetic uncertainty through a simulation procedure. Because of the computational burden required, unless this procedure is efficiently implemented, the analyses are of limited applicability. Results In this paper, we present efficient algorithms and implementations for randomly expanding and processing phylogenetic trees so that simulations involved in comparative phylogenetic analysis with uncertainty can be conducted in a reasonable time. We propose algorithms for both randomly expanding trees and calculating distance matrices. We made available the source code, which was written in the C++ language. The code may be used as a standalone program or as a shared object in the R system. The software can also be used as a web service through the link: http://purl.oclc.org/NET/sunplin/. Conclusion We compare our implementations to similar solutions and show that significant performance gains can be obtained. Our results open up the possibility of accounting for phylogenetic uncertainty in evolutionary and ecological analyses of large datasets. PMID:24229408
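    The expanded-tree idea can be sketched in a few lines: a missing species is attached next to a random member of its clade, and the pairwise distance matrix is updated accordingly. This toy sketch is not the SUNPLIN implementation (which is written in C++), and all distances and the branch length are made-up values.

```python
import random

random.seed(1)

# Pairwise phylogenetic distances for the sampled species (hypothetical).
dist = {("A", "B"): 2.0, ("A", "C"): 6.0, ("B", "C"): 6.0}
species = ["A", "B", "C"]

def d(x, y):
    return 0.0 if x == y else dist.get((x, y), dist.get((y, x)))

def expand(missing, clade, branch=1.0):
    """Attach a missing species next to a random member of its clade,
    inheriting that member's distances plus a short branch."""
    host = random.choice(clade)
    for s in species:
        if s != host:
            dist[(missing, s)] = d(host, s) + branch
    dist[(missing, host)] = 2 * branch
    species.append(missing)

# Species D is known to belong to the clade {A, B} but is missing from
# the molecular tree; place it at a random location within that clade.
expand("D", ["A", "B"])
print(sorted((k, v) for k, v in dist.items() if "D" in k))
```

    Repeating the expansion many times yields a distribution of distance matrices, and the spread of any downstream statistic across those matrices is the phylogenetic uncertainty the simulation procedure estimates.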

  11. SUNPLIN: simulation with uncertainty for phylogenetic investigations.

    PubMed

    Martins, Wellington S; Carmo, Welton C; Longo, Humberto J; Rosa, Thierson C; Rangel, Thiago F

    2013-11-15

    Phylogenetic comparative analyses usually rely on a single consensus phylogenetic tree in order to study evolutionary processes. However, most phylogenetic trees are incomplete with regard to species sampling, which may critically compromise analyses. Some approaches have been proposed to integrate non-molecular phylogenetic information into incomplete molecular phylogenies. An expanded tree approach consists of adding missing species to random locations within their clade. The information contained in the topology of the resulting expanded trees can be captured by the pairwise phylogenetic distance between species and stored in a matrix for further statistical analysis. Thus, the random expansion and processing of multiple phylogenetic trees can be used to estimate the phylogenetic uncertainty through a simulation procedure. Because of the computational burden required, unless this procedure is efficiently implemented, the analyses are of limited applicability. In this paper, we present efficient algorithms and implementations for randomly expanding and processing phylogenetic trees so that simulations involved in comparative phylogenetic analysis with uncertainty can be conducted in a reasonable time. We propose algorithms for both randomly expanding trees and calculating distance matrices. We made available the source code, which was written in the C++ language. The code may be used as a standalone program or as a shared object in the R system. The software can also be used as a web service through the link: http://purl.oclc.org/NET/sunplin/. We compare our implementations to similar solutions and show that significant performance gains can be obtained. Our results open up the possibility of accounting for phylogenetic uncertainty in evolutionary and ecological analyses of large datasets.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, W. D.

    In order to appropriately model and predict the chemical integrity and performance of cementitious materials used for waste immobilization at the Savannah River Site (SRS), it is critical to understand the I-129 solubility and distribution within the tank farm. Iodine in radioactive waste and in environmental media is typically highly mobile and long-lived. Iodine is ubiquitous in SRS tank waste and waste forms. The iodine is assumed to be soluble and present at low levels in Performance Assessments (PAs) for SRS Tank Farms, and is one of the dose drivers in the PAs for both the SRS Salt Disposal Facility (SDF) and the H-Area Tank Farm (HTF). Analysis of tank waste samples is critical to understanding the Tank Farm iodine inventory and reducing disposal uncertainty. Higher than expected iodine levels have recently been observed in residual solids isolated from some SRS tanks prior to closure, indicating uncertainty regarding the chemical species involved. If the iodine inventory uncertainty is larger than anticipated, future work may be necessary to reduce the uncertainty. This memorandum satisfies a portion of the work scope identified in Task Plan SRNL-RP-2016-00651. A separate memorandum, issued previously, reported historical unpublished I-129 data, a significant portion of which was below detectable analytical limits. This memorandum includes iodine and general chemical analysis results for six archived SRNL samples which were previously reported to have I-129 concentrations below detectable limits. Lower sample dilution factors were used for the current analyses in order to obtain concentrations above detection. The samples analyzed included surface and depth samples from SRS tanks 30, 32, and 39.

  13. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    PubMed

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods, respectively. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries.
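    The style of probabilistic projection reported above can be sketched with a toy Monte Carlo: sample an uncertain growth rate, propagate each draw to 2030, and report a 5th-95th percentile range. All numbers below are illustrative assumptions, not the Indonesian results.

```python
import numpy as np

rng = np.random.default_rng(7)

# Assumed uncertain annual urban growth rate (illustrative prior).
growth = rng.normal(loc=0.045, scale=0.008, size=1_000)

# Propagate each of the 1000 sampled rates over 2000-2030 (30 years)
# to a fractional increase in urban extent.
expansion = (1.0 + growth) ** 30 - 1.0

p5, p95 = np.percentile(expansion, [5, 95])
print(f"urban extent increase: {100 * p5:.0f}%-{100 * p95:.0f}% (5th-95th pct)")
```

    In the actual study each draw is also spatially explicit (where land converts, not just how much), so exposure and risk can be computed per projection before the percentiles are taken.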

  14. Detailed Uncertainty Analysis of the ZEM-3 Measurement System

    NASA Technical Reports Server (NTRS)

    Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred

    2014-01-01

    The measurement of Seebeck coefficient and electrical resistivity is critical to the investigation of all thermoelectric systems. It follows that the measurement uncertainty must be well understood in order to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate of the uncertainty on the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of +9/-14 percent at high temperature and 9 percent near room temperature.
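    A hedged sketch of how independent error sources combine in quadrature for such a budget. The relative uncertainties below are illustrative placeholders, not the ZEM-3 values.

```python
import math

# Illustrative relative uncertainties on the resistivity measurement.
sources = {
    "sample geometry": 0.02,
    "probe geometry": 0.015,
    "statistical": 0.01,
    "multimeter": 0.005,
}

# Independent sources combine as a root-sum-square (quadrature).
u_resistivity = math.sqrt(sum(u ** 2 for u in sources.values()))

# Power factor S^2 / rho: the Seebeck term enters with sensitivity
# coefficient 2 before the quadrature sum.
u_seebeck = 0.03
u_power_factor = math.sqrt((2 * u_seebeck) ** 2 + u_resistivity ** 2)

print(f"u(rho) = {100 * u_resistivity:.1f}%, u(PF) = {100 * u_power_factor:.1f}%")
```

    A systematic, one-sided contribution like the cold-finger effect does not belong in this quadrature sum; it shifts the interval asymmetrically, which is consistent with the asymmetric bounds reported above.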

  15. A comparison of absolute calibrations of a radiation thermometer based on a monochromator and a tunable source

    NASA Astrophysics Data System (ADS)

    Keawprasert, T.; Anhalt, K.; Taubert, D. R.; Sperling, A.; Schuster, M.; Nevas, S.

    2013-09-01

    An LP3 radiation thermometer was absolutely calibrated at a newly developed monochromator-based set-up and at the TUneable Lasers in Photometry (TULIP) facility of PTB in the wavelength range from 400 nm to 1100 nm. At both facilities, the spectral radiation of the respective source irradiates an integrating sphere, thus generating uniform radiance across its precision aperture. The spectral irradiance of the integrating sphere is determined via the effective area of a precision aperture and a Si trap detector, traceable to the primary cryogenic radiometer of PTB. Due to the limited output power from the monochromator, the absolute calibration was performed with a measurement uncertainty of 0.17 % (k = 1), while the respective uncertainty at the TULIP facility is 0.14 %. Calibration results obtained at the two facilities were compared in terms of spectral radiance responsivity, effective wavelength and integral responsivity. It was found that the measurement results in integral responsivity at both facilities are in agreement within the expanded uncertainty (k = 2). To verify the calibration accuracy, the absolutely calibrated radiation thermometer was used to measure the thermodynamic freezing temperature of the PTB gold fixed-point blackbody.
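    Agreement "within the expanded uncertainty (k = 2)" is commonly checked with a normalized-error statistic. The sketch below uses illustrative numbers, not the PTB results.

```python
import math

# Two independent calibration results with standard (k = 1) uncertainties;
# values are illustrative relative integral responsivities.
x1, u1 = 1.000, 0.0017    # monochromator-based value (assumed)
x2, u2 = 0.998, 0.0014    # TULIP-based value (assumed)

# Normalized error: the difference divided by the expanded (k = 2)
# uncertainty of the difference. |E_n| <= 1 indicates consistency.
e_n = abs(x1 - x2) / (2.0 * math.sqrt(u1 ** 2 + u2 ** 2))

print(f"E_n = {e_n:.2f} -> {'consistent' if e_n <= 1 else 'discrepant'}")
```

    The standard uncertainties add in quadrature because the two calibrations are independent; correlated contributions (e.g. a shared trap detector) would have to be subtracted from the denominator first.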

  16. Evaluation of stormwater micropollutant source control and end-of-pipe control strategies using an uncertainty-calibrated integrated dynamic simulation model.

    PubMed

    Vezzaro, L; Sharma, A K; Ledin, A; Mikkelsen, P S

    2015-03-15

    The estimation of micropollutant (MP) fluxes in stormwater systems is a fundamental prerequisite when preparing strategies to reduce stormwater MP discharges to natural waters. Dynamic integrated models can be important tools in this step, as they can be used to integrate the limited data provided by monitoring campaigns and to evaluate the performance of different strategies based on model simulation results. This study presents an example where six different control strategies, including both source control and end-of-pipe treatment, were compared. The comparison focused on fluxes of heavy metals (copper, zinc) and organic compounds (fluoranthene). MP fluxes were estimated by using an integrated dynamic model in combination with stormwater quality measurements. MP sources were identified by using GIS land usage data, runoff quality was simulated by using a conceptual accumulation/washoff model, and a stormwater retention pond was simulated by using a dynamic treatment model based on MP inherent properties. Uncertainty in the results was estimated with a pseudo-Bayesian method. Despite the great uncertainty in the MP fluxes estimated by the runoff quality model, it was possible to compare the six scenarios in terms of discharged MP fluxes, compliance with water quality criteria, and sediment accumulation. Source-control strategies obtained better results in terms of reduction of MP emissions, but all the simulated strategies failed to fulfil the criteria based on emission limit values. The results presented in this study show how the efficiency of MP pollution control strategies can be quantified by combining advanced modeling tools (integrated stormwater quality model, uncertainty calibration).
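    A conceptual accumulation/washoff component of the kind mentioned above can be sketched as dry-weather build-up with first-order wash-off during runoff. All parameter values and the rainfall series below are illustrative assumptions, not calibrated values from the study.

```python
import math

# Illustrative parameters (assumptions, not calibrated values).
accu_rate = 0.8        # pollutant build-up, g/ha/day
disp_rate = 0.05       # dry-weather removal, 1/day
washoff_c = 0.1        # wash-off efficiency, 1/mm of rainfall

def step(mass, rain_mm):
    """One day: build-up toward equilibrium, then first-order wash-off."""
    mass += accu_rate - disp_rate * mass
    washed = mass * (1.0 - math.exp(-washoff_c * rain_mm))
    return mass - washed, washed

mass, loads = 5.0, []
for rain in [0, 0, 12, 0, 0, 0, 30, 0]:    # daily rainfall depths (mm)
    mass, washed = step(mass, rain)
    loads.append(washed)

print(f"total washed-off load: {sum(loads):.2f} g/ha")
```

    In the integrated model each such daily load feeds the retention-pond treatment model, and the pseudo-Bayesian step then assigns uncertainty bands to the resulting MP fluxes.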

  17. Minimum and Maximum Times Required to Obtain Representative Suspended Sediment Samples

    NASA Astrophysics Data System (ADS)

    Gitto, A.; Venditti, J. G.; Kostaschuk, R.; Church, M. A.

    2014-12-01

    Bottle sampling is a convenient method of obtaining suspended sediment measurements for the development of sediment budgets. While these methods are generally considered to be reliable, recent analysis of depth-integrated sampling has identified considerable uncertainty in measurements of grain-size concentration between grain-size classes of multiple samples. Point-integrated bottle sampling is assumed to represent the mean concentration of suspended sediment, but the uncertainty surrounding this method is not well understood. Here we examine at-a-point variability in velocity, suspended sediment concentration, grain-size distribution, and grain-size moments to determine if traditional point-integrated methods provide a representative sample of suspended sediment. We present continuous hour-long observations of suspended sediment from the sand-bedded portion of the Fraser River at Mission, British Columbia, Canada, using a LISST laser-diffraction instrument. Spectral analysis shows no statistically significant peaks in energy density, suggesting the absence of periodic fluctuations in flow and suspended sediment. However, a slope break in the spectra at 0.003 Hz corresponds to a period of 5.5 minutes. This coincides with the threshold between large-scale turbulent eddies that scale with channel width/mean velocity and hydraulic phenomena related to channel dynamics. This suggests that suspended sediment samples taken over a period longer than 5.5 minutes incorporate variability that is larger scale than turbulent phenomena in this channel. Examination of 5.5-minute periods of our time series indicates that ~20% of the time a stable mean value of volumetric concentration is reached within 30 seconds, a typical bottle sample duration. In ~12% of measurements a stable mean was not reached over the 5.5-minute sample duration. The remaining measurements achieve a stable mean in an even distribution over the intervening interval.
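
    The stabilization test described above can be sketched as a running-mean check on a synthetic 5.5-minute concentration series. The AR(1) noise model, tolerance, and numbers are illustrative assumptions, not the LISST data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1 Hz volumetric concentration series (uL/L): mean 300 with
# persistent (AR(1)) fluctuations standing in for turbulence
n = 330                       # 5.5 minutes at 1 Hz
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.9 * noise[i - 1] + rng.normal(0.0, 5.0)
conc = 300.0 + noise

def stabilization_time(series, tol=0.02):
    """First index after which the running mean stays within tol of its final value."""
    running = np.cumsum(series) / np.arange(1, len(series) + 1)
    ok = np.abs(running - running[-1]) <= tol * abs(running[-1])
    bad = np.where(~ok)[0]
    return 0 if len(bad) == 0 else int(bad[-1]) + 1

t_stable = stabilization_time(conc)
print(f"running mean stabilizes after {t_stable} s of a 330 s window")
```

    Comparing `t_stable` against a 30 s bottle-sample duration reproduces the kind of classification (stable within 30 s, stable later, never stable) reported in the abstract.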

  18. The Modular Modeling System (MMS): A toolbox for water- and environmental-resources management

    USGS Publications Warehouse

    Leavesley, G.H.; Markstrom, S.L.; Viger, R.J.; Hay, L.E.; ,

    2005-01-01

    The increasing complexity of water- and environmental-resource problems requires modeling approaches that incorporate knowledge from a broad range of scientific and software disciplines. To address this need, the U.S. Geological Survey (USGS) has developed the Modular Modeling System (MMS). MMS is an integrated system of computer software for model development, integration, and application. Its modular design allows a high level of flexibility and adaptability, enabling modelers to incorporate their own software into a rich array of built-in models and modeling tools. These include individual process models, tightly coupled models, loosely coupled models, and fully integrated decision support systems. A geographic information system (GIS) interface, the USGS GIS Weasel, has been integrated with MMS to enable spatial delineation and characterization of basin and ecosystem features, and to provide objective parameter-estimation methods for models using available digital data. MMS provides optimization and sensitivity-analysis tools to analyze model parameters and evaluate the extent to which uncertainty in model parameters affects uncertainty in simulation results. MMS has been coupled with the Bureau of Reclamation object-oriented reservoir and river-system modeling framework, RiverWare, to develop models to evaluate and apply optimal resource-allocation and management strategies to complex, operational decisions on multipurpose reservoir systems and watersheds. This decision support system approach has been developed, tested, and implemented in the Gunnison, Yakima, San Joaquin, Rio Grande, and Truckee River basins of the western United States. MMS is currently being coupled with the U.S. Forest Service model SIMulating Patterns and Processes at Landscape Scales (SIMPPLLE) to assess the effects of alternative vegetation-management strategies on a variety of hydrological and ecological responses. Initial development and testing of the MMS-SIMPPLLE integration is being conducted on the Colorado Plateau region of the western United States.

  19. NPSS Multidisciplinary Integration and Analysis

    NASA Technical Reports Server (NTRS)

    Hall, Edward J.; Rasche, Joseph; Simons, Todd A.; Hoyniak, Daniel

    2006-01-01

    The objective of this task was to enhance the capability of the Numerical Propulsion System Simulation (NPSS) by expanding its reach into the high-fidelity multidisciplinary analysis area. This task investigated numerical techniques to convert compressor blades from cold static to hot running geometry. Numerical calculations of blade deformations were performed iteratively, coupling high fidelity flow simulations with high fidelity structural analyses of the compressor blade. The flow simulations were performed with the Advanced Ducted Propfan Analysis (ADPAC) code, while structural analyses were performed with the ANSYS code. High fidelity analyses were used to evaluate the effects on performance of: variations in tip clearance, uncertainty in manufacturing tolerance, variable inlet guide vane scheduling, and the effects of rotational speed on the hot running geometry of the compressor blades.

  20. Integrating Uncertainty Analysis in the Risk Characterization of In-Place Remedial Strategies for Contaminated Sediments

    DTIC Science & Technology

    2009-03-01

    sediment. The plastic limit for the samples tested averaged to 43.4 percent while the liquid limit was 77.7 percent. As discussed above, the sediment...sediment from the drums, clean it of debris such as plastic, clam shells, sticks, etc., mix to a density to be maintained consistently among all...pollutant loading to aquatic ecosystems has decreased in the past decades, contaminated sediments have shifted from a sink to a potential source of

  1. Compensation of distributed delays in integrated communication and control systems

    NASA Technical Reports Server (NTRS)

    Ray, Asok; Luck, Rogelio

    1991-01-01

    The concept, analysis, implementation, and verification of a method for compensating delays that are distributed between the sensors, controller, and actuators within a control loop are discussed. With the objective of mitigating the detrimental effects of these network induced delays, a predictor-controller algorithm was formulated and analyzed. Robustness of the delay compensation algorithm was investigated relative to parametric uncertainties in plant modeling. The delay compensator was experimentally verified on an IEEE 802.4 network testbed for velocity control of a DC servomotor.

  2. Budget Studies of a Prefrontal Convective Rainband in Northern Taiwan Determined from TAMEX Data

    DTIC Science & Technology

    1993-06-01

    storm top accumulates less error in w calculation than an upward integration from the surface. Other Doppler studies, e.g., Chong and Testud (1983), Lin...contribute to the uncertainty of w is a result of the advection problem (Gal-Chen, 1982; Chong and Testud, 1983). Parsons et al (1983) employed an...Boulder, CO., 95-102. Chong, M., and J. Testud, 1983: Three-dimensional wind field analysis from dual-Doppler radar data. Part III: The boundary condition

  3. GNSS Radio Occultation Excess Phase Processing with Integrated Uncertainty Estimation for Thermodynamic Cal/Val of Passive Atmospheric Sounders and Climate Science

    NASA Astrophysics Data System (ADS)

    Innerkofler, J.; Pock, C.; Kirchengast, G.; Schwaerz, M.; Jaeggi, A.; Andres, Y.; Marquardt, C.; Hunt, D.; Schreiner, W. S.; Schwarz, J.

    2017-12-01

    Global Navigation Satellite System (GNSS) radio occultation (RO) is a highly valuable satellite remote sensing technique for atmospheric and climate sciences, including calibration and validation (cal/val) of passive sounding instruments such as radiometers. Since 2001 it has provided accurate and precise measurements in the troposphere and stratosphere with global coverage, long-term stability, and virtually all-weather capability. To fully exploit the potential of RO data as a cal/val reference and climate data record, the uncertainties attributed to the data need to be assessed. Here we focus on the atmospheric excess phase data, derived from the raw occultation tracking and orbit data, and their integrated uncertainty estimation within the new Reference Occultation Processing System (rOPS) developed at the WEGC. These excess phases correspond to integrated refractivity, proportional to pressure/temperature and water vapor, and are therefore highly valuable reference data for thermodynamic cal/val of passive (radiometric) sounder data. To enable high accuracy of the excess phase profiles, accurate orbit positions and velocities as well as clock estimates of the GNSS transmitter satellites and RO receiver satellites are determined using the Bernese and Napeos orbit determination software packages. We find orbit uncertainty estimates of about 5 cm (position) / 0.05 mm/s (velocity) for daily orbits for the MetOp, GRACE, and CHAMP RO missions, and larger uncertainty estimates near 20 cm (position) / 0.2 mm/s (velocity) for the COSMIC RO mission. The strict evaluation and quality control of the position, velocity, and clock accuracies of the daily LEO and GNSS orbits assure the smallest achievable uncertainties in the excess phase data. We compared the excess phase profiles from WEGC against profiles from EUMETSAT and UCAR. Results show good agreement in line with the estimated uncertainties, with millimetric differences in the upper stratosphere and mesosphere and centimetric differences in the troposphere, where the excess phases can exceed 100 m. This underlines the potential for a new fundamental cal/val reference and climate data record based on atmospheric excess phases from RO, given their narrow uncertainty and independence from background data.

  4. Integrating legal liabilities in nanomanufacturing risk management.

    PubMed

    Mohan, Mayank; Trump, Benjamin D; Bates, Matthew E; Monica, John C; Linkov, Igor

    2012-08-07

    Among other things, the wide-scale development and use of nanomaterials is expected to produce costly regulatory and civil liabilities for nanomanufacturers due to lingering uncertainties, unanticipated effects, and potential toxicity. The life-cycle environmental, health, and safety (EHS) risks of nanomaterials are currently being studied, but the corresponding legal risks have not been systematically addressed. With the aid of a systematic approach that holistically evaluates and accounts for uncertainties about the inherent properties of nanomaterials, it is possible to provide an order of magnitude estimate of liability risks from regulatory and litigious sources based on current knowledge. In this work, we present a conceptual framework for integrating estimated legal liabilities with EHS risks across nanomaterial life-cycle stages using empirical knowledge in the field, scientific and legal judgment, probabilistic risk assessment, and multicriteria decision analysis. Such estimates will provide investors and operators with a basis to compare different technologies and practices and will also inform regulatory and legislative bodies in determining standards that balance risks with technical advancement. We illustrate the framework through the hypothetical case of a manufacturer of nanoscale titanium dioxide and use the resulting expected legal costs to evaluate alternative risk-management actions.
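
    A minimal sketch of the probability-weighted liability estimate such a framework might produce, combining regulatory and litigation events across life-cycle stages. All probabilities, costs, and stage names below are hypothetical placeholders, not values from the study:

```python
# Order-of-magnitude expected legal liability per life-cycle stage:
# each stage carries (annual probability, cost in $) pairs for possible
# regulatory or civil liability events; expected cost is the weighted sum.
stages = {
    "synthesis":   [(0.05, 2e6), (0.01, 1e7)],
    "manufacture": [(0.10, 5e5), (0.02, 5e6)],
    "disposal":    [(0.03, 1e6)],
}

expected = {s: sum(p * c for p, c in events) for s, events in stages.items()}
total = sum(expected.values())
for s, v in expected.items():
    print(f"{s:12s} expected liability ${v:,.0f}")
print(f"{'total':12s} expected liability ${total:,.0f}")
```

    Stage-level expected costs like these are the kind of quantity that can then be traded off against EHS risk scores in a multicriteria comparison of risk-management actions.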

  5. Analysis of Wien filter spectra from Hall thruster plumes.

    PubMed

    Huang, Wensheng; Shastry, Rohit

    2015-07-01

    A method for analyzing the Wien filter spectra obtained from the plumes of Hall thrusters is derived and presented. The new method extends upon prior work by deriving the integration equations for the current and species fractions. Wien filter spectra from the plume of the NASA-300M Hall thruster are analyzed with the presented method and the results are used to examine key trends. The new integration method is found to produce results slightly different from the traditional area-under-the-curve method. The use of different velocity distribution forms when performing curve-fits to the peaks in the spectra is compared. Additional comparison is made with the scenario where the current fractions are assumed to be proportional to the heights of peaks. The comparison suggests that the calculated current fractions are not sensitive to the choice of form as long as both the height and width of the peaks are accounted for. Conversely, forms that only account for the height of the peaks produce inaccurate results. Also presented are the equations for estimating the uncertainty associated with applying curve fits and charge-exchange corrections. These uncertainty equations can be used to plan the geometry of the experimental setup.
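
    The contrast between height-only and area-based (height and width) current-fraction estimates can be reproduced with two synthetic Gaussian peaks. The peak parameters are illustrative, not NASA-300M data:

```python
import numpy as np

def trapz(y, x):
    """Trapezoidal integration, kept explicit to avoid NumPy version differences."""
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

v = np.linspace(0.0, 100.0, 2001)          # probe-voltage axis (arbitrary units)

def gauss(v, h, mu, sigma):
    return h * np.exp(-0.5 * ((v - mu) / sigma) ** 2)

# Hypothetical fitted peaks: the second species is shorter but wider
peak1 = gauss(v, 1.0, 40.0, 3.0)
peak2 = gauss(v, 0.3, 57.0, 6.0)

# Height-only estimate of the two species' current fractions
h_frac = np.array([1.0, 0.3]) / 1.3

# Area-based estimate: both height and width of each peak contribute
areas = np.array([trapz(peak1, v), trapz(peak2, v)])
a_frac = areas / areas.sum()

print("height-only fractions:", np.round(h_frac, 3))
print("area-based fractions: ", np.round(a_frac, 3))
```

    The wider second peak carries a larger share of the current than its height alone suggests, which is the inaccuracy of height-only forms noted in the abstract.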

  6. Uncertain Environmental Footprint of Current and Future Battery Electric Vehicles.

    PubMed

    Cox, Brian; Mutel, Christopher L; Bauer, Christian; Mendoza Beltran, Angelica; van Vuuren, Detlef P

    2018-04-17

    The future environmental impacts of battery electric vehicles (EVs) are very important given their expected dominance in future transport systems. Previous studies have shown these impacts to be highly uncertain, though a detailed treatment of this uncertainty is still lacking. We help to fill this gap by using Monte Carlo and global sensitivity analysis to quantify parametric uncertainty and also consider two additional factors that have not yet been addressed in the field. First, we include changes to driving patterns due to the introduction of autonomous and connected vehicles. Second, we deeply integrate scenario results from the IMAGE integrated assessment model into our life cycle database to include the impacts of changes to the electricity sector on the environmental burdens of producing and recharging future EVs. Future EVs are expected to have 45-78% lower climate change impacts than current EVs. Electricity used for charging is the largest source of variability in results, though vehicle size, lifetime, driving patterns, and battery size also strongly contribute to variability. We also show that it is imperative to consider changes to the electricity sector when calculating upstream impacts of EVs, as without this, results could be overestimated by up to 75%.
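
    A toy Monte Carlo with a rank-correlation sensitivity measure shows the reported pattern of charging electricity dominating the variance. All distributions below are invented for illustration and are not the study's inventory data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000

# Hypothetical parameter distributions
ci_elec = rng.uniform(50.0, 800.0, n)       # electricity carbon intensity, g CO2-eq/kWh
use = rng.normal(0.18, 0.02, n)             # electricity use, kWh/km
battery = rng.normal(4000.0, 800.0, n)      # battery production, kg CO2-eq
glider = rng.normal(8000.0, 1000.0, n)      # rest-of-vehicle production, kg CO2-eq
lifetime = rng.uniform(150e3, 250e3, n)     # vehicle lifetime, km

# Life-cycle climate impact per km driven
gwp = ci_elec * use + (battery + glider) * 1000.0 / lifetime   # g CO2-eq/km

def rank_r2(x, y):
    """Squared Spearman-style rank correlation as a crude sensitivity index."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1] ** 2

for name, x in [("elec CI", ci_elec), ("use", use), ("battery", battery),
                ("glider", glider), ("lifetime", lifetime)]:
    print(f"{name:8s} rank R2 = {rank_r2(x, gwp):.2f}")
```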

  7. The Advanced Modeling, Simulation and Analysis Capability Roadmap Vision for Engineering

    NASA Technical Reports Server (NTRS)

    Zang, Thomas; Lieber, Mike; Norton, Charles; Fucik, Karen

    2006-01-01

    This paper summarizes a subset of the Advanced Modeling Simulation and Analysis (AMSA) Capability Roadmap that was developed for NASA in 2005. The AMSA Capability Roadmap Team was chartered to "identify what is needed to enhance NASA's capabilities to produce leading-edge exploration and science missions by improving engineering system development, operations, and science understanding through broad application of advanced modeling, simulation and analysis techniques." The AMSA roadmap stressed the need for integration, not just within the science, engineering and operations domains themselves, but also across these domains. Here we discuss the roadmap element pertaining to integration within the engineering domain, with a particular focus on implications for future observatory missions. The AMSA products supporting the system engineering function are mission information, bounds on information quality, and system validation guidance. The Engineering roadmap element contains 5 sub-elements: (1) Large-Scale Systems Models, (2) Anomalous Behavior Models, (3) Advanced Uncertainty Models, (4) Virtual Testing Models, and (5) Space-Based Robotics Manufacture and Servicing Models.

  8. Expected value analysis for integrated supplier selection and inventory control of multi-product inventory system with fuzzy cost

    NASA Astrophysics Data System (ADS)

    Sutrisno, Widowati, Tjahjana, R. Heru

    2017-12-01

    Future costs in many industrial problems are inherently uncertain, so a mathematical analysis that accommodates uncertain costs is needed. In this article, we apply fuzzy expected value analysis to an integrated supplier selection and inventory control problem for a multi-product inventory system in which the uncertain costs are modeled as fuzzy variables. We formulate the problem as a fuzzy expected value based quadratic optimization with a total cost objective function and solve it using expected value based fuzzy programming. In the numerical examples, the supplier selection problem was solved: the optimal supplier was selected for each time period, the optimal volume of each product to purchase from each supplier in each time period was determined, and the product stock level was controlled so that it tracked the given reference level.
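
    The expected-value step can be sketched with triangular fuzzy costs, using the common credibility-based expected value (a + 2b + c)/4 of a triangular fuzzy number. The suppliers, costs, and demand below are hypothetical, and the full quadratic program is reduced to a one-period cost comparison:

```python
# Hypothetical triangular fuzzy unit costs (low, mode, high) for two suppliers
fuzzy_cost = {"S1": (9.0, 10.0, 12.0), "S2": (8.0, 11.0, 13.0)}

def expected_value(tri):
    """Credibility-based expected value of a triangular fuzzy number: (a + 2b + c) / 4."""
    a, b, c = tri
    return (a + 2 * b + c) / 4

demand = 100  # units required this period

ev = {s: expected_value(t) for s, t in fuzzy_cost.items()}
best = min(ev, key=ev.get)
total = ev[best] * demand
print(f"expected unit costs: {ev}")
print(f"select {best} with expected total cost {total:.1f}")
```

    Defuzzifying first turns the fuzzy cost minimization into an ordinary deterministic optimization, which is the role the expected value plays in the full model.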

  9. Hydrologic and geochemical data assimilation at the Hanford 300 Area

    NASA Astrophysics Data System (ADS)

    Chen, X.; Hammond, G. E.; Murray, C. J.; Zachara, J. M.

    2012-12-01

    In modeling the uranium migration within the Integrated Field Research Challenge (IFRC) site at the Hanford 300 Area, uncertainties arise from both hydrologic and geochemical sources. The hydrologic uncertainty includes the transient flow boundary conditions induced by dynamic variations in Columbia River stage and the underlying heterogeneous hydraulic conductivity field, while the geochemical uncertainty is a result of limited knowledge of the geochemical reaction processes and parameters, as well as heterogeneity in uranium source terms. In this work, multiple types of data, including the results from constant-injection tests, borehole flowmeter profiling, and conservative tracer tests, are sequentially assimilated across scales within a Bayesian framework to reduce the hydrologic uncertainty. The hydrologic data assimilation is then followed by geochemical data assimilation, where the goal is to infer the heterogeneous distribution of uranium sources using uranium breakthrough curves from a desorption test that took place at a high spring water table. We demonstrate in our study that ensemble-based data assimilation techniques (Ensemble Kalman filter and smoother) are efficient in integrating multiple types of data sequentially for uncertainty reduction. The computational demand is managed by using the multi-realization capability within the parallel PFLOTRAN simulator.
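
    The core of the ensemble-based techniques mentioned is the Ensemble Kalman filter analysis step, sketched here on a toy three-component state with one observed location. The numbers are illustrative, not Hanford data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy state: a quantity at 3 locations; only location 0 is observed
n_ens, n_state = 200, 3
truth = np.array([1.0, 0.5, -0.2])
ens = truth + rng.normal(0.0, 1.0, (n_ens, n_state))   # prior ensemble

obs_err = 0.1
y_obs = truth[0] + rng.normal(0.0, obs_err)

# EnKF analysis step: Kalman gain built from the sample ensemble covariance
H = np.array([[1.0, 0.0, 0.0]])                  # observation operator
X = ens - ens.mean(axis=0)
P = X.T @ X / (n_ens - 1)                        # sample covariance (3 x 3)
S = H @ P @ H.T + obs_err**2                     # innovation covariance (1 x 1)
K = P @ H.T / S                                  # Kalman gain (3 x 1)

# Perturbed-observation update of every ensemble member
obs_pert = y_obs + rng.normal(0.0, obs_err, n_ens)
ens_post = ens + (obs_pert - ens[:, 0])[:, None] * K.T

print("prior mean:    ", np.round(ens.mean(axis=0), 3))
print("posterior mean:", np.round(ens_post.mean(axis=0), 3))
```

    The posterior ensemble tightens around the observation at the measured location, and the sample cross-covariances in the gain propagate the update to the unobserved components.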

  10. Assessing the multidimensionality of coastal erosion risks: public participation and multicriteria analysis in a Mediterranean coastal system.

    PubMed

    Roca, Elisabet; Gamboa, Gonzalo; Tàbara, J David

    2008-04-01

    The complex and multidimensional nature of coastal erosion risks makes it necessary to move away from the single-perspective assessment and management methods that have conventionally predominated in coastal management. This article explores the suitability of participatory multicriteria analysis (MCA) for improving the integration of diverse expertise and values and enhancing the social-ecological robustness of the processes that lead to the definition of relevant policy options to deal with those risks. We test this approach in the Mediterranean coastal locality of Lido de Sète in France. Results show that more adaptive alternatives such as "retreating the shoreline" were preferred by our selected stakeholders to those corresponding to "protecting the shoreline" and the business-as-usual proposals traditionally put forward by experts and policymakers on these matters. Participative MCA helped represent coastal multidimensionality, elicited and integrated different views and preferences, facilitated knowledge exchange, and highlighted existing uncertainties.

  11. Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site

    NASA Astrophysics Data System (ADS)

    Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.

    2012-04-01

    The aim of this work is to develop a methodology for integrating climate change scenarios, and especially their precipitation component, into quantitative hazard assessment. The effects of climate change will differ depending on both the location of the site and the type of landslide considered, since mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches offer a route to quantitative landslide susceptibility and hazard modeling. However, as the quantity and quality of data are generally very heterogeneous at a regional scale, it is necessary to take the uncertainty into account in the analysis. In this perspective, a new hazard modeling method was developed and integrated in a program named ALICE. The program integrates mechanical stability analysis through GIS software, taking data uncertainty into account. The method provides a quantitative classification of landslide hazard and offers a useful tool to improve speed and efficiency in hazard mapping. However, expert judgment is still necessary to finalize the maps, as it is the only way to take into account some influential factors in slope stability, such as the heterogeneity of the geological formations or the effects of human interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios, and especially their precipitation component, into the ALICE program with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, a land-cover map, geology, geotechnical data, and so forth, the program classifies hazard zones depending on geotechnics and different hydrological contexts varying in time.
    This communication, realized within the framework of the SafeLand project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
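
    The kind of mechanical stability calculation such a program performs can be illustrated with the standard infinite-slope factor of safety under Monte Carlo parameter uncertainty; rainfall scenarios enter through the saturation ratio. The soil parameters below are hypothetical, not Barcelonnette values:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50000

# Infinite-slope stability with seepage parallel to the slope (hypothetical soil)
slope = np.radians(30.0)                      # slope angle
gamma = 19.0e3                                # soil unit weight, N/m^3
gamma_w = 9.81e3                              # water unit weight, N/m^3
z = 2.0                                       # failure-surface depth, m
phi = np.radians(rng.normal(32.0, 3.0, n))    # friction angle (uncertain)
c = rng.lognormal(np.log(5e3), 0.4, n)        # cohesion, Pa (uncertain)
m = rng.uniform(0.0, 1.0, n)                  # saturation ratio (rainfall-driven)

# Factor of safety: resisting (cohesion + frictional) over driving shear stress
fs = (c + (gamma - m * gamma_w) * z * np.cos(slope)**2 * np.tan(phi)) \
     / (gamma * z * np.sin(slope) * np.cos(slope))

p_fail = np.mean(fs < 1.0)
print(f"mean FS = {fs.mean():.2f}, P(FS < 1) = {p_fail:.3f}")
```

    Repeating this with saturation distributions derived from climate-scenario precipitation gives a probabilistic hazard class per map cell, which is the spirit of the GIS-coupled analysis described above.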

  12. QUANTIFYING UNCERTAINTY DUE TO RANDOM ERRORS FOR MOMENT ANALYSES OF BREAKTHROUGH CURVES

    EPA Science Inventory

    The uncertainty in moments calculated from breakthrough curves (BTCs) is investigated as a function of random measurement errors in the data used to define the BTCs. The method presented assumes moments are calculated by numerical integration using the trapezoidal rule, and is t...
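
    The trapezoidal-rule moment calculation and its sensitivity to random measurement errors can be sketched directly; the breakthrough curve and noise level below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(5)

def trapz(y, x):
    """Trapezoidal integration, kept explicit to avoid NumPy version differences."""
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0

# Synthetic breakthrough curve (BTC): Gaussian pulse centered at t = 20
t = np.linspace(0.0, 50.0, 201)
c_true = np.exp(-0.5 * ((t - 20.0) / 4.0) ** 2)

def mean_arrival(t, c):
    """First temporal moment (mean arrival time) by trapezoidal integration."""
    m0 = trapz(c, t)
    m1 = trapz(t * c, t)
    return m1 / m0

# Propagate random measurement error through the moment estimate by Monte Carlo
sigma = 0.02
est = np.array([mean_arrival(t, c_true + rng.normal(0.0, sigma, t.size))
                for _ in range(2000)])
print(f"noise-free estimate {mean_arrival(t, c_true):.2f},"
      f" Monte Carlo sd {est.std():.3f}")
```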

  13. Inexact nonlinear improved fuzzy chance-constrained programming model for irrigation water management under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping

    2018-01-01

    An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal not only with nonlinearities in the objective function, but also with uncertainties presented as discrete intervals in the objective function, variables, and left-hand-side constraints, and fuzziness in the right-hand-side constraints. Moreover, this model improves upon conventional fuzzy chance-constrained programming by introducing a linear combination of possibility measure and necessity measure with varying preference parameters. To demonstrate its applicability, the model is then applied to a case study in the middle reaches of the Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions for the whole growth period under uncertainty, so that more flexible solutions can be generated for optimal irrigation water allocation. The variation of results can be examined by specifying different confidence levels and preference parameters, which also reveals interrelationships among system benefits, preference parameters, confidence levels, and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former is capable of reflecting more complexities and uncertainties in practical application. These results can provide a more reliable scientific basis for supporting irrigation water management in arid areas.

  14. Target Uncertainty Mediates Sensorimotor Error Correction

    PubMed Central

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M.

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects’ scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one’s response. By suggesting that subjects’ decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated. PMID:28129323

  15. Target Uncertainty Mediates Sensorimotor Error Correction.

    PubMed

    Acerbi, Luigi; Vijayakumar, Sethu; Wolpert, Daniel M

    2017-01-01

    Human movements are prone to errors that arise from inaccuracies in both our perceptual processing and execution of motor commands. We can reduce such errors by both improving our estimates of the state of the world and through online error correction of the ongoing action. Two prominent frameworks that explain how humans solve these problems are Bayesian estimation and stochastic optimal feedback control. Here we examine the interaction between estimation and control by asking if uncertainty in estimates affects how subjects correct for errors that may arise during the movement. Unbeknownst to participants, we randomly shifted the visual feedback of their finger position as they reached to indicate the center of mass of an object. Even though participants were given ample time to compensate for this perturbation, they only fully corrected for the induced error on trials with low uncertainty about center of mass, with correction only partial in trials involving more uncertainty. The analysis of subjects' scores revealed that participants corrected for errors just enough to avoid significant decrease in their overall scores, in agreement with the minimal intervention principle of optimal feedback control. We explain this behavior with a term in the loss function that accounts for the additional effort of adjusting one's response. By suggesting that subjects' decision uncertainty, as reflected in their posterior distribution, is a major factor in determining how their sensorimotor system responds to error, our findings support theoretical models in which the decision making and control processes are fully integrated.
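
    One simple Bayesian reading of this result (a sketch, not the paper's actual model) is a correction gain that scales with the precision of the target estimate relative to a fixed second noise or cost term, so that low target uncertainty yields near-full correction and high uncertainty yields partial correction:

```python
# Fraction of an induced feedback error that an idealized observer corrects,
# as a function of the uncertainty (sd) of the target estimate.
def correction_gain(sigma_target, sigma_other=0.5):
    """Precision-weighted correction: confident target estimates drive fuller correction."""
    prec_target = 1.0 / sigma_target**2
    prec_other = 1.0 / sigma_other**2
    return prec_target / (prec_target + prec_other)

for s in (0.1, 0.5, 1.0, 2.0):   # increasing center-of-mass uncertainty
    print(f"sigma_target = {s:>4}: correct {100 * correction_gain(s):.0f}% of the error")
```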

  16. An adaptive approach to invasive plant management on U.S. Fish and Wildlife Service-owned native prairies in the Prairie Pothole Region: decision support under uncertainty

    USGS Publications Warehouse

    Gannon, Jill J.; Moore, Clinton T.; Shaffer, Terry L.; Flanders-Wanner, Bridgette

    2011-01-01

    Much of the native prairie managed by the U.S. Fish and Wildlife Service (Service) in the Prairie Pothole Region (PPR) is extensively invaded by the introduced cool-season grasses smooth brome (Bromus inermis) and Kentucky bluegrass (Poa pratensis). The central challenge to managers is selecting appropriate management actions in the face of biological and environmental uncertainties. We describe the technical components of a USGS management project, and explain how the components integrate and inform each other, how data feedback from individual cooperators serves to reduce uncertainty across the whole region, and how a successful adaptive management project is coordinated and maintained on a large scale. In partnership with the Service, the U.S. Geological Survey is developing an adaptive decision support framework to assist managers in selecting management actions under uncertainty and maximizing learning from management outcomes. The framework is built around practical constraints faced by refuge managers and includes identification of the management objective and strategies, analysis of uncertainty and construction of competing decision models, monitoring, and mechanisms for model feedback and decision selection. Nineteen Service field stations, spanning four states of the PPR, are participating in the project. They share a common management objective, available management strategies, and biological uncertainties. While the scope is broad, the project interfaces with individual land managers who provide refuge-specific information and receive updated decision guidance that incorporates understanding gained from the collective experience of all cooperators.
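
    The model-feedback mechanism in such a framework can be sketched as Bayesian updating of weights on competing decision models as monitoring outcomes arrive. The two models and the outcome data below are hypothetical illustrations:

```python
import numpy as np

# Prior credence in two competing decision models (A and B)
weights = np.array([0.5, 0.5])

# Each model predicts the probability that a treatment reduces invasive cover
p_success = np.array([0.7, 0.3])    # model A optimistic, model B pessimistic

# Monitoring outcomes from successive management cycles: success (1) or failure (0)
for outcome in [1, 1, 0, 1]:
    like = p_success if outcome == 1 else 1.0 - p_success
    weights = weights * like        # Bayes update ...
    weights = weights / weights.sum()   # ... and renormalize

print("posterior model weights:", np.round(weights, 3))
```

    Pooling outcomes from many field stations, as the project does, accelerates exactly this kind of weight convergence and hence the reduction of biological uncertainty.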

  17. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  19. Scalable Methods for Uncertainty Quantification, Data Assimilation and Target Accuracy Assessment for Multi-Physics Advanced Simulation of Light Water Reactors

    NASA Astrophysics Data System (ADS)

    Khuwaileh, Bassam

    High fidelity simulation of nuclear reactors entails large scale applications characterized by high dimensionality and tremendous complexity, in which various physics models are integrated in the form of coupled models (e.g., neutronics with thermal-hydraulic feedback). Each of the coupled modules represents a high fidelity formulation of the first principles governing the physics of interest. Therefore, new developments in high fidelity multi-physics simulation and the corresponding sensitivity/uncertainty quantification analysis are paramount to the development and competitiveness of reactors, achieved through enhanced understanding of the design and safety margins. Accordingly, this dissertation introduces scalable algorithms for performing efficient Uncertainty Quantification (UQ), Data Assimilation (DA) and Target Accuracy Assessment (TAA) for large scale, multi-physics reactor design and safety problems. This dissertation builds upon previous efforts in adaptive core simulation and reduced order modeling algorithms and extends them towards coupled multi-physics models with feedback. The core idea is to recast the reactor physics analysis in terms of reduced order models. This can be achieved by identifying the important/influential degrees of freedom (DoF) via subspace analysis, such that the required analysis can be recast in terms of the important DoF only. In this dissertation, efficient algorithms for lower dimensional subspace construction have been developed for single physics and multi-physics applications with feedback. The reduced subspace is then used to solve realistic, large scale forward (UQ) and inverse (DA and TAA) problems. Once the elite set of DoF is determined, the uncertainty/sensitivity/target accuracy assessment and data assimilation analysis can be performed accurately and efficiently for large scale, high dimensional multi-physics nuclear engineering applications. 
Hence, in this work a Karhunen-Loeve (KL) based algorithm previously developed to quantify the uncertainty of single physics models is extended to large scale multi-physics coupled problems with feedback effects. Moreover, a non-linear surrogate based UQ approach is developed, applied, and compared against the performance of the KL approach and a brute force Monte Carlo (MC) approach. In addition, an efficient Data Assimilation (DA) algorithm is developed to assimilate information about the model's parameters: nuclear data cross-sections and thermal-hydraulics parameters. Two improvements are introduced in order to perform DA on high dimensional problems. First, a goal-oriented surrogate model can be used to replace the original models in the depletion sequence (MPACT - COBRA-TF - ORIGEN). Second, approximating the complex and high dimensional solution space with a lower dimensional subspace makes the sampling process necessary for DA tractable for high dimensional problems. Moreover, safety analysis and design optimization depend on the accurate prediction of various reactor attributes, and predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. Accordingly, an inverse problem can be defined and solved to assess the contributions from the various sources of uncertainty, and experimental effort can subsequently be directed to reduce the uncertainty associated with these sources. In this dissertation a subspace-based, gradient-free, nonlinear algorithm for inverse uncertainty quantification, namely Target Accuracy Assessment (TAA), has been developed and tested. The ideas proposed in this dissertation were first validated using lattice physics applications simulated with the SCALE6.1 package (Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) lattice models). 
Ultimately, the algorithms proposed here were applied to perform UQ and DA for assembly level (CASL Progression Problem Number 6) and core wide problems representing Watts Bar Nuclear 1 (WBN1) for cycle 1 of depletion (CASL Progression Problem Number 9), modeled using VERA-CS, which consists of several coupled multi-physics models. The analysis and algorithms developed in this dissertation were encoded and implemented in a newly developed toolkit, the Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE).
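    As a toy illustration of the DoF-screening idea (not the dissertation's subspace algorithm, which builds Karhunen-Loeve expansions over coupled models), a one-at-a-time sensitivity screen can identify the handful of inputs that dominate a response, after which the analysis can be restricted to those inputs. The model and cutoff below are invented for illustration:

    ```python
    def influential_dofs(model, nominal, delta=0.01, keep_fraction=0.95):
        """Rank input degrees of freedom by one-at-a-time sensitivity and keep
        the smallest set capturing `keep_fraction` of the summed response change."""
        base = model(nominal)
        scores = []
        for i, x in enumerate(nominal):
            pert = list(nominal)
            pert[i] = x * (1.0 + delta)          # relative perturbation of DoF i
            scores.append((abs(model(pert) - base), i))
        scores.sort(reverse=True)
        total = sum(s for s, _ in scores) or 1.0
        kept, acc = [], 0.0
        for s, i in scores:
            kept.append(i)
            acc += s
            if acc / total >= keep_fraction:
                break
        return kept

    # Toy response dominated by two of five inputs:
    model = lambda x: 10.0 * x[0] + 5.0 * x[1] + 0.1 * x[2] + 0.01 * x[3] + 0.01 * x[4]
    print(influential_dofs(model, [1.0] * 5))    # prints [0, 1]
    ```

    Real subspace methods use randomized sampling and decomposition rather than one-at-a-time screening, but the payoff is the same: downstream UQ and DA operate on a few influential DoF instead of the full parameter space.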

  20. Simulating and quantifying legacy topographic data uncertainty: an initial step to advancing topographic change analyses

    NASA Astrophysics Data System (ADS)

    Wasklewicz, Thad; Zhu, Zhen; Gares, Paul

    2017-12-01

    Rapid technological advances, sustained funding, and a greater recognition of the value of topographic data have helped develop an increasing archive of topographic data sources. Advances in basic and applied research related to Earth surface changes require researchers to integrate recent high-resolution topography (HRT) data with legacy datasets. Several technical challenges and data uncertainty issues persist when integrating legacy datasets with more recent HRT data. The disparate data sources required to extend the topographic record back in time are often stored in formats that are not readily compatible with more recent HRT data. Legacy data may also contain unknown or unreported error that makes accounting for data uncertainty difficult. There are also cases of known deficiencies in legacy datasets, which can significantly bias results. Finally, scientists are faced with the daunting challenge of definitively deriving the extent to which a landform or landscape has changed, or will continue to change, in response to natural and/or anthropogenic processes. Here, we examine the question: how do we evaluate and portray data uncertainty from the varied topographic legacy sources, and combine this uncertainty with current spatial data collection techniques to detect meaningful topographic changes? We view topographic uncertainty as a stochastic process that takes into consideration spatial and temporal variations from a numerical simulation and a physical modeling experiment. The numerical simulation incorporates topographic data sources spanning the range from typical legacy data to present high-resolution data, while the physical model focuses on more recent HRT data acquisition techniques. Elevation uncertainties observed at anchor points in the digital terrain models are modeled as "states" in a stochastic estimator. 
Stochastic estimators trace the temporal evolution of the uncertainties and are natively capable of incorporating sensor measurements observed at various times in history. The geometric relationship between an anchor point and a sensor measurement can be approximated via spatial correlation even when the sensor does not directly observe the anchor point. Findings from the numerical simulation indicate the estimated error coincides with the actual error for certain sensors (kinematic GNSS, ALS, TLS, and SfM-MVS). Data from 2D imagery and static GNSS did not perform as well at the time the sensor is integrated into the estimator, largely as a result of the low density of data added from these sources. The estimator provides a history of DEM estimation as well as the uncertainties and cross correlations observed at anchor points. Our work provides preliminary evidence that our approach is valid for integrating legacy data with HRT and warrants further exploration and field validation.
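    The anchor-point "states" idea can be illustrated with a minimal scalar Kalman update, in which each new survey measurement shrinks the elevation variance at a point. The elevations and variances below are invented for illustration:

    ```python
    def kalman_update(elev, var, z, z_var):
        """Fuse one elevation measurement z (variance z_var) into the
        current state estimate (elev, var); returns the updated pair."""
        k = var / (var + z_var)          # Kalman gain
        return elev + k * (z - elev), (1.0 - k) * var

    # Anchor point starts from a legacy DEM (large variance), then absorbs
    # a modern high-resolution survey measurement (small variance):
    elev, var = 100.0, 4.0               # legacy: 100 m, sigma^2 = 4 m^2
    elev, var = kalman_update(elev, var, 100.8, 0.04)
    print(round(elev, 3), round(var, 4))  # prints 100.792 0.0396
    ```

    Because the legacy variance is large, the update moves the state almost all the way to the modern measurement; a full estimator would additionally propagate cross correlations between neighboring anchor points.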

  1. Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler

    2016-01-01

    An analysis was performed to determine the measurement uncertainty of the Mach number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations involved, a Monte Carlo method was utilized for this uncertainty analysis. A summary of the findings is presented as it pertains to understanding what the uncertainties are, how they impact various research tests in the facility, and how the uncertainties might be reduced in the future.
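    A minimal sketch of Monte Carlo uncertainty propagation for a Mach number computed from pressure measurements, using the standard isentropic relation; the pressures and instrument sigmas are hypothetical, not the facility's values:

    ```python
    import math, random, statistics

    def mach_from_pressures(p0, p, gamma=1.4):
        """Isentropic Mach number from total (p0) and static (p) pressure."""
        return math.sqrt(2.0 / (gamma - 1.0) *
                         ((p0 / p) ** ((gamma - 1.0) / gamma) - 1.0))

    random.seed(1)
    # Hypothetical measurements and 1-sigma instrument uncertainties (psia):
    p0, sig_p0 = 30.0, 0.05
    p,  sig_p  = 10.0, 0.05
    # Draw perturbed inputs, push each draw through the data reduction equation:
    samples = [mach_from_pressures(random.gauss(p0, sig_p0), random.gauss(p, sig_p))
               for _ in range(20_000)]
    print(f"M = {statistics.mean(samples):.4f} +/- {statistics.stdev(samples):.4f}")
    ```

    The spread of the output samples is the propagated uncertainty; correlated error sources, which the paper must handle explicitly, would be drawn jointly rather than independently.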

  2. Methodology for qualitative uncertainty assessment of climate impact indicators

    NASA Astrophysics Data System (ADS)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals give users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of the different sources of uncertainty associated with a specific climate impact indicator and how these sources affect the overall 'degree of confidence' in that indicator. To meet users' requirements for the effective communication of uncertainties, their feedback was incorporated throughout the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. 
and Jacob, D.: Robustness of Ensemble Climate Projections Analyzed with Climate Signal Maps: Seasonal and Extreme Precipitation for Germany, Atmosphere (Basel)., 6(5), 677-698, doi:10.3390/atmos6050677, 2015.

  3. Benchmarking observational uncertainties for hydrology (Invited)

    NASA Astrophysics Data System (ADS)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, in order to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and have shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control their magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), the error characteristics measured (e.g. standard error, confidence bounds) and the error magnitude. Our results were primarily split by data type: rainfall uncertainty was controlled most strongly by spatial scale, while flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where the relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular comparative/regionalisation and multi-objective analyses. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. 
Recently it has become more common for hydrologists to use multiple data types and sources within a single study. This may be driven by complex water management questions which integrate water quantity, quality and ecology; or by recognition of the value of auxiliary data to understand hydrological processes. We discuss briefly the impact of data uncertainty on the increasingly popular use of diagnostic signatures for hydrological process understanding and model development.

  4. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

    This study was conducted to investigate the correlations among uncertainty, mastery and appraisal of uncertainty in mothers of hospitalized children. Self-report questionnaires were used to measure the variables: uncertainty, mastery and appraisal of uncertainty. For data analysis, the SPSSWIN 12.0 program was used for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery correlated negatively with uncertainty (r = -.444, p = .000) and with danger appraisal of uncertainty (r = -.514, p = .000). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in these mothers. Therefore, nursing interventions which improve mastery should be developed for mothers of hospitalized children.

  5. Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler

    2016-01-01

    This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
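    The random/systematic/total decomposition reported here follows the usual pattern of combining the two standard uncertainties by root-sum-square and expanding to a coverage interval; a minimal sketch, with placeholder numbers rather than results from the analysis:

    ```python
    import math

    def total_uncertainty(systematic, random_, coverage=2.0):
        """Combine systematic (b) and random (s) standard uncertainties by
        root-sum-square, then expand with a coverage factor (k = 2 ~ 95%)."""
        combined = math.sqrt(systematic ** 2 + random_ ** 2)
        return coverage * combined

    # Hypothetical Mach-number uncertainties (not the facility's values):
    b, s = 0.0020, 0.0015    # systematic and random standard uncertainties
    print(f"U95 = {total_uncertainty(b, s):.4f}")  # prints: U95 = 0.0050
    ```

    Reporting the two components separately, as the paper does, shows whether improvement effort should target calibration (systematic) or repeatability (random).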

  6. A Novel Robust H∞ Filter Based on Krein Space Theory in the SINS/CNS Attitude Reference System

    PubMed Central

    Yu, Fei; Lv, Chongyang; Dong, Qianhui

    2016-01-01

    Owing to their numerous merits, such as compactness, autonomy and independence, the strapdown inertial navigation system (SINS) and the celestial navigation system (CNS) can be used in marine applications. Moreover, because complementary navigation information is obtained from two different kinds of sensors, the accuracy of the SINS/CNS integrated navigation system can be enhanced appreciably. Thus, the SINS/CNS system is widely used in the marine navigation field. However, the CNS is easily disturbed by its surroundings, which can make its output discontinuous; the resulting uncertainty caused by lost measurements reduces the system accuracy. In this paper, a robust H∞ filter based on Krein space theory is proposed. Krein space theory is introduced first, and then the linear state and observation models of the SINS/CNS integrated navigation system are established. By taking the measurement-loss uncertainty into account, a new robust H∞ filter is proposed to improve the robustness of the integrated system. Finally, this robust filter is evaluated through numerical simulations and actual experiments. The simulation and experimental results show that attitude errors can be reduced effectively by the proposed robust filter when measurements are intermittently missing. Compared to the traditional Kalman filter (KF), the accuracy of the SINS/CNS integrated system is improved, verifying the robustness and availability of the proposed robust H∞ filter. PMID:26999153

  7. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

    Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.

  8. Industrial water resources management based on violation risk analysis of the total allowable target on wastewater discharge.

    PubMed

    Yue, Wencong; Cai, Yanpeng; Xu, Linyu; Yang, Zhifeng; Yin, Xin'An; Su, Meirong

    2017-07-11

    To improve the capabilities of conventional methodologies in facilitating industrial water allocation under uncertain conditions, an integrated approach was developed through the combination of operational research, uncertainty analysis, and violation risk analysis methods. The developed approach can (a) address the complexities of industrial water resources management (IWRM) systems, (b) facilitate reflection of the multiple uncertainties and risks of the system and incorporate them into a general optimization framework, and (c) support robust decisions for industrial production in consideration of water supply capacity and wastewater discharge control. The method was then demonstrated in a water-stressed city (the City of Dalian) in northeastern China. Three scenarios were proposed according to the city's industrial plans. The results indicated that in the planning year of 2020 (a) the production of civilian-used steel ships and machine-made paper and paperboard would reduce significantly, (b) the violation risk of chemical oxygen demand (COD) discharge under scenario 1 would be the most prominent, compared with those under scenarios 2 and 3, (c) the maximal total economic benefit under scenario 2 would be higher than that under scenario 3, and (d) the production of rolling contact bearings, rail vehicles, and commercial vehicles would be promoted.

  9. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    NASA Astrophysics Data System (ADS)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., error in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative in order to improve the reliability of modeling results. Uncertainty analysis must also address difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of the parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted with four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R2), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.56, R2 > 0.91, NSE > 0.89, and 0.18
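    Of the four algorithms, GLUE is the simplest to sketch: sample parameter sets from prior ranges, score each set against observations with a likelihood measure (here the Nash-Sutcliffe efficiency), and retain the "behavioural" sets above a threshold. The toy linear model, priors, and threshold below are illustrative, not from the study:

    ```python
    import random

    def glue(model, observed, priors, n=5000, threshold=0.7, seed=42):
        """Generalized Likelihood Uncertainty Estimation (GLUE), minimal form:
        sample parameters from uniform priors, keep 'behavioural' sets whose
        Nash-Sutcliffe efficiency (NSE) exceeds the threshold."""
        rng = random.Random(seed)
        mean_obs = sum(observed) / len(observed)
        denom = sum((o - mean_obs) ** 2 for o in observed)
        behavioural = []
        for _ in range(n):
            params = [rng.uniform(lo, hi) for lo, hi in priors]
            sim = model(params)
            nse = 1.0 - sum((s - o) ** 2 for s, o in zip(sim, observed)) / denom
            if nse > threshold:
                behavioural.append((nse, params))
        return behavioural

    # Toy "hydrological" model y = a*x + b fitted to synthetic observations:
    xs = list(range(10))
    obs = [2.0 * x + 1.0 for x in xs]
    model = lambda p: [p[0] * x + p[1] for x in xs]
    sets = glue(model, obs, priors=[(0.0, 4.0), (0.0, 2.0)])
    print(len(sets), "behavioural parameter sets retained")
    ```

    The spread of the retained parameter sets (and of their simulated outputs) is GLUE's expression of parameter uncertainty; SUFI, ParaSol and PSO pursue the same goal with more structured search strategies.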

  10. Integration of Dakota into the NEAMS Workbench

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Lefebvre, Robert A.; Langley, Brandon R.

    2017-07-01

    This report summarizes a NEAMS (Nuclear Energy Advanced Modeling and Simulation) project focused on integrating Dakota into the NEAMS Workbench. The NEAMS Workbench, developed at Oak Ridge National Laboratory, is a new software framework that provides a graphical user interface, input file creation, parsing, validation, job execution, workflow management, and output processing for a variety of nuclear codes. Dakota is a tool developed at Sandia National Laboratories that provides a suite of uncertainty quantification and optimization algorithms. Providing Dakota within the NEAMS Workbench allows users of nuclear simulation codes to perform uncertainty and optimization studies on their nuclear codes from within a common, integrated environment. Details of the integration and parsing are provided, along with an example of Dakota running a sampling study on the fuels performance code, BISON, from within the NEAMS Workbench.
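    A Dakota sampling study of the kind described is driven by a plain-text input deck. The sketch below follows Dakota's block-keyword input style, but the variable names, bounds, and driver script are invented for illustration, and exact keyword spellings should be checked against the Dakota reference manual:

    ```
    # Hypothetical Dakota study: 50 LHS samples over two uncertain inputs
    environment
      tabular_data
        tabular_data_file = 'samples.dat'

    method
      sampling
        sample_type lhs
        samples = 50
        seed = 1234

    variables
      uniform_uncertain = 2
        descriptors  'fuel_conductivity'  'gap_thickness'
        lower_bounds  2.5                  40.0
        upper_bounds  3.5                  60.0

    interface
      fork
        analysis_drivers = 'run_bison.sh'   # hypothetical wrapper around the simulation

    responses
      response_functions = 1
      descriptors = 'peak_fuel_temperature'
      no_gradients
      no_hessians
    ```

    The Workbench integration generates and validates decks of this shape, launches the driver for each sample, and collects the tabulated responses for post-processing.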

  11. UNCERTAINTY AND SENSITIVITY ANALYSES FOR INTEGRATED HUMAN HEALTH AND ECOLOGICAL RISK ASSESSMENT OF HAZARDOUS WASTE DISPOSAL

    EPA Science Inventory

    While there is a high potential for exposure of humans and ecosystems to chemicals released from hazardous waste sites, the degree to which this potential is realized is often uncertain. Conceptually divided among parameter, model, and modeler uncertainties imparted during simula...

  12. Identification of sensitive parameters in the modeling of SVOC reemission processes from soil to atmosphere.

    PubMed

    Loizeau, Vincent; Ciffroy, Philippe; Roustan, Yelva; Musson-Genon, Luc

    2014-09-15

    Semi-volatile organic compounds (SVOCs) are subject to long-range atmospheric transport because of successive transport-deposition-reemission processes. Several experimental data available in the literature suggest that soil is a non-negligible contributor of SVOCs to the atmosphere. Coupling soil and atmosphere in integrated models and simulating reemission processes can therefore be essential for estimating the atmospheric concentrations of several pollutants. However, the sources of uncertainty and variability are multiple (soil properties, meteorological conditions, chemical-specific parameters) and can significantly influence the determination of reemissions. In order to identify the key parameters in reemission modeling and their effect on global modeling uncertainty, we conducted a sensitivity analysis targeted on the 'reemission' output variable. Different parameters were tested, including soil properties, partition coefficients and meteorological conditions. We performed an EFAST sensitivity analysis for four chemicals (benzo-a-pyrene, hexachlorobenzene, PCB-28 and lindane) and different spatial scenarios (regional and continental scales). The partition coefficients between the air, solid and water phases are influential, depending on the precision of the data and the global behavior of the chemical. Reemissions showed lower sensitivity to soil parameters (soil organic matter and water contents at field capacity and wilting point); a mapping of these parameters at a regional scale is sufficient to correctly estimate reemissions when compared to other sources of uncertainty. Copyright © 2014 Elsevier B.V. All rights reserved.
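    EFAST estimates variance-based sensitivity indices via a frequency decomposition; a much cruder Monte Carlo "pick-freeze" estimator of the same first-order index conveys the idea. The toy linear response below stands in for the reemission model and is not from the paper:

    ```python
    import random

    def first_order_index(model, n_inputs, i, n=20_000, seed=0):
        """First-order variance-based sensitivity of input i, estimated with a
        simple pick-freeze Monte Carlo scheme (a crude stand-in for EFAST)."""
        rng = random.Random(seed)
        ya, yb = [], []
        for _ in range(n):
            a = [rng.random() for _ in range(n_inputs)]
            b = [rng.random() for _ in range(n_inputs)]
            b[i] = a[i]                  # freeze input i between the two draws
            ya.append(model(a))
            yb.append(model(b))
        mean = sum(ya) / n
        var = sum((y - mean) ** 2 for y in ya) / n
        cov = sum((p - mean) * (q - mean) for p, q in zip(ya, yb)) / n
        return cov / var

    # Toy reemission-like response dominated by a partition coefficient x0:
    model = lambda x: 5.0 * x[0] + 1.0 * x[1] + 0.5 * x[2]
    s0 = first_order_index(model, 3, 0)
    print(f"S1 for x0 = {s0:.2f}  (analytic value for this toy model: 25/26.25 = 0.95)")
    ```

    An input with a first-order index near one, like x0 here, is the kind of parameter the study flags as needing precise data, while low-index inputs can be mapped coarsely.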

  13. Helping Water Utilities Grapple with Climate Change

    NASA Astrophysics Data System (ADS)

    Yates, D.; Gracely, B.; Miller, K.

    2008-12-01

    The Water Research Foundation (WRF), serving the drinking water industry, and the National Center for Atmospheric Research (NCAR) are collaborating on an effort to develop and implement locally-relevant, structured processes to help water utilities consider the impacts that climate variability and change might have on their water systems, and the corresponding adaptation options. Adopting a case-study approach, the structured process includes 1) a problem definition phase, focused on identifying goals, information needs, utility vulnerabilities and possible adaptation options in the face of climate and hydrologic uncertainty; 2) developing and/or modifying system-specific Integrated Water Resource Management (IWRM) models and conducting sensitivity analysis to identify critical variables; 3) developing probabilistic climate change scenarios focused on exploring uncertainties identified as important in the sensitivity analysis of step 2; and 4) implementing the structured process and examining approaches to decision making under uncertainty. Collaborators include eight drinking water utilities and two state agencies. The utilities are 1) the Inland Empire Utility Agency, CA; 2) the El Dorado Irrigation District, Placerville, CA; 3) the Portland Water Bureau, Portland, OR; 4) Colorado Springs Utilities, Colorado Springs, CO; 5) Cincinnati Water, Cincinnati, OH; 6) the Massachusetts Water Resources Authority (MWRA), Boston, MA; 7) Durham Water, Durham, NC; and 8) Palm Beach County Water (PBCW), Palm Beach, FL. The state agencies are the California Department of Water Resources and the Colorado Water Conservation Board.

  14. Communicating microarray results of uncertain clinical significance in consultation summary letters and implications for practice.

    PubMed

    Paul, Jean Lillian; Pope-Couston, Rachel; Wake, Samantha; Burgess, Trent; Tan, Tiong Yang

    2016-01-01

    Letter-writing is an integral practice for genetic health professionals. In Victoria, Australia, patients with a chromosomal variant of uncertain clinical significance (VUS) referred to a clinical geneticist (CG) for evaluation receive consultation summary letters. While communication of uncertainty has been explored in research to some extent, little has focused on how uncertainty is communicated within consultation letters. We aimed to develop a multi-layered understanding of the ways in which CGs communicate diagnostic uncertainty in consultation summary letters. We used theme-oriented discourse analysis of 49 consultation summary letters and thematic analysis of a focus group involving eight CGs. Results showed that CGs have become more confident in their description of VUS as 'contributing factors' to patients' clinical features, but remain hesitant to assign definitive causality. CGs displayed strong epistemic stance when discussing future technological improvements to provide hope and minimise potentially disappointing outcomes for patients and families. CGs reported feeling overwhelmed by their workload associated with increasing numbers of patients with VUS, and this has led to a reduction in the number of review appointments offered over time. This study provides a rich description of the content and process of summary letters discussing VUS. Our findings have implications for letter-writing and workforce management. Furthermore, these findings may be of relevance to VUS identified by genomic sequencing in clinical practice.

  15. Assessing and reporting uncertainties in dietary exposure analysis: Mapping of uncertainties in a tiered approach.

    PubMed

    Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David

    2015-08-01

    Uncertainty analysis is an important component of dietary exposure assessments, needed to correctly understand the strengths and limits of their results. Often, standard screening procedures are applied in a first step, which yields conservative estimates. If those screening procedures indicate a potential exceedance of health-based guidance values, more refined models are applied within the tiered approach. However, the sources and types of uncertainty in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of the different sources and types of uncertainty, to better understand how uncertainty analysis can best be used to generate a more realistic understanding of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, the parameters and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. The analysis identifies general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
    Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  17. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    NASA Astrophysics Data System (ADS)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
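
    The per-cell uncertainty surfaces described above can be propagated into inundation modeling by Monte Carlo perturbation of the DEM. The sketch below is a minimal illustration of that idea, assuming Gaussian vertical errors; the elevations, uncertainties, and water level are hypothetical, not NCEI values.

```python
import random

def inundation_probability(elev, sigma, water_level, n_draws=2000, seed=0):
    """Monte Carlo estimate of the probability that a DEM cell lies below
    a given water level, given its estimated elevation and per-cell
    vertical uncertainty (1-sigma, assumed Gaussian)."""
    rng = random.Random(seed)
    below = sum(1 for _ in range(n_draws)
                if rng.gauss(elev, sigma) < water_level)
    return below / n_draws

# Hypothetical cells: (elevation m, 1-sigma vertical uncertainty m)
cells = [(0.5, 0.3), (1.2, 0.5), (3.0, 0.4)]
probs = [inundation_probability(e, s, water_level=1.0) for e, s in cells]
```

    A cell well below the water level maps to a probability near 1, a cell well above it to near 0, and the per-cell uncertainty controls how sharp that transition is.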

  18. Durability reliability analysis for corroding concrete structures under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
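
    The purely probabilistic treatment of epistemic uncertainty can be illustrated with a double-loop Monte Carlo: an outer loop samples the poorly known distribution parameters (epistemic), and an inner loop evaluates the failure probability (aleatory), so the output is a distribution of failure probabilities rather than a single value. A minimal sketch with illustrative chloride numbers, not values from the paper:

```python
import random

def failure_prob(mu_c, sigma_c, c_crit, n=1000, rng=None):
    """Inner (aleatory) loop: P(chloride load exceeds the critical level)."""
    rng = rng or random.Random(0)
    return sum(rng.gauss(mu_c, sigma_c) > c_crit for _ in range(n)) / n

def epistemic_band(n_outer=200, seed=1):
    """Outer (epistemic) loop: the mean chloride load is itself uncertain
    because of limited data, yielding a band of failure probabilities."""
    rng = random.Random(seed)
    pfs = []
    for _ in range(n_outer):
        mu_c = rng.uniform(1.5, 2.5)   # epistemic: poorly known mean load
        pfs.append(failure_prob(mu_c, sigma_c=0.4, c_crit=2.4, rng=rng))
    pfs.sort()
    return pfs[int(0.05 * n_outer)], pfs[int(0.95 * n_outer)]  # 5th/95th pct

low, high = epistemic_band()
```

    The width of the resulting band is a direct measure of how much the epistemic uncertainty governs the reliability estimate.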

  19. One Health Economics to confront disease threats

    PubMed Central

    Machalaba, Catherine; Smith, Kristine M; Awada, Lina; Berry, Kevin; Berthe, Franck; Bouley, Timothy A; Bruce, Mieghan; Cortiñas Abrahantes, Jose; El Turabi, Anas; Feferholtz, Yasha; Flynn, Louise; Fournié, Giullaume; Andre, Amanda; Grace, Delia; Jonas, Olga; Kimani, Tabitha; Le Gall, François; Miranda, Juan Jose; Peyre, Marisa; Pinto, Julio; Ross, Noam; Rüegg, Simon R; Salerno, Robert H; Seifman, Richard; Zambrana-Torrelio, Carlos; Karesh, William B

    2017-01-01

    Abstract Global economic impacts of epidemics suggest high return on investment in prevention and One Health capacity. However, such investments remain limited, contributing to persistent endemic diseases and vulnerability to emerging ones. An interdisciplinary workshop explored methods for country-level analysis of the added value of One Health approaches to disease control. Key recommendations include: 1. systems thinking to identify risks and mitigation options for decision-making under uncertainty; 2. multisectoral economic impact assessment to identify wider relevance and possible resource-sharing; and 3. consistent integration of environmental considerations. Economic analysis offers a congruent measure of value complementing diverse impact metrics among sectors and contexts. PMID:29044367

  20. Continuous quantum measurements and the action uncertainty principle

    NASA Astrophysics Data System (ADS)

    Mensky, Michael B.

    1992-09-01

    The path-integral approach to the quantum theory of continuous measurements has been developed in preceding works of the author. According to this approach, the measurement amplitude determining the probabilities of different outputs of the measurement can be evaluated in the form of a restricted path integral (a path integral "in finite limits"). With the help of the measurement amplitude, the maximum deviation of measurement outputs from the classical one can be easily determined. The aim of the present paper is to express this variance in the simpler and more transparent form of a specific uncertainty principle (called the action uncertainty principle, AUP). The simplest (but weak) form of the AUP is δS ≳ ℏ, where S is the action functional. It can be applied for a simple derivation of the Bohr-Rosenfeld inequality for the measurability of the gravitational field. A stronger form of the AUP, with wider application (for ideal measurements performed in the quantum regime), is |∫_{t'}^{t''} (δS[q]/δq(t)) Δq(t) dt| ≃ ℏ, where the paths [q] and [Δq] stand, respectively, for the measurement output and the measurement error. It can also be presented in the symbolic form Δ(Equation) · Δ(Path) ≃ ℏ. This means that the deviation of the observed (measured) motion from the motion obeying the classical equation is inversely proportional to the uncertainty in the path (the latter uncertainty resulting from the measurement error). A consequence of the AUP is that improving the measurement precision beyond the threshold of the quantum regime decreases the information resulting from the measurement.

  1. Influence of national culture on the adoption of integrated and problem-based curricula in Europe.

    PubMed

    Jippes, Mariëlle; Majoor, Gerard D

    2008-03-01

    There is an evident imbalance in the frequency of medical schools with problem-based learning (PBL) curricula in northern versus southern Europe. This study explores the hypothesis that national culture influences the flexibility of (medical) schools in terms of their propensity to adopt integrated and PBL curricula. National culture was defined by a country's scores on indexes for 4 dimensions of culture as described by Hofstede: power distance; individualism/collectivism; masculinity/femininity; and uncertainty avoidance. Non-integrated medical curricula were defined as those that included courses in 2 of the 3 basic sciences (anatomy, biochemistry and physiology) in the first 2 years; otherwise, by exclusion, curricula were assumed to be integrated. The medical curricula of 134 of the 263 schools in the 17 European countries included in Hofstede's study were examined. Correlations were calculated between the percentage of integrated medical curricula in a country and that country's scores on indexes for each of the 4 dimensions of culture. Significant negative correlations were found between the percentage of integrated curricula and scores on the power distance index (correlation coefficient [CC]: -0.692; P = 0.002) and the uncertainty avoidance index (CC: -0.704; P = 0.002). No significant correlations were found between the percentage of integrated curricula and scores on the indexes for individualism/collectivism and masculinity/femininity. A (medical) school which is considering adopting an integrated or PBL curriculum and which is based in a country with a high score on Hofstede's power distance index and/or uncertainty avoidance index must a priori design strategies to reduce or overcome the obstructive effects of these dimensions of culture on the school's organisation.
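
    The correlation analysis reported above can be reproduced in miniature with a plain Pearson coefficient. The country data below are invented for illustration; only the computation mirrors the study's method.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: (uncertainty-avoidance index, % integrated curricula)
data = [(23, 90), (35, 75), (53, 60), (70, 40), (86, 30), (100, 15)]
uai = [d[0] for d in data]
pct = [d[1] for d in data]
r = pearson(uai, pct)
```

    A strongly negative r, as in the study's reported CC of -0.704 for uncertainty avoidance, indicates that higher index scores accompany fewer integrated curricula.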

  2. Grid Integration of Offshore Wind | Wind | NREL

    Science.gov Websites

    [Photograph: a wind turbine in the ocean, located about 10 kilometers off the coast of Arklow, Ireland.] Much can be learned from the existing land-based integration research for handling the variability and uncertainty of the wind resource

  3. The Prediction-Focused Approach: An opportunity for hydrogeophysical data integration and interpretation

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Klepikova, Maria; Dassargues, Alain; Caers, Jef

    2017-04-01

    Hydrogeophysics is an interdisciplinary field of sciences aiming at a better understanding of subsurface hydrological processes. While geophysical surveys have been successfully used to qualitatively characterize the subsurface, two important challenges remain for a better quantification of hydrological processes: (1) the inversion of geophysical data and (2) their integration in hydrological subsurface models. The classical inversion approach using regularization suffers from spatially and temporally varying resolution and yields geologically unrealistic solutions without uncertainty quantification, making their utilization for hydrogeological calibration less consistent. More advanced techniques such as coupled inversion allow for a direct use of geophysical data for conditioning groundwater and solute transport model calibration. However, the technique is difficult to apply in complex cases and remains computationally demanding to estimate uncertainty. In a recent study, we investigated a prediction-focused approach (PFA) to directly estimate subsurface physical properties from geophysical data, circumventing the need for classic inversions. In PFA, we seek a direct relationship between the data and the subsurface variables we want to predict (the forecast). This relationship is obtained through a prior set of subsurface models for which both data and forecast are computed. A direct relationship can often be derived through dimension reduction techniques. PFA offers a framework for both hydrogeophysical "inversion" and hydrogeophysical data integration. For hydrogeophysical "inversion", the considered forecast variable is the subsurface variable, such as the salinity. An ensemble of possible solutions is generated, allowing uncertainty quantification. For hydrogeophysical data integration, the forecast variable becomes the prediction we want to make with our subsurface models, such as the concentration of contaminant in a drinking water production well. 
Geophysical and hydrological data are combined to derive a direct relationship between data and forecast. We illustrate the process for the design of an aquifer thermal energy storage (ATES) system. An ATES system can theoretically recover in winter the heat stored in the aquifer during summer. In practice, the energy efficiency is often lower than expected due to spatial heterogeneity of hydraulic properties combined with an unfavorable hydrogeological gradient. A proper design of ATES systems should consider the uncertainty of the prediction related to those parameters. With a global sensitivity analysis, we identify sensitive parameters for heat storage prediction and validate the use of a short-term heat tracing experiment monitored with geophysics to generate informative data. First, we illustrate how PFA can be used to successfully derive the distribution of temperature in the aquifer from ERT during the heat tracing experiment. Then, we successfully integrate the geophysical data to predict medium-term heat storage in the aquifer using PFA. The result is a full quantification of the posterior distribution of the prediction conditioned to observed data in a relatively limited time budget.
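
    The PFA shortcut, regressing the forecast directly on the data over a prior ensemble instead of inverting for subsurface properties, can be sketched with toy forward models. Both forward relationships and all numbers below are assumptions for illustration only, not the study's physics.

```python
import random
from math import exp

def forward_data(k):
    """Toy geophysical observable: e.g. a tracer arrival time that
    shortens as hydraulic conductivity k increases (assumed form)."""
    return 10.0 / k

def forward_forecast(k):
    """Toy prediction variable: e.g. a thermal recovery fraction
    that saturates with k (assumed form)."""
    return 1.0 - exp(-0.3 * k)

rng = random.Random(42)
prior = [rng.uniform(0.5, 5.0) for _ in range(300)]  # prior ensemble of k
d = [forward_data(k) for k in prior]
h = [forward_forecast(k) for k in prior]

# The PFA shortcut: fit data -> forecast directly (no inversion for k)
n = len(d)
md, mh = sum(d) / n, sum(h) / n
slope = (sum((x - md) * (y - mh) for x, y in zip(d, h))
         / sum((x - md) ** 2 for x in d))
intercept = mh - slope * md

d_obs = 4.0                       # a hypothetical field observation
h_pred = intercept + slope * d_obs
```

    A real application would replace the one-dimensional regression with a dimension-reduction step (e.g. principal components of the full data and forecast time series), but the ensemble-then-regress structure is the same.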

  4. Methodology to explore interactions between the water system and society in order to identify adaptation strategies

    NASA Astrophysics Data System (ADS)

    Offermans, A. G. E.; Haasnoot, M.

    2009-04-01

    Development of sustainable water management strategies involves analysing current and future vulnerability, identification of adaptation possibilities, effect analysis and evaluation of the strategies under different possible futures. Recent studies on water management often followed the pressure-effect chain and compared the state of social, economic and ecological functions of the water systems in one or two future situations with the current situation. The future is, however, more complex and dynamic. Water management faces major challenges in coping with future uncertainties in both the water system and the social system. Uncertainties in our water system relate to (changes in) drivers and pressures and their effects on the state, like the effects of climate change on discharges. Uncertainties in the social world relate to changes in perceptions, objectives and demands concerning water (management), which are often related to the aforementioned changes in the physical environment. The methodology presented here comprises the 'Perspectives method', derived from Cultural Theory, a method for analyzing and classifying social responses to social and natural states and pressures. The method will be used for scenario analysis and to identify social responses, including changes in perspectives and management strategies. The scenarios and responses will be integrated within a rapid assessment tool. The purpose of the tool is to provide users with insight into the interaction of the social and physical system and to identify robust water management strategies by analysing their effectiveness under different possible futures on the physical, social and socio-economic system. This method allows for a mutual interaction between the physical and social system. We will present the theoretical background of the perspectives method as well as a historical overview of perspective changes in the Dutch Meuse area to show how social and physical systems interrelate. 
We will also show how the integration of both can contribute to the identification of robust water management strategies.

  5. Slip Potential of Faults in the Fort Worth Basin

    NASA Astrophysics Data System (ADS)

    Hennings, P.; Osmond, J.; Lund Snee, J. E.; Zoback, M. D.

    2017-12-01

    Similar to other areas of the south-central United States, the Fort Worth Basin of NE Texas has experienced an increase in the rate of seismicity, which has been attributed to injection of wastewater in deep saline aquifers. To assess the hazard of induced seismicity in the basin we have integrated new data on the location and character of previously known and unknown faults, stress state, and pore pressure to produce an assessment of fault slip potential which can be used to investigate prior and ongoing earthquake sequences and for development of mitigation strategies. We have assembled data on faults in the basin from published sources, 2D and 3D seismic data, and interpretations provided by petroleum operators to yield a 3D fault model with 292 faults ranging in strike length from 0.4 to 116 km. The faults have mostly normal geometries, all cut the disposal intervals, and most are presumed to cut into the underlying crystalline and metamorphic basement. Analysis of outcrops along the SW flank of the basin assists with geometric characterization of the fault systems. The interpretation of stress state comes from integration of wellbore image and sonic data, reservoir stimulation data, and earthquake focal mechanisms. The orientation of SHmax is generally uniform across the basin but the stress style changes from more strike-slip in the NE part of the basin to normal faulting in the SW part. Estimates of pore pressure come from a basin-scale hydrogeologic model history-matched to injection test data. With these deterministic inputs and appropriate ranges of uncertainty we assess the conditional probability that faults in our 3D model might slip via Mohr-Coulomb reactivation in response to increases in injection-related pore pressure. A key component of the analysis is constraining the uncertainties associated with each of the principal parameters. Many of the faults in the model are interpreted to be critically stressed within reasonable ranges of uncertainty.
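
    The conditional-probability assessment can be sketched as a Mohr-Coulomb screening with uncertain friction: resolve the principal stresses onto a fault plane, subtract pore pressure, and count Monte Carlo samples that cross the failure line. The stress values and friction range below are illustrative, not Fort Worth Basin parameters.

```python
import random
from math import radians, sin, cos

def resolved_stresses(s1, s3, theta_deg):
    """Normal and shear stress on a plane whose normal is theta degrees
    from the maximum principal stress (2-D Mohr construction)."""
    t2 = radians(2 * theta_deg)
    sn = 0.5 * (s1 + s3) + 0.5 * (s1 - s3) * cos(t2)
    tau = 0.5 * (s1 - s3) * sin(t2)
    return sn, tau

def slip_probability(s1, s3, theta_deg, p0, dp, n=5000, seed=7):
    """Conditional probability of Mohr-Coulomb reactivation after a pore
    pressure increase dp, with the friction coefficient treated as
    uncertain (all stresses in MPa; numbers are illustrative)."""
    rng = random.Random(seed)
    sn, tau = resolved_stresses(s1, s3, theta_deg)
    slips = sum(tau > rng.uniform(0.55, 0.75) * (sn - p0 - dp)
                for _ in range(n))
    return slips / n

p_before = slip_probability(s1=80.0, s3=50.0, theta_deg=60.0, p0=30.0, dp=0.0)
p_after  = slip_probability(s1=80.0, s3=50.0, theta_deg=60.0, p0=30.0, dp=8.0)
```

    A fault that is stable over the whole sampled friction range before injection can acquire a substantial slip probability once the effective normal stress is reduced by the pressure increase.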

  6. An interdisciplinary approach to volcanic risk reduction under conditions of uncertainty: a case study of Tristan da Cunha

    NASA Astrophysics Data System (ADS)

    Hicks, A.; Barclay, J.; Simmons, P.; Loughlin, S.

    2014-07-01

    The uncertainty brought about by intermittent volcanic activity is fairly common at volcanoes worldwide. While better knowledge of any one volcano's behavioural characteristics has the potential to reduce this uncertainty, the subsequent reduction of risk from volcanic threats is only realised if that knowledge is pertinent to stakeholders and effectively communicated to inform good decision making. Success requires integration of methods, skills and expertise across disciplinary boundaries. This research project develops and trials a novel interdisciplinary approach to volcanic risk reduction on the remote volcanic island of Tristan da Cunha (South Atlantic). For the first time, volcanological techniques, probabilistic decision support and social scientific methods were integrated in a single study. New data were produced that (1) established no spatio-temporal pattern to recent volcanic activity; (2) quantified the high degree of scientific uncertainty around future eruptive scenarios; (3) analysed the physical vulnerability of the community as a consequence of their geographical isolation and exposure to volcanic hazards; (4) evaluated social and cultural influences on vulnerability and resilience; and (5) evaluated the effectiveness of a scenario planning approach, both as a method for integrating the different strands of the research and as a way of enabling on-island decision makers to take ownership of risk identification and management, and capacity building within their community. The paper provides empirical evidence of the value of an innovative interdisciplinary framework for reducing volcanic risk. It also provides evidence for the strength that comes from integrating social and physical sciences with the development of effective, tailored engagement and communication strategies in volcanic risk reduction.

  7. A novel integrated approach for the hazardous radioactive dust source terms estimation in future nuclear fusion power plants.

    PubMed

    Poggi, L A; Malizia, A; Ciparisse, J F; Gaudio, P

    2016-10-01

    An open issue still under investigation by several international entities working in the safety and security field for the foreseen nuclear fusion reactors is the estimation of source terms that are a hazard to operators and the public, and to the machine itself in terms of efficiency and integrity in case of severe accident scenarios. Source term estimation is a crucial safety issue to be addressed in future reactor safety assessments, and the estimates available at this time are not satisfactory. The lack of neutronic data, along with the insufficiently accurate methodologies used until now, calls for an integrated methodology for source term estimation that can provide predictions with adequate accuracy. This work proposes a complete methodology to estimate dust source terms, starting from broad information gathering. The large number of parameters that can influence dust source term production is reduced with statistical tools using a combination of screening, sensitivity analysis, and uncertainty analysis. Finally, a preliminary and simplified methodology for predicting dust source term production in future devices is presented.

  8. Geophysical data integration, stochastic simulation and significance analysis of groundwater responses using ANOVA in the Chicot Aquifer system, Louisiana, USA

    USGS Publications Warehouse

    Rahman, A.; Tsai, F.T.-C.; White, C.D.; Carlson, D.A.; Willson, C.S.

    2008-01-01

    Data integration is challenging where there are different levels of support between primary and secondary data that need to be correlated in various ways. A geostatistical method is described, which integrates hydraulic conductivity (K) measurements and electrical resistivity data to better estimate the K distribution in the Upper Chicot Aquifer of southwestern Louisiana, USA. The K measurements were obtained from pumping tests and represent the primary (hard) data. Borehole electrical resistivity data from electrical logs were regarded as the secondary (soft) data, and were used to infer K values through Archie's law and the Kozeny-Carman equation. A pseudo cross-semivariogram was developed to cope with the non-collocation of the resistivity data. Uncertainties in the auto-semivariograms and pseudo cross-semivariogram were quantified. The groundwater flow model responses from the regionalized and coregionalized models of K were compared using analysis of variance (ANOVA). The results indicate that non-collocated secondary data may improve estimates of K and affect groundwater flow responses of practical interest, including specific capacity and drawdown. © Springer-Verlag 2007.
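
    The resistivity-to-K chain (Archie's law, then Kozeny-Carman) can be sketched directly. The Archie constants, grain size, and fluid properties below are common textbook defaults, not values calibrated for the Chicot Aquifer.

```python
def porosity_from_resistivity(r_formation, r_water, a=1.0, m=2.0):
    """Invert Archie's law, R0 = a * Rw * phi**(-m), for porosity.
    a and m are empirical constants; the values here are common defaults."""
    return (a * r_water / r_formation) ** (1.0 / m)

def hydraulic_conductivity(phi, d_grain=2e-4, rho_g_over_mu=9.81e6):
    """Kozeny-Carman estimate: permeability k = d^2 phi^3 / (180 (1-phi)^2),
    converted to hydraulic conductivity K = k * rho*g/mu (m/s).
    Grain size (m) and the water constant rho*g/mu (1/(m*s)) are illustrative."""
    k = d_grain ** 2 * phi ** 3 / (180.0 * (1.0 - phi) ** 2)
    return k * rho_g_over_mu

# Hypothetical log reading: formation resistivity 40 ohm-m, water 2.5 ohm-m
phi = porosity_from_resistivity(r_formation=40.0, r_water=2.5)
K = hydraulic_conductivity(phi)
```

    Each inferred K then enters the cokriging as soft data, with its own (larger) error model, rather than being treated as equivalent to a pumping-test measurement.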

  9. Model based inference from microvascular measurements: Combining experimental measurements and model predictions using a Bayesian probabilistic approach

    PubMed Central

    Rasmussen, Peter M.; Smith, Amy F.; Sakadžić, Sava; Boas, David A.; Pries, Axel R.; Secomb, Timothy W.; Østergaard, Leif

    2017-01-01

    Objective In vivo imaging of the microcirculation and network-oriented modeling have emerged as powerful means of studying microvascular function and understanding its physiological significance. Network-oriented modeling may provide the means of summarizing vast amounts of data produced by high-throughput imaging techniques in terms of key physiological indices. To estimate such indices with sufficient certainty, however, network-oriented analysis must be robust to the inevitable presence of uncertainty due to measurement errors as well as model errors. Methods We propose the Bayesian probabilistic data analysis framework as a means of integrating experimental measurements and network model simulations into a combined and statistically coherent analysis. The framework naturally handles noisy measurements and provides posterior distributions of model parameters as well as physiological indices, with associated uncertainty. Results We applied the analysis framework to experimental data from three rat mesentery networks and one mouse brain cortex network. We inferred distributions for more than five hundred unknown pressure and hematocrit boundary conditions. Model predictions were consistent with previous analyses, and remained robust when measurements were omitted from model calibration. Conclusion Our Bayesian probabilistic approach may be suitable for optimizing data acquisition and for analyzing and reporting large datasets acquired as part of microvascular imaging studies. PMID:27987383
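
    For a single unknown boundary condition with a Gaussian prior and Gaussian measurement noise, the Bayesian update has a closed form. This conjugate-normal sketch is a one-parameter caricature of the framework above; the numbers are invented.

```python
def normal_posterior(mu0, var0, obs, var_obs):
    """Conjugate normal update: combine a prior N(mu0, var0) on an unknown
    boundary value with one noisy measurement obs ~ N(truth, var_obs).
    Returns the posterior mean and variance."""
    precision = 1.0 / var0 + 1.0 / var_obs
    post_var = 1.0 / precision
    post_mu = post_var * (mu0 / var0 + obs / var_obs)
    return post_mu, post_var

# Hypothetical boundary pressure: prior 60 +/- 10, measurement 48 +/- 5
mu, var = normal_posterior(mu0=60.0, var0=100.0, obs=48.0, var_obs=25.0)
```

    The posterior mean falls between prior and measurement, weighted by their precisions, and the posterior variance is smaller than either; the full framework does the same jointly over hundreds of correlated boundary conditions via simulation rather than in closed form.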

  10. SCALE Code System 6.2.2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T.; Jessee, Matthew Anderson

    The SCALE Code System is a widely used modeling and simulation suite for nuclear safety analysis and design that is developed, maintained, tested, and managed by the Reactor and Nuclear Systems Division (RNSD) of Oak Ridge National Laboratory (ORNL). SCALE provides a comprehensive, verified and validated, user-friendly tool set for criticality safety, reactor physics, radiation shielding, radioactive source term characterization, and sensitivity and uncertainty analysis. Since 1980, regulators, licensees, and research institutions around the world have used SCALE for safety analysis and design. SCALE provides an integrated framework with dozens of computational modules including 3 deterministic and 3 Monte Carlo radiation transport solvers that are selected based on the desired solution strategy. SCALE includes current nuclear data libraries and problem-dependent processing tools for continuous-energy (CE) and multigroup (MG) neutronics and coupled neutron-gamma calculations, as well as activation, depletion, and decay calculations. SCALE includes unique capabilities for automated variance reduction for shielding calculations, as well as sensitivity and uncertainty analysis. SCALE's graphical user interfaces assist with accurate system modeling, visualization of nuclear data, and convenient access to desired results. SCALE 6.2 represents one of the most comprehensive revisions in the history of SCALE, providing several new capabilities and significant improvements in many existing features.

  11. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.

  12. The OLI Radiometric Scale Realization Round Robin Measurement Campaign

    NASA Technical Reports Server (NTRS)

    Cutlip, Hansford; Cole, Jerold; Johnson, B. Carol; Maxwell, Stephen; Markham, Brian; Ong, Lawrence; Hom, Milton; Biggar, Stuart

    2011-01-01

    A round robin radiometric scale realization was performed at the Ball Aerospace Radiometric Calibration Laboratory in January/February 2011 in support of the Operational Land Imager (OLI) Program. Participants included Ball Aerospace, NIST, NASA Goddard Space Flight Center, and the University of Arizona. The eight-day campaign included multiple observations of three integrating sphere sources by nine radiometers. The objective of the campaign was to validate the radiance calibration uncertainty ascribed to the integrating sphere used to calibrate the OLI instrument. The instrument-level calibration source uncertainty was validated by quantifying: (1) the long-term stability of the NIST-calibrated radiance artifact, (2) the responsivity scale of the Ball Aerospace transfer radiometer and (3) the operational characteristics of the large integrating sphere.

  13. A probabilistic sizing tool and Monte Carlo analysis for entry vehicle ablative thermal protection systems

    NASA Astrophysics Data System (ADS)

    Mazzaracchio, Antonio; Marchetti, Mario

    2010-03-01

    Implicit ablation and thermal response software was developed to analyse and size charring ablative thermal protection systems for entry vehicles. A statistical monitor integrated into the tool, which uses the Monte Carlo technique, allows a simulation to run over stochastic series. This performs an uncertainty and sensitivity analysis, which estimates the probability of maintaining the temperature of the underlying material within specified requirements. This approach and the associated software are primarily helpful during the preliminary design phases of spacecraft thermal protection systems. They are proposed as an alternative to traditional approaches, such as the Root-Sum-Square method. The developed tool was verified by comparing the results with those from previous work on thermal protection system probabilistic sizing methodologies, which are based on an industry standard high-fidelity ablation and thermal response program. New case studies were analysed to establish thickness margins on sizing heat shields that are currently proposed for vehicles using rigid aeroshells for future aerocapture missions at Neptune, and identifying the major sources of uncertainty in the material response.
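
    The contrast with the Root-Sum-Square method can be made concrete: RSS combines 3-sigma contributions in quadrature, while Monte Carlo estimates the same tail percentile empirically and, unlike RSS, extends naturally to non-Gaussian or correlated inputs. The thickness-margin contributions below are illustrative, not from the paper.

```python
import random
from math import sqrt

# Independent thickness-margin contributions (cm, 1-sigma), illustrative
sigmas = [0.30, 0.20, 0.15, 0.10]

# Root-Sum-Square margin: combine 3-sigma contributions in quadrature
rss_margin = 3.0 * sqrt(sum(s * s for s in sigmas))

# Monte Carlo: sample each contribution and take the 99.87th percentile,
# the one-sided equivalent of +3 sigma for a Gaussian total
rng = random.Random(3)
n = 20000
totals = sorted(sum(rng.gauss(0.0, s) for s in sigmas) for _ in range(n))
mc_margin = totals[int(0.9987 * n)]
```

    With independent Gaussian inputs the two margins agree; the Monte Carlo route earns its extra cost when the ablation response is nonlinear in its inputs, which is the situation the tool addresses.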

  14. Development of a management tool for reservoirs in Mediterranean environments based on uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Gómez-Beas, R.; Moñino, A.; Polo, M. J.

    2012-05-01

    In compliance with the development of the Water Framework Directive, there is a need for an integrated management of water resources, which involves the elaboration of reservoir management models. These models should include the operational and technical aspects which allow us to forecast an optimal management in the short term, besides the factors that may affect the volume of water stored in the medium and long term. The climate fluctuations of the water cycle that affect the reservoir watershed should be considered, as well as the social and economic aspects of the area. This paper presents the development of a management model for the Rules reservoir (southern Spain), through which the water supply is regulated according to set criteria, in a way that is sustainable and consistent with existing commitments downstream; the supply capacity is established as a function of demand, together with the probability of failure when the operating requirements are not fulfilled. The results obtained allowed us to characterize the reservoir response at different time scales, to introduce an uncertainty analysis, and to demonstrate the potential of the methodology proposed here as a tool for decision making.

  15. Synthesizing Global and Local Datasets to Estimate Jurisdictional Forest Carbon Fluxes in Berau, Indonesia.

    PubMed

    Griscom, Bronson W; Ellis, Peter W; Baccini, Alessandro; Marthinus, Delon; Evans, Jeffrey S; Ruslandi

    2016-01-01

    Forest conservation efforts are increasingly being implemented at the scale of sub-national jurisdictions in order to mitigate global climate change and provide other ecosystem services. We see an urgent need for robust estimates of historic forest carbon emissions at this scale, as the basis for credible measures of climate and other benefits achieved. Despite the arrival of a new generation of global datasets on forest area change and biomass, confusion remains about how to produce credible jurisdictional estimates of forest emissions. We demonstrate a method for estimating the relevant historic forest carbon fluxes within the Regency of Berau in eastern Borneo, Indonesia. Our method integrates best available global and local datasets, and includes a comprehensive analysis of uncertainty at the regency scale. We find that Berau generated 8.91 ± 1.99 million tonnes of net CO2 emissions per year during 2000-2010. Berau is an early frontier landscape where gross emissions are 12 times higher than gross sequestration. Yet most (85%) of Berau's original forests are still standing. The majority of net emissions were due to conversion of native forests to unspecified agriculture (43% of total), oil palm (28%), and fiber plantations (9%). Most of the remainder was due to legal commercial selective logging (17%). Our overall uncertainty estimate offers an independent basis for assessing three other estimates for Berau; two of these estimates were above the upper end of our uncertainty range. We emphasize the importance of including an uncertainty range for all parameters of the emissions equation to generate a comprehensive uncertainty estimate, which has not been done before. We believe comprehensive estimates of carbon flux uncertainty are increasingly important as national and international institutions are challenged with comparing alternative estimates and identifying a credible range of historic emissions values.

  16. Developing a non-point source P loss indicator in R and its parameter uncertainty assessment using GLUE: a case study in northern China.

    PubMed

    Su, Jingjun; Du, Xinzhong; Li, Xuyong

    2018-05-16

Uncertainty analysis is an important prerequisite for model application. However, existing phosphorus (P) loss indexes and indicators have rarely been evaluated. This study applied the generalized likelihood uncertainty estimation (GLUE) method to assess the uncertainty of the parameters and modeling outputs of a non-point source (NPS) P indicator constructed in the R language, and also examined how the subjective choices of likelihood formulation and acceptability threshold in GLUE influence model outputs. The results indicated the following. (1) The parameters RegR2, RegSDR2, PlossDPfer, PlossDPman, DPDR, and DPR were highly sensitive to the overall TP simulation, and their value ranges could be reduced by GLUE. (2) The Nash efficiency likelihood (L1) appeared better able to accentuate high-likelihood simulations than the exponential function (L2). (3) The combined likelihood integrating the criteria of multiple outputs performed better than a single likelihood in model uncertainty assessment, reducing the uncertainty band widths while preserving the goodness of fit of the model outputs as a whole. (4) A value of 0.55 appeared to be a reasonable threshold to balance high modeling efficiency against high bracketing efficiency. The results of this study provide (1) an option for conducting NPS modeling on a single computing platform, (2) reference parameter settings for NPS model development in similar regions, (3) suggestions for applying the GLUE method in studies with different emphases according to research interests, and (4) insights into watershed P management in similar regions.
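A minimal sketch of the GLUE procedure the study applies, in Python rather than R, with a toy two-parameter model standing in for the NPS P indicator. The 0.55 threshold mirrors the value discussed in the abstract; everything else (model, priors, data) is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": output as a linear function of two parameters (a stand-in
# for the NPS indicator, whose real structure is not reproduced here)
def model(theta, x):
    return theta[0] * x + theta[1]

x_obs = np.linspace(0.0, 10.0, 20)
y_obs = 2.0 * x_obs + 1.0 + rng.normal(0, 0.5, x_obs.size)

# 1. Sample parameter sets from broad priors
n = 5000
thetas = np.column_stack([rng.uniform(0, 5, n), rng.uniform(-2, 4, n)])

# 2. Nash-Sutcliffe efficiency as the informal GLUE likelihood (the "L1" choice)
def nse(sim, obs):
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

scores = np.array([nse(model(t, x_obs), y_obs) for t in thetas])

# 3. Keep "behavioral" sets above the acceptability threshold (0.55 here)
behavioral = thetas[scores > 0.55]
weights = scores[scores > 0.55]
weights = weights / weights.sum()

# 4. Likelihood-weighted 5-95% uncertainty band for a prediction at x = 5
preds = np.array([model(t, 5.0) for t in behavioral])
order = np.argsort(preds)
cdf = np.cumsum(weights[order])
band = np.interp([0.05, 0.95], cdf, preds[order])
print("behavioral sets:", len(behavioral), "band:", band)
```

Tightening or loosening the 0.55 threshold trades bracketing efficiency (how often observations fall inside the band) against band width, which is exactly the subjective choice the study probes.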

  17. An analysis of the errors associated with the determination of atmospheric temperature from atmospheric pressure and density data

    NASA Technical Reports Server (NTRS)

    Minzner, R. A.

    1976-01-01

A graph was developed relating delta T/T, the relative uncertainty in atmospheric temperature T, to delta p/p, the relative uncertainty in the atmospheric pressure p, for situations when T is derived from the slope of the pressure-height profile. A similar graph relates delta T/T to delta rho/rho, the relative uncertainty in the atmospheric density rho, for those cases when T is derived from the downward integration of the density-height profile. A comparison of these two graphs shows that for equal uncertainties in the respective basic parameters, p or rho, smaller uncertainties in the derived temperatures are associated with density-height rather than with pressure-height data. The value of delta T/T is seen to depend not only upon delta p or delta rho, and to a small extent upon the value of T or the related scale height H, but also upon the inverse of delta h, the height increment between successive observations of p or rho. In the case of pressure-height data, delta T/T is dominated by 1/delta h for all values of delta h; for density-height data, delta T/T is dominated by delta rho/rho for delta h smaller than about 5 km. In the case of T derived from density-height data, the inverse relationship between delta T/T and delta h applies only for large values of delta h, that is, for delta h greater than about 35 km. No limit exists on the fineness of usable height resolution of T which may be derived from densities, while a fine height resolution in pressure-height data leads to temperatures with unacceptably large uncertainties.
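Why delta T/T is dominated by 1/delta h for pressure-height data follows from the hydrostatic relation T = g Δh / (R_d ln(p1/p2)): a thin layer has a small ln(p1/p2), so fixed relative pressure errors are amplified. A numerical sketch, assuming an illustrative 1 km layer and a 1% relative pressure uncertainty at each level:

```python
import numpy as np

g, R_d = 9.80665, 287.05   # m/s^2, J/(kg K) for dry air

def layer_T_from_pressure(p1, p2, dh):
    """Mean layer temperature from the slope of the pressure-height profile
    (hydrostatic balance + ideal gas, isothermal-layer approximation)."""
    return g * dh / (R_d * np.log(p1 / p2))

# Nominal layer: 1 km thick at T = 250 K
p1, dh, T_true = 850e2, 1000.0, 250.0
p2 = p1 * np.exp(-g * dh / (R_d * T_true))   # consistent upper-level pressure

rng = np.random.default_rng(1)
N = 50_000
rel_dp = 0.01   # assumed 1% relative pressure uncertainty at each level

T = layer_T_from_pressure(p1 * (1 + rel_dp * rng.standard_normal(N)),
                          p2 * (1 + rel_dp * rng.standard_normal(N)),
                          dh)
print(f"T = {T.mean():.1f} K, dT/T = {T.std() / T.mean():.3f}")
```

With a 1 km increment, ln(p1/p2) is only about 0.14, so a 1% pressure uncertainty at each level produces roughly a 10% temperature uncertainty; doubling delta h would halve it, which is the 1/delta h behavior the abstract describes.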

  18. Interval Estimation of Seismic Hazard Parameters

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, Beata; Lasocki, Stanislaw

    2017-03-01

The paper considers Poisson temporal occurrence of earthquakes and presents a way to integrate the uncertainties of the estimates of the mean activity rate and the magnitude cumulative distribution function into the interval estimation of the most widely used seismic hazard functions, such as the exceedance probability and the mean return period. The proposed algorithm can be used either when the Gutenberg-Richter model of magnitude distribution is accepted or when nonparametric estimation is in use. When the Gutenberg-Richter model of magnitude distribution is used, the interval estimation of its parameters is based on the asymptotic normality of the maximum likelihood estimator. When the nonparametric kernel estimation of magnitude distribution is used, we propose the iterated bias-corrected and accelerated method for interval estimation based on the smoothed bootstrap and second-order bootstrap samples. The changes resulting from the integrated approach to interval estimation of the seismic hazard functions, relative to the approach that neglects the uncertainty of the mean activity rate estimates, have been studied using Monte Carlo simulations and two real-dataset examples. The results indicate that the uncertainty of the mean activity rate significantly affects the interval estimates of the hazard functions only when the product of the activity rate and the time period for which the hazard is estimated is no more than 5.0. When this product becomes greater than 5.0, the impact of the uncertainty of the cumulative distribution function of magnitude dominates the impact of the uncertainty of the mean activity rate in the aggregated uncertainty of the hazard functions, and the interval estimates with and without inclusion of the uncertainty of the mean activity rate converge. The presented algorithm is generic and can also be applied to capture the propagation of the uncertainty of estimates that are parameters of a multiparameter function onto this function.
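A sketch of the kind of interval estimate the paper integrates, using the Poisson exceedance probability R(m, t) = 1 − exp(−λ t (1 − F(m))) with an unbounded Gutenberg-Richter CDF and an asymptotically normal activity-rate estimate. All numerical values are assumptions for illustration; λt = 5 here, the regime where the paper finds activity-rate uncertainty still matters:

```python
import numpy as np

rng = np.random.default_rng(7)

# Gutenberg-Richter magnitude CDF (unbounded form): F(m) = 1 - exp(-beta (m - m0))
beta, m0 = 2.3, 3.0            # beta = b * ln(10) with b ~ 1 (assumed values)

def exceedance_prob(lam, m, t):
    """P(at least one event with magnitude > m in t years),
    Poisson occurrence with mean activity rate lam."""
    return 1.0 - np.exp(-lam * t * np.exp(-beta * (m - m0)))

# Point estimate
lam_hat, t, m = 0.5, 10.0, 5.0
p_hat = exceedance_prob(lam_hat, m, t)
print("point estimate:", p_hat)

# Interval estimate folding in the uncertainty of the mean activity rate:
# lam estimated from n events in T_obs years -> approx normal, sd = sqrt(n)/T_obs
T_obs = 40.0
n_events = lam_hat * T_obs
lam_draws = rng.normal(lam_hat, np.sqrt(n_events) / T_obs, 20_000)
lam_draws = lam_draws[lam_draws > 0]
ci = np.percentile(exceedance_prob(lam_draws, m, t), [2.5, 97.5])
print("95% interval:", ci)
```

Repeating this with λt well above 5 (longer t or higher λ) narrows the relative width of the interval, mirroring the convergence the paper reports.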

  19. An uncertainty analysis of wildfire modeling [Chapter 13

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  20. Analytic uncertainty and sensitivity analysis of models with input correlations

    NASA Astrophysics Data System (ADS)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that the input variables are independent of each other. However, correlated inputs often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. The method is also applied to the uncertainty and sensitivity analysis of a deterministic HIV model.
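The independent-inputs assumption the paper relaxes can be illustrated with first-order (delta-method) variance propagation, where the input covariance matrix enters directly. This is a generic sketch, not the authors' specific formulation:

```python
import numpy as np

# First-order variance propagation for Y = f(X) with correlated inputs:
# Var(Y) ~= g^T Sigma g, where g is the gradient of f at the input means.
def propagate(grad, sigma):
    g = np.asarray(grad, float)
    return g @ np.asarray(sigma, float) @ g

# Example: f(x1, x2) = x1 * x2 at mean (2, 3) -> gradient (3, 2)
grad = [3.0, 2.0]
s1, s2, rho = 0.1, 0.2, 0.8       # input std devs and correlation (assumed)
cov = rho * s1 * s2
sigma_corr = [[s1**2, cov], [cov, s2**2]]
sigma_ind = [[s1**2, 0.0], [0.0, s2**2]]

var_corr = propagate(grad, sigma_corr)
var_ind = propagate(grad, sigma_ind)
print(var_ind, var_corr)  # correlation adds 2*g1*g2*rho*s1*s2 to the variance
```

Here the positive correlation nearly doubles the response variance (0.25 vs 0.442), so deciding whether correlations "should be considered in practice" can change conclusions materially.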

  1. On-line coupled high performance liquid chromatography-gas chromatography for the analysis of contamination by mineral oil. Part 2: migration from paperboard into dry foods: interpretation of chromatograms.

    PubMed

    Biedermann, Maurus; Grob, Koni

    2012-09-14

    Mineral oil hydrocarbons are complex as well as varying mixtures and produce correspondingly complex chromatograms (on-line HPLC-GC-FID as described in Part 1): mostly humps of unresolved components are obtained, sometimes with sharp peaks on top. Chromatograms may also contain peaks of hydrocarbons from other sources which need to be subtracted from the mineral oil components. The review focuses on the interpretation and integration of chromatograms related to food contamination by mineral oil from paperboard boxes (off-set printing inks and recycled fibers), if possible distinguishing between various sources of mineral oil. Typical chromatograms are shown for relevant components and interferences as well as food samples encountered on the market. Details are pointed out which may provide relevant information. Integration is shown for examples of paperboard packaging materials as well as various foods. Finally the uncertainty of the analysis and limit of quantitation are discussed for specific examples. They primarily result from the interpretation of the chromatogram, manually placing the baseline and cuts for taking off extraneous components. Without previous enrichment, the limit of quantitation is between around 0.1 mg/kg for foods with a low fat content and 2.5 mg/kg for fats and oils. The measurement uncertainty can be kept clearly below 20% for most samples. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Risk-Aversion: Understanding Teachers' Resistance to Technology Integration

    ERIC Educational Resources Information Center

    Howard, Sarah K.

    2013-01-01

    Teachers who do not integrate technology are often labelled as "resistant" to change. Yet, considerable uncertainties remain about appropriate uses and actual value of technology in teaching and learning, which can make integration and change seem risky. The purpose of this article is to explore the nature of teachers' analytical and…

  3. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

In the [eV;MeV] energy range, modelling of neutron-induced reactions is based on nuclear reaction models with adjustable parameters. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Nuclear reactor physicists have called for major breakthroughs in assigning proper uncertainties for use in applications. In this paper, mathematical methods developed in the CONRAD code[2] are presented that treat all types of uncertainties, including experimental ones (statistical and systematic), and propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure is presented using analytical or Monte Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account early enough in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to take this kind of information into account properly.
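The parameter-update step underlying such evaluations is commonly a generalized-least-squares (Bayesian linear) update of prior model parameters against measurements. A toy sketch of that machinery, not CONRAD's actual implementation or data:

```python
import numpy as np

# GLS update: prior parameters p0 with covariance M0, measurements y with
# covariance V, linearized model y ~= G p. (All numbers are invented.)
p0 = np.array([1.0, 2.0])
M0 = np.diag([0.04, 0.09])          # prior parameter covariance
G = np.array([[1.0, 0.5],
              [0.0, 1.0],
              [1.0, 1.0]])          # sensitivity of observables to parameters
y = np.array([2.1, 2.2, 3.1])
V = np.diag([0.01, 0.01, 0.01])     # experimental covariance

M0_inv, V_inv = np.linalg.inv(M0), np.linalg.inv(V)
M1 = np.linalg.inv(M0_inv + G.T @ V_inv @ G)      # posterior covariance
p1 = p0 + M1 @ G.T @ V_inv @ (y - G @ p0)         # posterior parameters
print("posterior:", p1)
print("posterior covariance:\n", M1)
```

Integral experiments enter this framework as additional rows of G, y, and V; including them early shrinks the posterior covariance instead of leaving discrepancies to be reconciled afterwards. Systematic experimental uncertainties not fitted directly would be folded in by the marginalization step the abstract mentions.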

  4. Matrix approach to uncertainty assessment and reduction for modeling terrestrial carbon cycle

    NASA Astrophysics Data System (ADS)

    Luo, Y.; Xia, J.; Ahlström, A.; Zhou, S.; Huang, Y.; Shi, Z.; Wang, Y.; Du, Z.; Lu, X.

    2017-12-01

Terrestrial ecosystems absorb approximately 30% of the anthropogenic carbon dioxide emissions. This estimate has been deduced indirectly, by combining analyses of atmospheric carbon dioxide concentrations with ocean observations to infer the net terrestrial carbon flux. In contrast, when knowledge about the terrestrial carbon cycle is integrated into different terrestrial carbon models, they make widely different predictions. To improve the terrestrial carbon models, we have recently developed a matrix approach to uncertainty assessment and reduction. Specifically, the terrestrial carbon cycle has commonly been represented by a series of carbon balance equations to track carbon influxes into and effluxes out of individual pools in earth system models. This representation matches our understanding of carbon cycle processes well and can be reorganized into one matrix equation without changing any modeled carbon cycle processes and mechanisms. We have developed matrix equations of several global land C cycle models, including CLM3.5, 4.0 and 4.5, CABLE, LPJ-GUESS, and ORCHIDEE. Indeed, the matrix equation is generic and can be applied to other land carbon models. This matrix approach offers a suite of new diagnostic tools, such as the 3-dimensional (3-D) parameter space, traceability analysis, and variance decomposition, for uncertainty analysis. For example, predictions of carbon dynamics with complex land models can be placed in a 3-D parameter space (carbon input, residence time, and storage potential) as a common metric to measure how much model predictions differ. Differences can be traced to their source components by decomposing model predictions into a hierarchy of traceable components. Then, variance decomposition can help attribute the spread in predictions among multiple models to precisely identify sources of uncertainty. The highly uncertain components can be constrained by data, as the matrix equation makes data assimilation computationally feasible. 
We will illustrate various applications of this matrix approach to uncertainty assessment and reduction for terrestrial carbon cycle models.
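A minimal sketch of the matrix formulation described above, with an invented three-pool structure: writing the carbon balance as dX/dt = B·u + A·ξ·K·X (A carrying −1 on its diagonal) gives steady-state pools and an ecosystem residence time directly from linear algebra:

```python
import numpy as np

# Matrix carbon-balance form dX/dt = B*u + A*xi*K*X (generic structure;
# the pool layout and all numbers below are illustrative, not any model's).
u = 60.0                                # carbon input, e.g. GPP in Pg C/yr
B = np.array([0.7, 0.3, 0.0])           # allocation of input to pools
K = np.diag([0.5, 0.1, 0.01])           # baseline turnover rates (1/yr)
xi = 0.8                                # environmental scalar
# A: transfer matrix (-1 on the diagonal = loss, off-diagonal = transfers)
A = np.array([[-1.0,  0.0, 0.0],
              [ 0.3, -1.0, 0.0],
              [ 0.1,  0.2, -1.0]])

M = -A @ (xi * K)                       # net loss operator
X_ss = np.linalg.solve(M, B * u)        # steady state: A*xi*K*X = -B*u
tau = X_ss.sum() / u                    # ecosystem mean residence time (yr)
print("steady-state pools:", X_ss, "residence time (yr):", tau)
```

The 3-D parameter-space diagnostic follows from exactly these quantities: carbon input u, residence time τ, and the storage they imply, so two models can be compared by where they sit in (u, τ, X) space rather than by their full process code.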

  5. Creating NDA working standards through high-fidelity spent fuel modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Skutnik, Steven E; Gauld, Ian C; Romano, Catherine E

    2012-01-01

The Next Generation Safeguards Initiative (NGSI) is developing advanced non-destructive assay (NDA) techniques for spent nuclear fuel assemblies to advance the state-of-the-art in safeguards measurements. These measurements aim beyond the capabilities of existing methods to include the evaluation of plutonium and fissile material inventory, independent of operator declarations. Testing and evaluation of advanced NDA performance will require reference assemblies with well-characterized compositions to serve as working standards against which the NDA methods can be benchmarked and for uncertainty quantification. To support the development of standards for the NGSI spent fuel NDA project, high-fidelity modeling of irradiated fuel assemblies is being performed to characterize fuel compositions and radiation emission data. The assembly depletion simulations apply detailed operating history information and core simulation data as it is available to perform high fidelity axial and pin-by-pin fuel characterization for more than 1600 nuclides. The resulting pin-by-pin isotopic inventories are used to optimize the NDA measurements and provide information necessary to unfold and interpret the measurement data, e.g., passive gamma emitters, neutron emitters, neutron absorbers, and fissile content. A key requirement of this study is the analysis of uncertainties associated with the calculated compositions and signatures for the standard assemblies; uncertainties introduced by the calculation methods, nuclear data, and operating information. An integral part of this assessment involves the application of experimental data from destructive radiochemical assay to assess the uncertainty and bias in computed inventories, the impact of parameters such as assembly burnup gradients and burnable poisons, and the influence of neighboring assemblies on periphery rods. 
This paper will present the results of high fidelity assembly depletion modeling and uncertainty analysis from independent calculations performed using SCALE and MCNP. This work is supported by the Next Generation Safeguards Initiative, Office of Nuclear Safeguards and Security, National Nuclear Security Administration.

  6. Management strategies in hospitals: scenario planning

    PubMed Central

    Ghanem, Mohamed; Schnoor, Jörg; Heyde, Christoph-Eckhard; Kuwatsch, Sandra; Bohn, Marco; Josten, Christoph

    2015-01-01

Background: Instead of waiting for challenges to confront hospital management, doctors and managers should act in advance to optimize and sustain value-based health. This work highlights the importance of scenario planning in hospitals, proposes an elaborated definition of the stakeholders of a hospital and defines the influence factors to which hospitals are exposed. Methodology: Based on literature analysis as well as on personal interviews with stakeholders, we propose an elaborated definition of stakeholders and designed a questionnaire that integrated the following influence factors, which have relevant impact on hospital management: political/legal, economic, social, technological and environmental forces. These influence factors are examined to develop the so-called critical uncertainties. Thorough identification of uncertainties was based on a “Stakeholder Feedback”. Results: Two key uncertainties were identified and considered in this study: (1) the development of the workload for the medical staff, and (2) the profit-oriented performance of the medical staff. According to the developed scenarios, complementary education of the medical staff as well as of non-medical top executives and managers of hospitals was the recommended core strategy. Complementary scenario-specific strategic options should be considered whenever needed to optimize dealing with a specific future development of the health care environment. Conclusion: Strategic planning in hospitals is essential to ensure sustainable success. It considers multiple situations and integrates internal and external insights and perspectives in addition to identifying weak signals and “blind spots”. This flows into sound planning for multiple strategic options. It is a state-of-the-art tool that allows dealing with the increasing challenges facing hospital management. PMID:26504735

  7. Development of a Risk-Based Comparison Methodology of Carbon Capture Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engel, David W.; Dalton, Angela C.; Dale, Crystal

    2014-06-01

Given the varying degrees of maturity among existing carbon capture (CC) technology alternatives, an understanding of the inherent technical and financial risk and uncertainty associated with these competing technologies is requisite to the success of carbon capture as a viable solution to the greenhouse gas emission challenge. The availability of tools and capabilities to conduct rigorous, risk-based technology comparisons is thus highly desirable for directing valuable resources toward the technology option(s) with a high return on investment, superior carbon capture performance, and minimum risk. To address this research need, we introduce a novel risk-based technology comparison method supported by an integrated multi-domain risk model set to estimate risks related to technological maturity, technical performance, and profitability. Through a comparison between solid sorbent and liquid solvent systems, we illustrate the feasibility of estimating risk and quantifying uncertainty in a single domain (modular analytical capability) as well as across multiple risk dimensions (coupled analytical capability) for comparison. This method brings technological maturity and performance to bear on profitability projections, and carries risk and uncertainty modeling across domains via inter-model sharing of parameters, distributions, and input/output. The integration of the models facilitates multidimensional technology comparisons within a common probabilistic risk analysis framework. This approach and model set can equip potential technology adopters with the necessary computational capabilities to make risk-informed decisions about CC technology investment. The method and modeling effort can also be extended to other industries where robust tools and analytical capabilities are currently lacking for evaluating nascent technologies.

  8. The Prevention of Hemorrhagic Stroke

    PubMed Central

    Raymond, J.; Mohr, JP; the TEAM-ARUBA collaborative groups

    2008-01-01

Summary There is currently no evidence that preventive treatment of unruptured aneurysms or AVMs is beneficial, and randomized trials have been proposed to address this clinical uncertainty. Participation in a trial may necessitate a shift in point of view away from a habitual clinical mentality. A review of the ethical and rational principles governing the design and realization of a trial may help integrate clinical research into expert clinical practices. The treatment of unruptured aneurysms and AVMs remains controversial, and data from observational studies cannot provide a normative basis for clinical decisions. Prevention targets healthy individuals and hence carries an obligation of results. There is no opposition between the search for objective facts using scientific methods and the ethics of medical practice, since good practice cannot deny physicians the means to define what could be beneficial to patients. Perhaps the most difficult task is to recognize the uncertainty that is crucial to allow resorting to trial methodology. The reasoning used in research and analysis differs from the casuistic methods typical of clinical work, but clinical judgement remains the dominant factor that decides both who enters the trial and to whom the results of the trial will apply. Randomization is still perceived as a difficult and strange method to integrate into normal practice, but in the face of uncertainty it assures each participant the best chances for the best outcome. Some tension exists between scientific methods and normal practice, but they need to coexist if we are to progress at the same time as we care for patients. PMID:20557736

  9. SLFP: a stochastic linear fractional programming approach for sustainable waste management.

    PubMed

    Zhu, H; Huang, G H

    2011-12-01

    A stochastic linear fractional programming (SLFP) approach is developed for supporting sustainable municipal solid waste management under uncertainty. The SLFP method can solve ratio optimization problems associated with random information, where chance-constrained programming is integrated into a linear fractional programming framework. It has advantages in: (1) comparing objectives of two aspects, (2) reflecting system efficiency, (3) dealing with uncertainty expressed as probability distributions, and (4) providing optimal-ratio solutions under different system-reliability conditions. The method is applied to a case study of waste flow allocation within a municipal solid waste (MSW) management system. The obtained solutions are useful for identifying sustainable MSW management schemes with maximized system efficiency under various constraint-violation risks. The results indicate that SLFP can support in-depth analysis of the interrelationships among system efficiency, system cost and system-failure risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
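A linear fractional program of the kind SLFP builds on can be reduced to an ordinary LP by the Charnes-Cooper substitution y = t·x; chance constraints would first be made deterministic by replacing random right-hand sides with the appropriate quantiles. A toy sketch with invented coefficients, not the MSW case study:

```python
import numpy as np
from scipy.optimize import linprog

# Maximize (c.x + c0)/(d.x + d0) s.t. A x <= b, x >= 0, solved as an LP
# via Charnes-Cooper: variables (y, t) with y = t*x, t > 0.
# A chance constraint P(a.x <= b_rand) >= p becomes deterministic by using
# the p-quantile of b_rand (assuming its distribution is known).
c, c0 = np.array([3.0, 1.0]), 0.0     # "benefit" numerator (illustrative)
d, d0 = np.array([1.0, 2.0]), 1.0     # "cost" denominator
A = np.array([[1.0, 1.0]])
b = np.array([4.0])                   # e.g. a 95%-quantile of a random capacity

n = c.size
obj = -np.concatenate([c, [c0]])                  # linprog minimizes
A_ub = np.hstack([A, -b[:, None]])                # A y - b t <= 0
b_ub = np.zeros(A.shape[0])
A_eq = np.concatenate([d, [d0]])[None, :]         # d.y + d0 t = 1
b_eq = np.array([1.0])
res = linprog(obj, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (n + 1))
y, t = res.x[:n], res.x[n]
x_opt = y / t                                     # recover original variables
print("optimal x:", x_opt, "ratio:", -res.fun)
```

Re-solving with b set to quantiles at different reliability levels yields the family of optimal-ratio solutions under different constraint-violation risks that the abstract describes.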

  10. Sensitivity analysis of a sediment dynamics model applied in a Mediterranean river basin: global change and management implications.

    PubMed

    Sánchez-Canales, M; López-Benito, A; Acuña, V; Ziv, G; Hamel, P; Chaplin-Kramer, R; Elorza, F J

    2015-01-01

Climate change and land-use change are major factors influencing sediment dynamics. Models can be used to better understand sediment production and retention by the landscape, although their interpretation is limited by large uncertainties, including model parameter uncertainties. The uncertainties related to parameter selection may be significant and need to be quantified to improve model interpretation for watershed management. In this study, we performed a sensitivity analysis of the InVEST (Integrated Valuation of Environmental Services and Tradeoffs) sediment retention model in order to determine which model parameters had the greatest influence on model outputs, and therefore require special attention during calibration. The estimation of the sediment loads in this model is based on the Universal Soil Loss Equation (USLE). The sensitivity analysis was performed in the Llobregat basin (NE Iberian Peninsula) for exported and retained sediment, which support two different ecosystem service benefits (avoided reservoir sedimentation and improved water quality). Our analysis identified the model parameters related to the natural environment as the most influential for sediment export and retention. Accordingly, small changes in variables such as the magnitude and frequency of extreme rainfall events could cause major changes in sediment dynamics, demonstrating the sensitivity of these dynamics to climate change in Mediterranean basins. Parameters directly related to human activities and decisions (such as the cover management factor, C) were also influential, especially for sediment export. The importance of these human-related parameters in the sediment export process suggests that mitigation measures have the potential to at least partially ameliorate climate-change-driven changes in sediment export. Copyright © 2014 Elsevier B.V. All rights reserved.
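Because USLE is multiplicative (A = R·K·LS·C·P), a simple Monte Carlo shows how the factor with the largest relative spread dominates output uncertainty. The coefficients of variation below are invented for illustration, not the Llobregat calibration:

```python
import numpy as np

# USLE soil loss A = R * K * LS * C * P with lognormal factor uncertainty.
rng = np.random.default_rng(3)
N = 100_000
cv = {"R": 0.40, "K": 0.10, "LS": 0.15, "C": 0.30, "P": 0.05}   # assumed CVs
means = {"R": 800.0, "K": 0.03, "LS": 1.8, "C": 0.15, "P": 1.0}  # assumed means

def lognormal(mean, cv, size):
    """Lognormal draws with the given arithmetic mean and coefficient of variation."""
    sigma2 = np.log(1 + cv**2)
    mu = np.log(mean) - sigma2 / 2
    return rng.lognormal(mu, np.sqrt(sigma2), size)

samples = {k: lognormal(means[k], cv[k], N) for k in cv}
A = np.prod(np.stack(list(samples.values())), axis=0)

# Rank factors by correlation of log-factor with log-output: for a product
# model this isolates each factor's share of the total log-variance.
for k in cv:
    r = np.corrcoef(np.log(samples[k]), np.log(A))[0, 1]
    print(f"{k}: corr with log A = {r:.2f}")
```

In this toy setup the rainfall erosivity R (CV 0.40) dominates and the support practice factor P barely matters, which parallels the study's finding that climate-driven parameters lead while human-related factors such as C are secondary but still influential.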

  11. Adaptive integral robust control and application to electromechanical servo systems.

    PubMed

    Deng, Wenxiang; Yao, Jianyong

    2017-03-01

This paper proposes a continuous adaptive integral robust control with robust integral of the sign of the error (RISE) feedback for a class of uncertain nonlinear systems, in which the RISE feedback gain is adapted online to ensure robustness against disturbances without prior knowledge of the bound of the additive disturbances. In addition, an adaptive compensation term integrated with the proposed adaptive RISE feedback term is also constructed to further reduce design conservatism when parametric uncertainties also exist in the system. Lyapunov analysis reveals that the proposed controllers guarantee that the tracking errors converge asymptotically to zero with continuous control efforts. To illustrate the high-performance nature of the developed controllers, numerical simulations are provided. Finally, an application case of an actual motor-driven electromechanical servo system is also studied, with some specific design considerations, and comparative experimental results verify the effectiveness of the proposed controllers. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Statistical Methodologies to Integrate Experimental and Computational Research

    NASA Technical Reports Server (NTRS)

    Parker, P. A.; Johnson, R. T.; Montgomery, D. C.

    2008-01-01

Development of advanced algorithms for simulating engine flow paths requires the integration of fundamental experiments with the validation of enhanced mathematical models. In this paper, we provide an overview of statistical methods to strategically and efficiently conduct experiments and computational model refinement. Moreover, the integration of experimental and computational research efforts is emphasized. With a statistical engineering perspective, scientific and engineering expertise is combined with the statistical sciences to gain deeper insights into experimental phenomena and code development performance, supporting the overall research objectives. The particular statistical methods discussed are design of experiments, response surface methodology, and uncertainty analysis and planning. Their application is illustrated with a coaxial free jet experiment and a turbulence model refinement investigation. Our goal is to provide an overview, focusing on concepts rather than practice, to demonstrate the benefits of using statistical methods in research and development, thereby encouraging their broader and more systematic application.

  13. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. 
Such insight can then be used to guide future data collection and model development and evaluation efforts.
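The comparison of parameter versus input uncertainty contributions described above can be sketched by switching each uncertainty source on and off in a Monte Carlo run. The toy regression below only stands in for APLE's internal equations; all coefficients and spreads are invented:

```python
import numpy as np

rng = np.random.default_rng(11)
N = 50_000

# Toy P-loss pathway: loss = a * erosion + b * runoff
a_hat, a_se = 0.012, 0.002            # fitted coefficients +/- standard error
b_hat, b_se = 0.30, 0.08
erosion_mu, erosion_cv = 5.0, 0.25    # model inputs with assumed uncertainty
runoff_mu, runoff_cv = 0.8, 0.20

def draw(param=True, inputs=True):
    """Predict P loss, sampling only the enabled uncertainty sources."""
    a = rng.normal(a_hat, a_se, N) if param else a_hat
    b = rng.normal(b_hat, b_se, N) if param else b_hat
    e = rng.normal(erosion_mu, erosion_cv * erosion_mu, N) if inputs else erosion_mu
    q = rng.normal(runoff_mu, runoff_cv * runoff_mu, N) if inputs else runoff_mu
    return a * e + b * q

total = draw().std()
param_only = draw(inputs=False).std()
input_only = draw(param=False).std()
print(f"total sd {total:.3f}, parameters only {param_only:.3f}, "
      f"inputs only {input_only:.3f}")
```

Rerunning this per pathway and per field setup reproduces the study's observation that the dominant uncertainty source shifts with management and field characteristics.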

  14. Line-of-sight extrapolation noise in dust polarization

    NASA Astrophysics Data System (ADS)

    Poh, Jason; Dodelson, Scott

    2017-05-01

    The B-modes of polarization at frequencies ranging from 50-1000 GHz are produced by Galactic dust, lensing of primordial E-modes in the cosmic microwave background (CMB) by intervening large scale structure, and possibly by primordial B-modes in the CMB imprinted by gravitational waves produced during inflation. The conventional method used to separate the dust component of the signal is to assume that the signal at high frequencies (e.g. 350 GHz) is due solely to dust and then extrapolate the signal down to a lower frequency (e.g. 150 GHz) using the measured scaling of the polarized dust signal amplitude with frequency. For typical Galactic thermal dust temperatures of ˜20 K , these frequencies are not fully in the Rayleigh-Jeans limit. Therefore, deviations in the dust cloud temperatures from cloud to cloud will lead to different scaling factors for clouds of different temperatures. Hence, when multiple clouds of different temperatures and polarization angles contribute to the integrated line-of-sight polarization signal, the relative contribution of individual clouds to the integrated signal can change between frequencies. This can cause the integrated signal to be decorrelated in both amplitude and direction when extrapolating in frequency. Here we carry out a Monte Carlo analysis on the impact of this line-of-sight extrapolation noise on a greybody dust model consistent with Planck and Pan-STARRS observations, enabling us to quantify its effect. Using results from the Planck experiment, we find that this effect is small, more than an order of magnitude smaller than the current uncertainties. However, line-of-sight extrapolation noise may be a significant source of uncertainty in future low-noise primordial B-mode experiments. 
Scaling from Planck results, we find that accounting for this uncertainty becomes potentially important when experiments are sensitive to primordial B-mode signals with amplitude r ≲ 0.0015 in the greybody dust models considered in this paper.
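
    The temperature-dependent scaling described above can be sketched numerically. Below is a minimal deterministic two-cloud toy (not the paper's Monte Carlo analysis); the cloud temperatures, amplitudes, polarization angles, and the spectral index β = 1.6 are illustrative assumptions:

```python
import math

def planck(nu_ghz, temp_k):
    """Planck spectrum B_nu in arbitrary units (nu in GHz, T in K)."""
    h_over_k = 0.04799  # h/k_B in K per GHz
    return nu_ghz**3 / math.expm1(h_over_k * nu_ghz / temp_k)

def greybody(nu_ghz, temp_k, beta=1.6):
    """Modified blackbody: power-law emissivity times the Planck spectrum."""
    return nu_ghz**beta * planck(nu_ghz, temp_k)

# Two clouds on one line of sight: (temperature K, polarized amplitude, angle)
clouds = [(16.0, 1.0, 0.0), (24.0, 0.8, math.radians(40.0))]

def stokes(nu_ghz):
    """Integrated Stokes Q, U from all clouds at one frequency."""
    q = sum(a * greybody(nu_ghz, t) * math.cos(2 * psi) for t, a, psi in clouds)
    u = sum(a * greybody(nu_ghz, t) * math.sin(2 * psi) for t, a, psi in clouds)
    return q, u

q350, u350 = stokes(350.0)
q150_true, u150_true = stokes(150.0)

# Conventional extrapolation: rescale the 350 GHz signal with a single
# assumed dust temperature (20 K), as if one cloud dominated the sightline
scale = greybody(150.0, 20.0) / greybody(350.0, 20.0)
q150_ext, u150_ext = q350 * scale, u350 * scale

amp_ratio = math.hypot(q150_ext, u150_ext) / math.hypot(q150_true, u150_true)
angle_err = 0.5 * math.degrees(math.atan2(u150_ext, q150_ext)
                               - math.atan2(u150_true, q150_true))
print(round(amp_ratio, 4), round(angle_err, 2))
```

    Even this two-cloud case shows the extrapolated 150 GHz signal acquiring a small amplitude error and a polarization-angle rotation of order a degree, because the warmer cloud's relative weight in the sum changes between 350 and 150 GHz.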

  15. It’s about time: How do sky surveys manage uncertainty about scientific needs many years into the future?

    NASA Astrophysics Data System (ADS)

    Darch, Peter T.; Sands, Ashley E.

    2016-06-01

    Sky surveys, such as the Sloan Digital Sky Survey (SDSS) and the Large Synoptic Survey Telescope (LSST), generate data on an unprecedented scale. While many scientific projects span a few years from conception to completion, sky surveys are typically on the scale of decades. This paper focuses on critical challenges arising from long timescales, and how sky surveys address these challenges. We present findings from a study of LSST, comprising interviews (n=58) and observation. Conceived in the 1990s, the LSST Corporation was formed in 2003, and construction began in 2014. LSST will commence data collection operations in 2022 for ten years. One challenge arising from this long timescale is uncertainty about future needs of the astronomers who will use these data many years hence. Sources of uncertainty include scientific questions to be posed, astronomical phenomena to be studied, and tools and practices these astronomers will have at their disposal. These uncertainties are magnified by the rapid technological and scientific developments anticipated between now and the start of LSST operations. LSST is implementing a range of strategies to address these challenges. Some strategies involve delaying resolution of uncertainty, placing this resolution in the hands of future data users. Other strategies aim to reduce uncertainty by shaping astronomers’ data analysis practices so that these practices will integrate well with LSST once operations begin. One approach that exemplifies both types of strategy is the decision to make LSST data management software open source, even now as it is being developed. This policy will enable future data users to adapt this software to evolving needs.
In addition, LSST intends for astronomers to start using this software well in advance of 2022, thereby embedding LSST software and data analysis approaches in the practices of astronomers. These findings strengthen arguments for making the software supporting sky surveys available as open source. Such arguments usually focus on reuse potential of software, and enhancing replicability of analyses. In this case, however, open source software also promises to mitigate the critical challenge of anticipating the needs of future data users.

  16. Quantifying uncertainty in national forest carbon stocks: challenges and opportunities for the United States National Greenhouse Gas Inventory

    NASA Astrophysics Data System (ADS)

    Clough, B.; Russell, M.; Domke, G. M.; Woodall, C. W.

    2016-12-01

    Uncertainty estimates are needed to establish confidence in national forest carbon stocks and to verify changes reported to the United Nations Framework Convention on Climate Change. Good practice guidance from the Intergovernmental Panel on Climate Change stipulates that uncertainty assessments should neither exaggerate nor underestimate the actual error within carbon stocks, yet methodological guidance for forests has been hampered by limited understanding of how complex dynamics give rise to errors across spatial scales (i.e., individuals to continents). This talk highlights efforts to develop a multi-scale, data-driven framework for assessing uncertainty within the United States (US) forest carbon inventory, and focuses on challenges and opportunities for improving the precision of national forest carbon stock estimates. Central to our approach is the calibration of allometric models with a newly established legacy biomass database for North American tree species, and the use of hierarchical models to link these data with the Forest Inventory and Analysis (FIA) database as well as remote sensing datasets. Our work suggests substantial risk for misestimating key sources of uncertainty including: (1) attributing more confidence in allometric models than what is warranted by the best available data; (2) failing to capture heterogeneity in biomass stocks due to environmental variation at regional scales; and (3) ignoring spatial autocorrelation and other random effects that are characteristic of national forest inventory data. Our results suggest these sources of error may be much higher than is generally assumed, though these results must be understood with the limited scope and availability of appropriate calibration data in mind. 
In addition to reporting on important sources of uncertainty, this talk will discuss opportunities to improve the precision of national forest carbon stocks that are motivated by our use of data-driven forecasting including: (1) improving the taxonomic and geographic scope of available biomass data; (2) direct attribution of landscape-level heterogeneity in biomass stocks to specific ecological processes; and (3) integration of expert opinion and meta-analysis to lessen the influence of often highly variable datasets on biomass stock forecasts.
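
    Risk (1) above, attributing more confidence to allometric models than the data warrant, can be illustrated with a small Monte Carlo sketch. The log-linear allometric form is standard, but the coefficient means and standard errors below are illustrative placeholders, not values fitted to the legacy biomass database or FIA data:

```python
import math
import random
import statistics

random.seed(42)

# Hypothetical allometric model: biomass_kg = exp(b0 + b1 * ln(dbh_cm)).
# Coefficient means/SDs are illustrative, not fitted values.
B0_MU, B0_SD = -2.0, 0.10
B1_MU, B1_SD = 2.4, 0.05

dbh = [12.0, 18.5, 25.0, 31.2, 40.8]  # tree diameters on one plot, cm

def plot_biomass(b0, b1):
    """Plot-level biomass (kg) for one draw of the allometric coefficients."""
    return sum(math.exp(b0 + b1 * math.log(d)) for d in dbh)

# Treating fitted coefficients as exact attributes full confidence to the model
fixed = plot_biomass(B0_MU, B1_MU)

# Drawing coefficients from their error distribution propagates that uncertainty
draws = [plot_biomass(random.gauss(B0_MU, B0_SD), random.gauss(B1_MU, B1_SD))
         for _ in range(5000)]
mu, sd = statistics.mean(draws), statistics.stdev(draws)
print(round(fixed, 1), round(mu, 1), round(sd, 1))
```

    The point estimate carries no error bar at all, while the propagated version yields a plot-level standard deviation of roughly 20% here; in a hierarchical setting the coefficient draws would additionally vary by species group and region.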

  17. Collaborative decision-analytic framework to maximize resilience of tidal marshes to climate change

    USGS Publications Warehouse

    Thorne, Karen M.; Mattsson, Brady J.; Takekawa, John Y.; Cummings, Jonathan; Crouse, Debby; Block, Giselle; Bloom, Valary; Gerhart, Matt; Goldbeck, Steve; Huning, Beth; Sloop, Christina; Stewart, Mendel; Taylor, Karen; Valoppi, Laura

    2015-01-01

    Decision makers that are responsible for stewardship of natural resources face many challenges, which are complicated by uncertainty about impacts from climate change, expanding human development, and intensifying land uses. A systematic process for evaluating the social and ecological risks, trade-offs, and cobenefits associated with future changes is critical to maximize resilience and conserve ecosystem services. This is particularly true in coastal areas where human populations and landscape conversion are increasing, and where intensifying storms and sea-level rise pose unprecedented threats to coastal ecosystems. We applied collaborative decision analysis with a diverse team of stakeholders who preserve, manage, or restore tidal marshes across the San Francisco Bay estuary, California, USA, as a case study. Specifically, we followed a structured decision-making approach, and, using expert judgment, we developed alternative management strategies to increase the capacity and adaptability to manage tidal marsh resilience while considering uncertainties through 2050. Because sea-level rise projections are relatively certain through 2050, we focused on uncertainties regarding intensity and frequency of storms and funding. Elicitation methods allowed us to make predictions in the absence of fully compatible models and to assess short- and long-term trade-offs. Specifically, we addressed two questions. (1) Can collaborative decision analysis lead to consensus among a diverse set of decision makers responsible for environmental stewardship and faced with uncertainties about climate change, funding, and stakeholder values? (2) What is an optimal strategy for the conservation of tidal marshes, and what strategy is robust to the aforementioned uncertainties? We found that when taking this approach, consensus was reached among the stakeholders about the best management strategies to maintain tidal marsh integrity.
A Bayesian decision network revealed that a strategy considering sea-level rise and storms explicitly in wetland restoration planning and designs was optimal, and it was robust to uncertainties about management effectiveness and budgets. We found that strategies that avoided explicitly accounting for future climate change had the lowest expected performance based on input from the team. Our decision-analytic framework is sufficiently general to offer an adaptable template, which can be modified for use in other areas that include a diverse and engaged stakeholder group.
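
    The optimal-versus-robust comparison described above can be sketched as expected-value and maximin scoring of strategies over uncertain futures. The strategy names, scenario probabilities, and performance scores below are illustrative stand-ins, not the elicited values from the San Francisco Bay study:

```python
# Hypothetical strategy scores (tidal-marsh integrity, 0-1) under three
# uncertain storm futures; probabilities and scores are illustrative only.
scenarios = {"mild_storms": 0.5, "moderate_storms": 0.3, "severe_storms": 0.2}

strategies = {
    "status_quo":             {"mild_storms": 0.70, "moderate_storms": 0.45, "severe_storms": 0.20},
    "slr_storm_aware_design": {"mild_storms": 0.75, "moderate_storms": 0.70, "severe_storms": 0.60},
    "restore_only":           {"mild_storms": 0.80, "moderate_storms": 0.50, "severe_storms": 0.30},
}

def expected(name):
    """Probability-weighted performance of one strategy."""
    return sum(p * strategies[name][s] for s, p in scenarios.items())

def worst_case(name):
    """Maximin criterion: performance in the least favorable future."""
    return min(strategies[name].values())

optimal = max(strategies, key=expected)   # best on average
robust = max(strategies, key=worst_case)  # best worst case
print(optimal, robust)
```

    A strategy that explicitly accounts for storms and sea-level rise can win on both criteria, mirroring the finding that climate-explicit planning was both optimal and robust.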

  18. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    NASA Astrophysics Data System (ADS)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

    As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena. Results from these analyses are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for application to a variety of analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to and communicating the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization representative of the spatial data analyses and quantification of underlying uncertainties, which can be calculated using data related to sample density, sample variance, interpolation error, or uncertainty calculated from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate with ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been specially built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region.
Finally, we will share our approach for implementation of data reduction and topology generation via custom multi-step Hadoop applications, performance benchmarking comparisons, and Hadoop-centric opportunities for greater parallelization of geospatial operations. The presentation includes examples of the approach being applied to a range of subsurface, geospatial studies (e.g. induced seismicity risk).
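
    The core idea of pairing each aggregated cell value with an uncertainty measure, and letting cell size reflect data density, can be sketched without any Hadoop machinery. This is a toy single-process version of a VGM-style aggregation; the grid rules and uncertainty metric (standard error) are illustrative choices, not the presenters' implementation:

```python
import math
import random
import statistics
from collections import defaultdict

random.seed(1)

# Scattered samples (x, y, value) standing in for sparse, point-driven data
points = [(random.random() * 10, random.random() * 10, random.gauss(5.0, 1.0))
          for _ in range(200)]

def summarize(vals):
    """Cell estimate plus a simple uncertainty metric (standard error),
    reflecting both sample density and sample variance."""
    mean = statistics.mean(vals)
    se = (statistics.stdev(vals) / math.sqrt(len(vals))
          if len(vals) > 1 else float("inf"))
    return mean, se

def variable_grid(points, min_samples=5, cell=1.0):
    """Toy VGM-style aggregation: fine cells with enough samples are kept;
    sparse cells are folded into 2x-larger cells, so cell size itself
    communicates data density."""
    fine = defaultdict(list)
    for x, y, v in points:
        fine[(int(x // cell), int(y // cell))].append(v)
    out, coarse = {}, defaultdict(list)
    for (i, j), vals in fine.items():
        if len(vals) >= min_samples:
            out[(i, j, cell)] = summarize(vals)
        else:
            coarse[(i // 2, j // 2)].extend(vals)  # fold into a larger cell
    for (i, j), vals in coarse.items():
        out[(i, j, 2 * cell)] = summarize(vals)
    return out

grid = variable_grid(points)
print(len(grid), "cells")
```

    In the distributed version described in the abstract, the per-cell grouping is what the MapReduce stages parallelize; the value-plus-uncertainty output per cell is the same.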

  19. Uncertainty analysis of vegetation distribution in the northern high latitudes during the 21st century with a dynamic vegetation model.

    PubMed

    Jiang, Yueyang; Zhuang, Qianlai; Schaphoff, Sibyll; Sitch, Stephen; Sokolov, Andrei; Kicklighter, David; Melillo, Jerry

    2012-03-01

    This study aims to assess how high-latitude vegetation may respond under various climate scenarios during the 21st century, with a focus on analyzing model-parameter-induced uncertainty and how it compares to the uncertainty induced by various climates. The analysis was based on a set of 10,000 Monte Carlo ensemble Lund-Potsdam-Jena (LPJ) simulations for the northern high latitudes (45°N and polewards) for the period 1900-2100. The LPJ Dynamic Global Vegetation Model (LPJ-DGVM) was run under contemporary and future climates from four Special Report Emission Scenarios (SRES), A1FI, A2, B1, and B2, based on the Hadley Centre General Circulation Model (GCM), and six climate scenarios, X901M, X902L, X903H, X904M, X905L, and X906H, from the Integrated Global System Model (IGSM) at the Massachusetts Institute of Technology (MIT). In the current dynamic vegetation model, some parameters are more important than others in determining the vegetation distribution. Parameters that control plant carbon uptake and light-use efficiency have the predominant influence on the vegetation distribution of both woody and herbaceous plant functional types. The relative importance of different parameters varies temporally and spatially and is influenced by climate inputs. In addition to climate, these parameters play an important role in determining the vegetation distribution in the region. The parameter-based uncertainties contribute most to the total uncertainty. The current warming conditions lead to a complexity of vegetation responses in the region. Temperate trees will be more sensitive to climate variability, compared with boreal forest trees and C3 perennial grasses. This sensitivity would result in a unanimous northward greenness migration due to anomalous warming in the northern high latitudes.
Temporally, boreal needleleaved evergreen plants are projected to decline considerably, and a large portion of C3 perennial grass is projected to disappear by the end of the 21st century. In contrast, the area of temperate trees would increase, especially under the most extreme A1FI scenario. As the warming continues, the northward greenness expansion in the Arctic region could continue.
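
    The comparison of parameter-induced versus climate-induced uncertainty can be sketched by crossing Monte Carlo parameter draws with a small set of climate scenarios and comparing the two variance components. The response function and all coefficients below are an illustrative toy, not LPJ-DGVM equations or values:

```python
import random
import statistics

random.seed(7)

# Toy stand-in for a vegetation-model output (woody PFT cover fraction)
# driven by scenario warming and two uncertain parameters; coefficients
# are illustrative, not LPJ values.
def veg_cover(warming_c, carbon_uptake, light_use_eff):
    x = (0.2 + 0.05 * warming_c
         + 1.5 * (carbon_uptake - 0.5) + 0.8 * (light_use_eff - 0.5))
    return min(1.0, max(0.0, x))

scenarios = {"B1": 1.5, "B2": 2.5, "A2": 3.5, "A1FI": 4.5}  # 2100 warming, degC

param_draws = [(random.uniform(0.3, 0.7), random.uniform(0.3, 0.7))
               for _ in range(2000)]

# Spread across climates (averaged over parameters) vs. spread across
# parameters (averaged over climates)
by_scenario = [statistics.mean(veg_cover(w, cu, lu) for cu, lu in param_draws)
               for w in scenarios.values()]
by_params = [statistics.mean(veg_cover(w, cu, lu) for w in scenarios.values())
             for cu, lu in param_draws]

climate_var = statistics.pvariance(by_scenario)
param_var = statistics.pvariance(by_params)
print(round(param_var, 4), round(climate_var, 4))
```

    With parameter sensitivities of this toy magnitude, the parameter component dominates, echoing the study's finding that parameter-based uncertainties contribute most to the total.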

  20. Uncertainty analysis for low-level radioactive waste disposal performance assessment at Oak Ridge National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, D.W.; Yambert, M.W.; Kocher, D.C.

    1994-12-31

    A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter III. An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and human exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.

  1. Water, Resilience and the Law: From General Concepts and Governance Design Principles to Actionable Mechanisms

    NASA Astrophysics Data System (ADS)

    Hill Clarvis, M.; Allan, A.; Hannah, D. M.

    2013-12-01

    Climate change has significant ramifications for water law and governance, yet there is strong evidence that legal regulations have often failed to protect environments or promote sustainable development. Scholars have increasingly suggested that the preservation and restoration paradigms of legislation and regulation are no longer adequate for climate change related challenges in complex and cross-scale social-ecological systems. This is largely due to past assumptions of stationarity, uniformitarianism and the perception of ecosystem change as predictable and reversible. This paper reviews the literature on law and resilience and then presents and discusses a set of practical examples of legal mechanisms from the water resources management sector, identified according to a set of guiding principles from the literature on adaptive capacity and adaptive governance, as well as adaptive and integrated water resources management. It then assesses the aptness of these different measures according to scientific evidence of increased uncertainty and changing ecological baselines. The review demonstrates that a number of best practice examples attempt to integrate adaptive elements of flexibility, iterativity, connectivity and subsidiarity into a variety of legislative mechanisms, suggesting that there is not as significant a tension between resilience and the law as many scholars have suggested. However, while many of the mechanisms may indeed be suitable for addressing challenges relating to current levels of change and uncertainty, analysis across a broader range of uncertainty highlights challenges relating to more irreversible changes associated with greater levels of warming.
Furthermore the paper identifies a set of pre-requisites that are fundamental to the successful implementation of such mechanisms, namely monitoring and data sharing, financial and technical capacity, particularly in nations that are most at risk with the least data infrastructure. The article aims to contribute to both theory and practice, enabling policy makers to translate resilience based terminology and adaptive governance principles into clear instructions for incorporating uncertainty into legislation and policy design.

  2. Key comparison CCPR-K1.a as an interlaboratory comparison of correlated color temperature

    NASA Astrophysics Data System (ADS)

    Kärhä, P.; Vaskuri, A.; Pulli, T.; Ikonen, E.

    2018-02-01

    We analyze the results of spectral irradiance key comparison CCPR-K1.a for correlated color temperature (CCT). For four of the 13 participants, the CCT uncertainties calculated using traditional methods, which do not account for correlations, would be too small. The reason for the failure of the traditional uncertainty calculation is spectral correlations, which produce systematic deviations of the same sign over certain wavelength regions. The results highlight the importance of accounting for such correlations when calculating uncertainties of spectrally integrated quantities.
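
    Why same-sign spectral correlations inflate the uncertainty of an integrated quantity can be shown with standard uncertainty propagation over a handful of wavelength bands. The weights and per-band uncertainties below are illustrative numbers, not CCPR-K1.a data:

```python
import math

# Five wavelength bands of a spectrally integrated quantity: weights w and
# 1 % standard uncertainties u per band (illustrative numbers).
w = [0.1, 0.2, 0.4, 0.2, 0.1]
u = [0.01] * 5

def integrated_uncertainty(w, u, corr):
    """Standard propagation: u_int^2 = sum_ij w_i * w_j * u_i * u_j * r_ij."""
    var = sum(w[i] * w[j] * u[i] * u[j] * corr(i, j)
              for i in range(len(w)) for j in range(len(w)))
    return math.sqrt(var)

# Ignoring correlations keeps only the diagonal of the correlation matrix ...
uncorrelated = integrated_uncertainty(w, u, lambda i, j: float(i == j))
# ... while a same-sign systematic deviation is fully correlated (r_ij = 1)
correlated = integrated_uncertainty(w, u, lambda i, j: 1.0)
print(uncorrelated, correlated)
```

    With full correlation the band uncertainties add linearly rather than in quadrature, so the traditional (diagonal-only) calculation understates the integrated uncertainty, here by about a factor of two.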

  3. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result. Convergence testing is therefore performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
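
    The ray-count cost-benefit trade described above can be sketched with a Monte Carlo convergence sweep: the standard error of the mean shield thickness falls as 1/√N while cost grows linearly in N. The shield geometry below is a toy stand-in, not a vehicle CAD model:

```python
import math
import random
import statistics

random.seed(0)

def thickness_along_ray():
    """Toy shield thickness (g/cm^2) seen along one random ray direction --
    an illustrative stand-in for ray tracing through a vehicle model."""
    theta = random.uniform(0.0, math.pi)
    return 5.0 + 2.0 * math.cos(theta) ** 2

def mean_thickness(n_rays):
    """Monte Carlo estimate of mean shielding and its standard error."""
    samples = [thickness_along_ray() for _ in range(n_rays)]
    mean = statistics.mean(samples)
    sem = statistics.stdev(samples) / math.sqrt(n_rays)  # falls as 1/sqrt(N)
    return mean, sem

# Cost-benefit sweep: each 10x in ray count buys roughly 3.2x less uncertainty
for n in (100, 1000, 10000):
    mean, sem = mean_thickness(n)
    print(n, round(mean, 3), round(sem, 4))
```

    Picking the smallest N whose standard error meets the required tolerance is the optimization the abstract describes; the same sweep structure applies to the number of dose-vs-depth grid thicknesses.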

  4. A Bottom-up Vulnerability Analysis of Water Systems with Decentralized Decision Making and Demographic Shifts- the Case of Jordan.

    NASA Astrophysics Data System (ADS)

    Lachaut, T.; Yoon, J.; Klassert, C. J. A.; Talozi, S.; Mustafa, D.; Knox, S.; Selby, P. D.; Haddad, Y.; Gorelick, S.; Tilmant, A.

    2016-12-01

    Probabilistic approaches to uncertainty in water systems management can face challenges of several types: non-stationary climate, sudden shocks such as conflict-driven migrations, or the internal complexity and dynamics of large systems. There has been a rising trend in the development of bottom-up methods that place focus on the decision side instead of probability distributions and climate scenarios. These approaches are based on defining acceptability thresholds for the decision makers and considering the entire range of possibilities over which such thresholds are crossed. We aim to improve knowledge of the applicability and relevance of this approach by enlarging its scope beyond climate uncertainty and single decision makers, thus including demographic shifts, internal system dynamics, and multiple stakeholders at different scales. This vulnerability analysis is part of the Jordan Water Project and makes use of an ambitious multi-agent model developed by its teams with the extensive cooperation of the Ministry of Water and Irrigation of Jordan. The case of Jordan is a relevant example for migration spikes, rapid social changes, resource depletion and climate change impacts. The multi-agent modeling framework used provides a consistent structure to assess the vulnerability of complex water resources systems with distributed acceptability thresholds and stakeholder interaction. A proof of concept and preliminary results are presented for a non-probabilistic vulnerability analysis that involves different types of stakeholders, uncertainties other than climatic, and the integration of threshold-based indicators. For each stakeholder (agent) a vulnerability matrix is constructed over a multi-dimensional domain, which includes various hydrologic and/or demographic variables.
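
    The per-agent vulnerability matrix can be sketched as a threshold test swept over a two-dimensional domain of futures, with no probabilities attached to any cell. The stakeholders, demands, and thresholds below are illustrative placeholders, not Jordan Water Project model values:

```python
# Bottom-up vulnerability sketch: sweep a domain of futures (supply change x
# demand growth) and flag where each stakeholder's acceptability threshold is
# crossed. All numbers are illustrative.
supply_changes = [1.0 - 0.05 * i for i in range(11)]  # 0% .. -50% supply
demand_growths = [1.0 + 0.05 * j for j in range(11)]  # 0% .. +50% demand

BASE_SUPPLY = 100.0  # million m^3/yr
BASE_DEMAND = {"farmers": 60.0, "households": 40.0}
# Maximum tolerable unmet-demand fraction per stakeholder (agent)
THRESHOLD = {"farmers": 0.25, "households": 0.10}

def vulnerability_matrix(agent):
    """Boolean matrix over the futures domain: True where the agent's
    acceptability threshold is crossed (shortage shared proportionally)."""
    rows = []
    for g in demand_growths:
        row = []
        for s in supply_changes:
            total_demand = sum(BASE_DEMAND.values()) * g
            shortage = max(0.0, total_demand - BASE_SUPPLY * s) / total_demand
            row.append(shortage > THRESHOLD[agent])
        rows.append(row)
    return rows

matrices = {agent: vulnerability_matrix(agent) for agent in BASE_DEMAND}
n_vulnerable = {a: sum(map(sum, m)) for a, m in matrices.items()}
print(n_vulnerable)
```

    Agents with tighter thresholds are vulnerable over a larger region of the domain; in the full multi-agent model the shortage term would come from the simulated system dynamics rather than a proportional-sharing formula.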

  5. Climate change risk analysis framework (CCRAF) a probabilistic tool for analyzing climate change uncertainties

    NASA Astrophysics Data System (ADS)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates to 2100 or beyond annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures and residual costs or benefits of climate change. Atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model and impacts are simulated based on a modified version of Tol's FUND model.
The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000 variants per set of fixed input parameters. The shape and coefficients of the CCRAF equations are derived from regression analyses of historic data and expert assessments. There are two types of random components in CCRAF - one reflects year-to-year fluctuations around the expected value of a given variable (e.g., standard error of the annual GDP growth) and another is fixed within each CCRAF variant and represents some essential constants within a "world" represented by that variant (e.g., the value of climate sensitivity). Both types of random components are drawn from pre-defined probability distribution functions developed based on historic data or expert assessments. Preliminary CCRAF results emphasize the relative importance of uncertainties associated with the conversion of GHG and particulate emissions into radiative forcing and quantifying climate change effects at the regional level. A separate analysis involves "adaptive decision-making", which optimizes the expected future policy effects given the estimated probabilistic uncertainties. As uncertainties for some variables evolve over the time steps, the decisions also adapt. This modeling approach is feasible only with explicit modeling of uncertainties.
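
    The distinction between the two random components, per-year fluctuations versus per-variant "world" constants, can be sketched with a minimal Monte Carlo loop. The growth rates, forcing pathway, and distributions below are illustrative toys, not CCRAF equations or calibrated values:

```python
import math
import random
import statistics

random.seed(3)

def run_variant():
    """One variant: a 'world' constant drawn once per variant, plus a
    year-to-year fluctuation drawn at every time step (toy numbers)."""
    climate_sensitivity = random.gauss(3.0, 1.0)  # fixed within the variant
    gdp, doublings = 1.0, 0.0
    for _ in range(2000, 2101):
        gdp *= math.exp(random.gauss(0.02, 0.01))  # annual growth fluctuation
        doublings += 0.008  # toy emissions-to-forcing pathway
    return gdp, climate_sensitivity * doublings

results = [run_variant() for _ in range(5000)]
warmings = [w for _, w in results]
print(round(statistics.mean(warmings), 2), round(statistics.stdev(warmings), 2))
```

    The per-year noise washes out in century-long aggregates, while the per-variant constant produces persistent spread between "worlds"; this is why the fixed components dominate the variant-to-variant uncertainty in this kind of design.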

  6. An Integrated Understanding of Confidence and User Calibration in Information Systems Use

    ERIC Educational Resources Information Center

    Tang, Fengchun

    2012-01-01

    Dealing with uncertainty is a critical part of human decision-making and confidence reflects one's belief about the relative likelihood that various outcomes occur when making decision under uncertainty. Unfortunately, confidence often deviates from the actual quality of the decision, leading to under- or over-confidence. Calibration, the…

  7. Analyzing climate change impacts on water resources under uncertainty using an integrated simulation-optimization approach

    NASA Astrophysics Data System (ADS)

    Zhuang, X. W.; Li, Y. P.; Nie, S.; Fan, Y. R.; Huang, G. H.

    2018-01-01

    An integrated simulation-optimization (ISO) approach is developed for assessing climate change impacts on water resources. The ISO can reflect uncertainties presented as both interval numbers and probability distributions. Moreover, ISO permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised water-allocation targets are violated. A snowmelt-precipitation-driven watershed (Kaidu watershed) in northwest China is selected as the study case for demonstrating the applicability of the proposed method. Results of meteorological projections disclose that incremental trends in temperature (e.g., minimum and maximum values) and precipitation exist. Results also reveal that (i) the system uncertainties would significantly affect the water resources allocation pattern (including target and shortage); (ii) water shortage would be enhanced from 2016 to 2070; and (iii) the more the inflow amount decreases, the higher the estimated water shortage rates are. The ISO method is useful for evaluating climate change impacts within a watershed system with complicated uncertainties and helping identify appropriate water resources management strategies hedging against drought.
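
    Combining interval numbers with probability distributions, the two uncertainty forms the abstract mentions, can be sketched by bracketing a Monte Carlo shortage estimate between the interval's endpoints. The target interval and inflow distribution below are illustrative, not Kaidu-watershed data:

```python
import random
import statistics

random.seed(11)

# Allocation target as an interval number (lower, upper) combined with a
# probabilistic inflow; all values are illustrative.
TARGET_LO, TARGET_HI = 400.0, 500.0  # 10^6 m^3/yr, interval uncertainty
INFLOW_MU, INFLOW_SD = 420.0, 60.0   # 10^6 m^3/yr, distribution uncertainty

def expected_shortage_interval(n=10000):
    """Monte Carlo over the probabilistic inflow, bracketing the interval
    uncertainty in the target: returns expected shortage as [lo, hi]."""
    lo_vals, hi_vals = [], []
    for _ in range(n):
        inflow = random.gauss(INFLOW_MU, INFLOW_SD)
        lo_vals.append(max(0.0, TARGET_LO - inflow))  # optimistic target
        hi_vals.append(max(0.0, TARGET_HI - inflow))  # pessimistic target
    return statistics.mean(lo_vals), statistics.mean(hi_vals)

lo, hi = expected_shortage_interval()
print(round(lo, 1), round(hi, 1))  # interval-valued expected shortage
```

    The result is itself an interval, which is how interval-plus-probabilistic methods typically report outcomes: the distributional uncertainty is integrated out by sampling, while the interval uncertainty is carried through as bounds.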

  8. Jet energy measurement and its systematic uncertainty in proton-proton collisions at [Formula: see text] TeV with the ATLAS detector.

    PubMed

    Aad, G; Abajyan, T; Abbott, B; Abdallah, J; Abdel Khalek, S; Abdinov, O; Aben, R; Abi, B; Abolins, M; AbouZeid, O S; Abramowicz, H; Abreu, H; Abulaiti, Y; Acharya, B S; Adamczyk, L; Adams, D L; Addy, T N; Adelman, J; Adomeit, S; Adye, T; Aefsky, S; Agatonovic-Jovin, T; Aguilar-Saavedra, J A; Agustoni, M; Ahlen, S P; Ahmad, A; Ahmadov, F; Aielli, G; Åkesson, T P A; Akimoto, G; Akimov, A V; Alam, M A; Albert, J; Albrand, S; Alconada Verzini, M J; Aleksa, M; Aleksandrov, I N; Alessandria, F; Alexa, C; Alexander, G; Alexandre, G; Alexopoulos, T; Alhroob, M; Aliev, M; Alimonti, G; Alio, L; Alison, J; Allbrooke, B M M; Allison, L J; Allport, P P; Allwood-Spiers, S E; Almond, J; Aloisio, A; Alon, R; Alonso, A; Alonso, F; Altheimer, A; Alvarez Gonzalez, B; Alviggi, M G; Amako, K; Amaral Coutinho, Y; Amelung, C; Ammosov, V V; Amor Dos Santos, S P; Amorim, A; Amoroso, S; Amram, N; Amundsen, G; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anderson, K J; Andreazza, A; Andrei, V; Anduaga, X S; Angelidakis, S; Anger, P; Angerami, A; Anghinolfi, F; Anisenkov, A V; Anjos, N; Annovi, A; Antonaki, A; Antonelli, M; Antonov, A; Antos, J; Anulli, F; Aoki, M; Aperio Bella, L; Apolle, R; Arabidze, G; Aracena, I; Arai, Y; Arce, A T H; Arfaoui, S; Arguin, J-F; Argyropoulos, S; Arik, E; Arik, M; Armbruster, A J; Arnaez, O; Arnal, V; Arslan, O; Artamonov, A; Artoni, G; Asai, S; Asbah, N; Ask, S; Åsman, B; Asquith, L; Assamagan, K; Astalos, R; Astbury, A; Atkinson, M; Atlay, N B; Auerbach, B; Auge, E; Augsten, K; Aurousseau, M; Avolio, G; Azuelos, G; Azuma, Y; Baak, M A; Bacci, C; Bach, A M; Bachacou, H; Bachas, K; Backes, M; Backhaus, M; Backus Mayes, J; Badescu, E; Bagiacchi, P; Bagnaia, P; Bai, Y; Bailey, D C; Bain, T; Baines, J T; Baker, O K; Baker, S; Balek, P; Balli, F; Banas, E; Banerjee, Sw; Banfi, D; Bangert, A; Bansal, V; Bansil, H S; Barak, L; Baranov, S P; Barber, T; Barberio, E L; Barberis, D; Barbero, M; Barillari, T; Barisonzi, M; Barklow, T; 
Barlow, N; Barnett, B M; Barnett, R M; Baroncelli, A; Barone, G; Barr, A J; Barreiro, F; Barreiro Guimarães da Costa, J; Bartoldus, R; Barton, A E; Bartos, P; Bartsch, V; Bassalat, A; Basye, A; Bates, R L; Batkova, L; Batley, J R; Battistin, M; Bauer, F; Bawa, H S; Beau, T; Beauchemin, P H; Beccherle, R; Bechtle, P; Beck, H P; Becker, K; Becker, S; Beckingham, M; Beddall, A J; Beddall, A; Bedikian, S; Bednyakov, V A; Bee, C P; Beemster, L J; Beermann, T A; Begel, M; Behr, K; Belanger-Champagne, C; Bell, P J; Bell, W H; Bella, G; Bellagamba, L; Bellerive, A; Bellomo, M; Belloni, A; Beloborodova, O L; Belotskiy, K; Beltramello, O; Benary, O; Benchekroun, D; Bendtz, K; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez Garcia, J A; Benjamin, D P; Bensinger, J R; Benslama, K; Bentvelsen, S; Berge, D; Bergeaas Kuutmann, E; Berger, N; Berghaus, F; Berglund, E; Beringer, J; Bernard, C; Bernat, P; Bernhard, R; Bernius, C; Bernlochner, F U; Berry, T; Berta, P; Bertella, C; Bertolucci, F; Besana, M I; Besjes, G J; Bessidskaia, O; Besson, N; Bethke, S; Bhimji, W; Bianchi, R M; Bianchini, L; Bianco, M; Biebel, O; Bieniek, S P; Bierwagen, K; Biesiada, J; Biglietti, M; Bilbao De Mendizabal, J; Bilokon, H; Bindi, M; Binet, S; Bingul, A; Bini, C; Bittner, B; Black, C W; Black, J E; Black, K M; Blackburn, D; Blair, R E; Blanchard, J-B; Blazek, T; Bloch, I; Blocker, C; Blocki, J; Blum, W; Blumenschein, U; Bobbink, G J; Bobrovnikov, V S; Bocchetta, S S; Bocci, A; Boddy, C R; Boehler, M; Boek, J; Boek, T T; Boelaert, N; Bogaerts, J A; Bogdanchikov, A G; Bogouch, A; Bohm, C; Bohm, J; Boisvert, V; Bold, T; Boldea, V; Boldyrev, A S; Bolnet, N M; Bomben, M; Bona, M; Boonekamp, M; Bordoni, S; Borer, C; Borisov, A; Borissov, G; Borri, M; Borroni, S; Bortfeldt, J; Bortolotto, V; Bos, K; Boscherini, D; Bosman, M; Boterenbrood, H; Bouchami, J; Boudreau, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Bousson, N; Boutouil, S; Boveia, A; Boyd, J; Boyko, I R; Bozovic-Jelisavcic, I; 
Bracinik, J; Branchini, P; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Braun, H M; Brazzale, S F; Brelier, B; Brendlinger, K; Brenner, R; Bressler, S; Bristow, T M; Britton, D; Brochu, F M; Brock, I; Brock, R; Broggi, F; Bromberg, C; Bronner, J; Brooijmans, G; Brooks, T; Brooks, W K; Brosamer, J; Brost, E; Brown, G; Brown, J; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Brunet, S; Bruni, A; Bruni, G; Bruschi, M; Bryngemark, L; Buanes, T; Buat, Q; Bucci, F; Buchholz, P; Buckingham, R M; Buckley, A G; Buda, S I; Budagov, I A; Budick, B; Buehrer, F; Bugge, L; Bugge, M K; Bulekov, O; Bundock, A C; Bunse, M; Burckhart, H; Burdin, S; Burgess, T; Burghgrave, B; Burke, S; Burmeister, I; Busato, E; Büscher, V; Bussey, P; Buszello, C P; Butler, B; Butler, J M; Butt, A I; Buttar, C M; Butterworth, J M; Buttinger, W; Buzatu, A; Byszewski, M; Cabrera Urbán, S; Caforio, D; Cakir, O; Calafiura, P; Calderini, G; Calfayan, P; Calkins, R; Caloba, L P; Caloi, R; Calvet, D; Calvet, S; Camacho Toro, R; Camarri, P; Cameron, D; Caminada, L M; Caminal Armadans, R; Campana, S; Campanelli, M; Canale, V; Canelli, F; Canepa, A; Cantero, J; Cantrill, R; Cao, T; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capua, M; Caputo, R; Cardarelli, R; Carli, T; Carlino, G; Carminati, L; Caron, S; Carquin, E; Carrillo-Montoya, G D; Carter, A A; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Caso, C; Castaneda-Miranda, E; Castelli, A; Castillo Gimenez, V; Castro, N F; Catastini, P; Catinaccio, A; Catmore, J R; Cattai, A; Cattani, G; Caughron, S; Cavaliere, V; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerio, B; Cerny, K; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cervelli, A; Cetin, S A; Chafaq, A; Chakraborty, D; Chalupkova, I; Chan, K; Chang, P; Chapleau, B; Chapman, J D; Charfeddine, D; Charlton, D G; Chavda, V; Chavez Barajas, C A; Cheatham, S; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, K; Chen, L; 
Chen, S; Chen, X; Chen, Y; Cheng, Y; Cheplakov, A; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; Chevalier, L; Chiarella, V; Chiefari, G; Childers, J T; Chilingarov, A; Chiodini, G; Chisholm, A S; Chislett, R T; Chitan, A; Chizhov, M V; Chouridou, S; Chow, B K B; Christidi, I A; Chromek-Burckhart, D; Chu, M L; Chudoba, J; Ciapetti, G; Ciftci, A K; Ciftci, R; Cinca, D; Cindro, V; Ciocio, A; Cirilli, M; Cirkovic, P; Citron, Z H; Citterio, M; Ciubancan, M; Clark, A; Clark, P J; Clarke, R N; Cleland, W; Clemens, J C; Clement, B; Clement, C; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Coelli, S; Coffey, L; Cogan, J G; Coggeshall, J; Colas, J; Cole, B; Cole, S; Colijn, A P; Collins-Tooth, C; Collot, J; Colombo, T; Colon, G; Compostella, G; Conde Muiño, P; Coniavitis, E; Conidi, M C; Connelly, I A; Consonni, S M; Consorti, V; Constantinescu, S; Conta, C; Conti, G; Conventi, F; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Cooper-Smith, N J; Copic, K; Cornelissen, T; Corradi, M; Corriveau, F; Corso-Radu, A; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Côté, D; Cottin, G; Courneyea, L; Cowan, G; Cox, B E; Cranmer, K; Cree, G; Crépé-Renaudin, S; Crescioli, F; Crispin Ortuzar, M; Cristinziani, M; Crosetti, G; Cuciuc, C-M; Cuenca Almenar, C; Cuhadar Donszelmann, T; Cummings, J; Curatolo, M; Cuthbert, C; Czirr, H; Czodrowski, P; Czyczula, Z; D'Auria, S; D'Onofrio, M; D'Orazio, A; Da Cunha Sargedas De Sousa, M J; Da Via, C; Dabrowski, W; Dafinca, A; Dai, T; Dallaire, F; Dallapiccola, C; Dam, M; Daniells, A C; Dano Hoffmann, M; Dao, V; Darbo, G; Darlea, G L; Darmora, S; Dassoulas, J A; Davey, W; David, C; Davidek, T; Davies, E; Davies, M; Davignon, O; Davison, A R; Davygora, Y; Dawe, E; Dawson, I; Daya-Ishmukhametova, R K; De, K; de Asmundis, R; De Castro, S; De Cecco, S; de Graat, J; De Groot, N; de Jong, P; De La Taille, C; De la Torre, H; De Lorenzi, F; De Nooij, L; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; De Zorzi, 
G; Dearnaley, W J; Debbe, R; Debenedetti, C; Dechenaux, B; Dedovich, D V; Degenhardt, J; Del Peso, J; Del Prete, T; Delemontex, T; Deliot, F; Deliyergiyev, M; Dell'Acqua, A; Dell'Asta, L; Della Pietra, M; Della Volpe, D; Delmastro, M; Delsart, P A; Deluca, C; Demers, S; Demichev, M; Demilly, A; Demirkoz, B; Denisov, S P; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Deviveiros, P O; Dewhurst, A; DeWilde, B; Dhaliwal, S; Dhullipudi, R; Di Ciaccio, A; Di Ciaccio, L; Di Domenico, A; Di Donato, C; Di Girolamo, A; Di Girolamo, B; Di Mattia, A; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Di Valentino, D; Diaz, M A; Diehl, E B; Dietrich, J; Dietzsch, T A; Diglio, S; Dindar Yagci, K; Dingfelder, J; Dionisi, C; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; do Vale, M A B; Do Valle Wemans, A; Doan, T K O; Dobos, D; Dobson, E; Dodd, J; Doglioni, C; Doherty, T; Dohmae, T; Dolejsi, J; Dolezal, Z; Dolgoshein, B A; Donadelli, M; Donati, S; Dondero, P; Donini, J; Dopke, J; Doria, A; Dos Anjos, A; Dotti, A; Dova, M T; Doyle, A T; Dris, M; Dubbert, J; Dube, S; Dubreuil, E; Duchovni, E; Duckeck, G; Ducu, O A; Duda, D; Dudarev, A; Dudziak, F; Duflot, L; Duguid, L; Dührssen, M; Dunford, M; Duran Yildiz, H; Düren, M; Dwuznik, M; Ebke, J; Edson, W; Edwards, C A; Edwards, N C; Ehrenfeld, W; Eifert, T; Eigen, G; Einsweiler, K; Eisenhandler, E; Ekelof, T; El Kacimi, M; Ellert, M; Elles, S; Ellinghaus, F; Ellis, K; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Enari, Y; Endner, O C; Endo, M; Engelmann, R; Erdmann, J; Ereditato, A; Eriksson, D; Ernis, G; Ernst, J; Ernst, M; Ernwein, J; Errede, D; Errede, S; Ertel, E; Escalier, M; Esch, H; Escobar, C; Espinal Curull, X; Esposito, B; Etienne, F; Etienvre, A I; Etzion, E; Evangelakou, D; Evans, H; Fabbri, L; Facini, G; Fakhrutdinov, R M; Falciano, S; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farooque, T; Farrell, S; Farrington, S M; Farthouat, P; Fassi, F; Fassnacht, P; Fassouliotis, D; Fatholahzadeh, B; 
Favareto, A; Fayard, L; Federic, P; Fedin, O L; Fedorko, W; Fehling-Kaschek, M; Feligioni, L; Feng, C; Feng, E J; Feng, H; Fenyuk, A B; Fernando, W; Ferrag, S; Ferrando, J; Ferrara, V; Ferrari, A; Ferrari, P; Ferrari, R; Ferreira de Lima, D E; Ferrer, A; Ferrere, D; Ferretti, C; Ferretto Parodi, A; Fiascaris, M; Fiedler, F; Filipčič, A; Filipuzzi, M; Filthaut, F; Fincke-Keeler, M; Finelli, K D; Fiolhais, M C N; Fiorini, L; Firan, A; Fischer, J; Fisher, M J; Fitzgerald, E A; Flechl, M; Fleck, I; Fleischmann, P; Fleischmann, S; Fletcher, G T; Fletcher, G; Flick, T; Floderus, A; Flores Castillo, L R; Florez Bustos, A C; Flowerdew, M J; Fonseca Martin, T; Formica, A; Forti, A; Fortin, D; Fournier, D; Fox, H; Francavilla, P; Franchini, M; Franchino, S; Francis, D; Franklin, M; Franz, S; Fraternali, M; Fratina, S; French, S T; Friedrich, C; Friedrich, F; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fulsom, B G; Fuster, J; Gabaldon, C; Gabizon, O; Gabrielli, A; Gabrielli, A; Gadatsch, S; Gadfort, T; Gadomski, S; Gagliardi, G; Gagnon, P; Galea, C; Galhardo, B; Gallas, E J; Gallo, V; Gallop, B J; Gallus, P; Galster, G; Gan, K K; Gandrajula, R P; Gao, J; Gao, Y S; Garay Walls, F M; Garberson, F; García, C; García Navarro, J E; Garcia-Sciveres, M; Gardner, R W; Garelli, N; Garonne, V; Gatti, C; Gaudio, G; Gaur, B; Gauthier, L; Gauzzi, P; Gavrilenko, I L; Gay, C; Gaycken, G; Gazis, E N; Ge, P; Gecse, Z; Gee, C N P; Geerts, D A A; Geich-Gimbel, Ch; Gellerstedt, K; Gemme, C; Gemmell, A; Genest, M H; Gentile, S; George, M; George, S; Gerbaudo, D; Gershon, A; Ghazlane, H; Ghodbane, N; Giacobbe, B; Giagu, S; Giangiobbe, V; Giannetti, P; Gianotti, F; Gibbard, B; Gibson, S M; Gilchriese, M; Gillam, T P S; Gillberg, D; Gillman, A R; Gingrich, D M; Giokaris, N; Giordani, M P; Giordano, R; Giorgi, F M; Giovannini, P; Giraud, P F; Giugni, D; Giuliani, C; Giunta, M; Gjelsten, B K; Gkialas, I; Gladilin, L K; Glasman, C; Glatzer, J; Glazov, A; Glonti, G L; Goblirsch-Kolb, 
M; Goddard, J R; Godfrey, J; Godlewski, J; Goeringer, C; Goldfarb, S; Golling, T; Golubkov, D; Gomes, A; Gomez Fajardo, L S; Gonçalo, R; Goncalves Pinto Firmino Da Costa, J; Gonella, L; González de la Hoz, S; Gonzalez Parra, G; Gonzalez Silva, M L; Gonzalez-Sevilla, S; Goodson, J J; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorfine, G; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Goshaw, A T; Gössling, C; Gostkin, M I; Gouighri, M; Goujdami, D; Goulette, M P; Goussiou, A G; Goy, C; Gozpinar, S; Grabas, H M X; Graber, L; Grabowska-Bold, I; Grafström, P; Grahn, K-J; Gramling, J; Gramstad, E; Grancagnolo, F; Grancagnolo, S; Grassi, V; Gratchev, V; Gray, H M; Gray, J A; Graziani, E; Grebenyuk, O G; Greenwood, Z D; Gregersen, K; Gregor, I M; Grenier, P; Griffiths, J; Grigalashvili, N; Grillo, A A; Grimm, K; Grinstein, S; Gris, Ph; Grishkevich, Y V; Grivaz, J-F; Grohs, J P; Grohsjean, A; Gross, E; Grosse-Knetter, J; Grossi, G C; Groth-Jensen, J; Grout, Z J; Grybel, K; Guescini, F; Guest, D; Gueta, O; Guicheney, C; Guido, E; Guillemin, T; Guindon, S; Gul, U; Gumpert, C; Gunther, J; Guo, J; Gupta, S; Gutierrez, P; Gutierrez Ortiz, N G; Gutschow, C; Guttman, N; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haber, C; Hadavand, H K; Haefner, P; Hageboeck, S; Hajduk, Z; Hakobyan, H; Haleem, M; Hall, D; Halladjian, G; Hamacher, K; Hamal, P; Hamano, K; Hamer, M; Hamilton, A; Hamilton, S; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Hanke, P; Hansen, J R; Hansen, J B; Hansen, J D; Hansen, P H; Hansson, P; Hara, K; Hard, A S; Harenberg, T; Harkusha, S; Harper, D; Harrington, R D; Harris, O M; Harrison, P F; Hartjes, F; Harvey, A; Hasegawa, S; Hasegawa, Y; Hassani, S; Haug, S; Hauschild, M; Hauser, R; Havranek, M; Hawkes, C M; Hawkings, R J; Hawkins, A D; Hayashi, T; Hayden, D; Hays, C P; Hayward, H S; Haywood, S J; Head, S J; Heck, T; Hedberg, V; Heelan, L; Heim, S; Heinemann, B; Heisterkamp, S; Hejbal, J; Helary, L; Heller, C; Heller, M; Hellman, S; Hellmich, D; 
Helsens, C; Henderson, J; Henderson, R C W; Henrichs, A; Henriques Correia, A M; Henrot-Versille, S; Hensel, C; Herbert, G H; Hernandez, C M; Hernández Jiménez, Y; Herrberg-Schubert, R; Herten, G; Hertenberger, R; Hervas, L; Hesketh, G G; Hessey, N P; Hickling, R; Higón-Rodriguez, E; Hill, J C; Hiller, K H; Hillert, S; Hillier, S J; Hinchliffe, I; Hines, E; Hirose, M; Hirschbuehl, D; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoffman, J; Hoffmann, D; Hofmann, J I; Hohlfeld, M; Holmes, T R; Hong, T M; Hooft van Huysduynen, L; Hostachy, J-Y; Hou, S; Hoummada, A; Howard, J; Howarth, J; Hrabovsky, M; Hristova, I; Hrivnac, J; Hryn'ova, T; Hsu, P J; Hsu, S-C; Hu, D; Hu, X; Huang, Y; Hubacek, Z; Hubaut, F; Huegging, F; Huettmann, A; Huffman, T B; Hughes, E W; Hughes, G; Huhtinen, M; Hülsing, T A; Hurwitz, M; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibragimov, I; Iconomidou-Fayard, L; Idarraga, J; Ideal, E; Iengo, P; Igonkina, O; Iizawa, T; Ikegami, Y; Ikematsu, K; Ikeno, M; Iliadis, D; Ilic, N; Inamaru, Y; Ince, T; Ioannou, P; Iodice, M; Iordanidou, K; Ippolito, V; Irles Quiles, A; Isaksson, C; Ishino, M; Ishitsuka, M; Ishmukhametov, R; Issever, C; Istin, S; Ivashin, A V; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jackson, B; Jackson, J N; Jackson, M; Jackson, P; Jaekel, M R; Jain, V; Jakobs, K; Jakobsen, S; Jakoubek, T; Jakubek, J; Jamin, D O; Jana, D K; Jansen, E; Jansen, H; Janssen, J; Janus, M; Jared, R C; Jarlskog, G; Jeanty, L; Jeng, G-Y; Jen-La Plante, I; Jennens, D; Jenni, P; Jentzsch, J; Jeske, C; Jézéquel, S; Jha, M K; Ji, H; Ji, W; Jia, J; Jiang, Y; Jimenez Belenguer, M; Jin, S; Jinaru, A; Jinnouchi, O; Joergensen, M D; Joffe, D; Johansson, K E; Johansson, P; Johns, K A; Jon-And, K; Jones, G; Jones, R W L; Jones, T J; Jorge, P M; Joshi, K D; Jovicevic, J; Ju, X; Jung, C A; Jungst, R M; Jussel, P; Juste Rozas, A; Kaci, M; Kaczmarska, A; Kadlecik, P; Kado, M; Kagan, H; Kagan, M; Kajomovitz, E; Kalinin, S; 
Kama, S; Kanaya, N; Kaneda, M; Kaneti, S; Kanno, T; Kantserov, V A; Kanzaki, J; Kaplan, B; Kapliy, A; Kar, D; Karakostas, K; Karastathis, N; Karnevskiy, M; Karpov, S N; Karthik, K; Kartvelishvili, V; Karyukhin, A N; Kashif, L; Kasieczka, G; Kass, R D; Kastanas, A; Kataoka, Y; Katre, A; Katzy, J; Kaushik, V; Kawagoe, K; Kawamoto, T; Kawamura, G; Kazama, S; Kazanin, V F; Kazarinov, M Y; Keeler, R; Keener, P T; Kehoe, R; Keil, M; Keller, J S; Keoshkerian, H; Kepka, O; Kerševan, B P; Kersten, S; Kessoku, K; Keung, J; Khalil-Zada, F; Khandanyan, H; Khanov, A; Kharchenko, D; Khodinov, A; Khomich, A; Khoo, T J; Khoriauli, G; Khoroshilov, A; Khovanskiy, V; Khramov, E; Khubua, J; Kim, H; Kim, S H; Kimura, N; Kind, O; King, B T; King, M; King, R S B; King, S B; Kirk, J; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kitamura, T; Kittelmann, T; Kiuchi, K; Kladiva, E; Klein, M; Klein, U; Kleinknecht, K; Klimek, P; Klimentov, A; Klingenberg, R; Klinger, J A; Klinkby, E B; Klioutchnikova, T; Klok, P F; Kluge, E-E; Kluit, P; Kluth, S; Kneringer, E; Knoops, E B F G; Knue, A; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Koenig, S; Koevesarki, P; Koffas, T; Koffeman, E; Kogan, L A; Kohlmann, S; Kohout, Z; Kohriki, T; Koi, T; Kolanoski, H; Koletsou, I; Koll, J; Komar, A A; Komori, Y; Kondo, T; Köneke, K; König, A C; Kono, T; Konoplich, R; Konstantinidis, N; Kopeliansky, R; Koperny, S; Köpke, L; Kopp, A K; Korcyl, K; Kordas, K; Korn, A; Korol, A A; Korolkov, I; Korolkova, E V; Korotkov, V A; Kortner, O; Kortner, S; Kostyukhin, V V; Kotov, S; Kotov, V M; Kotwal, A; Kourkoumelis, C; Kouskoura, V; Koutsman, A; Kowalewski, R; Kowalski, T Z; Kozanecki, W; Kozhin, A S; Kral, V; Kramarenko, V A; Kramberger, G; Krasny, M W; Krasznahorkay, A; Kraus, J K; Kravchenko, A; Kreiss, S; Kretzschmar, J; Kreutzfeldt, K; Krieger, N; Krieger, P; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Kruker, T; Krumnack, N; Krumshteyn, Z V; Kruse, A; Kruse, M C; Kruskal, M; 
Kubota, T; Kuday, S; Kuehn, S; Kugel, A; Kuhl, T; Kukhtin, V; Kulchitsky, Y; Kuleshov, S; Kuna, M; Kunkle, J; Kupco, A; Kurashige, H; Kurata, M; Kurochkin, Y A; Kurumida, R; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwee, R; La Rosa, A; La Rotonda, L; Labarga, L; Lablak, S; Lacasta, C; Lacava, F; Lacey, J; Lacker, H; Lacour, D; Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Laier, H; Laisne, E; Lambourne, L; Lampen, C L; Lampl, W; Lançon, E; Landgraf, U; Landon, M P J; Lang, V S; Lange, C; Lankford, A J; Lanni, F; Lantzsch, K; Lanza, A; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Larner, A; Lassnig, M; Laurelli, P; Lavorini, V; Lavrijsen, W; Laycock, P; Le, B T; Le Dortz, O; Le Guirriec, E; Le Menedeu, E; LeCompte, T; Ledroit-Guillon, F; Lee, C A; Lee, H; Lee, J S H; Lee, S C; Lee, L; Lefebvre, G; Lefebvre, M; Legger, F; Leggett, C; Lehan, A; Lehmacher, M; Lehmann Miotto, G; Leister, A G; Leite, M A L; Leitner, R; Lellouch, D; Lemmer, B; Lendermann, V; Leney, K J C; Lenz, T; Lenzen, G; Lenzi, B; Leone, R; Leonhardt, K; Leontsinis, S; Leroy, C; Lessard, J-R; Lester, C G; Lester, C M; Levêque, J; Levin, D; Levinson, L J; Lewis, A; Lewis, G H; Leyko, A M; Leyton, M; Li, B; Li, B; Li, H; Li, H L; Li, S; Li, X; Liang, Z; Liao, H; Liberti, B; Lichard, P; Lie, K; Liebal, J; Liebig, W; Limbach, C; Limosani, A; Limper, M; Lin, S C; Linde, F; Lindquist, B E; Linnemann, J T; Lipeles, E; Lipniacka, A; Lisovyi, M; Liss, T M; Lissauer, D; Lister, A; Litke, A M; Liu, B; Liu, D; Liu, J B; Liu, K; Liu, L; Liu, M; Liu, M; Liu, Y; Livan, M; Livermore, S S A; Lleres, A; Llorente Merino, J; Lloyd, S L; Lo Sterzo, F; Lobodzinska, E; Loch, P; Lockman, W S; Loddenkoetter, T; Loebinger, F K; Loevschall-Jensen, A E; Loginov, A; Loh, C W; Lohse, T; Lohwasser, K; Lokajicek, M; Lombardo, V P; Long, J D; Long, R E; Lopes, L; Lopez Mateos, D; Lopez Paredes, B; Lorenz, J; Lorenzo Martinez, N; Losada, M; Loscutoff, P; Losty, M J; Lou, X; Lounis, A; Love, J; Love, P A; 
Lowe, A J; Lu, F; Lubatti, H J; Luci, C; Lucotte, A; Ludwig, D; Ludwig, I; Luehring, F; Lukas, W; Luminari, L; Lund, E; Lundberg, J; Lundberg, O; Lund-Jensen, B; Lungwitz, M; Lynn, D; Lysak, R; Lytken, E; Ma, H; Ma, L L; Maccarrone, G; Macchiolo, A; Maček, B; Machado Miguens, J; Macina, D; Mackeprang, R; Madar, R; Madaras, R J; Maddocks, H J; Mader, W F; Madsen, A; Maeno, M; Maeno, T; Magnoni, L; Magradze, E; Mahboubi, K; Mahlstedt, J; Mahmoud, S; Mahout, G; Maiani, C; Maidantchik, C; Maio, A; Majewski, S; Makida, Y; Makovec, N; Mal, P; Malaescu, B; Malecki, Pa; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Maltezos, S; Malyshev, V M; Malyukov, S; Mamuzic, J; Mandelli, L; Mandić, I; Mandrysch, R; Maneira, J; Manfredini, A; Manhaes de Andrade Filho, L; Manjarres Ramos, J A; Mann, A; Manning, P M; Manousakis-Katsikakis, A; Mansoulie, B; Mantifel, R; Mapelli, L; March, L; Marchand, J F; Marchese, F; Marchiori, G; Marcisovsky, M; Marino, C P; Marques, C N; Marroquim, F; Marshall, Z; Marti, L F; Marti-Garcia, S; Martin, B; Martin, B; Martin, J P; Martin, T A; Martin, V J; Martin Dit Latour, B; Martinez, H; Martinez, M; Martin-Haugh, S; Martyniuk, A C; Marx, M; Marzano, F; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massol, N; Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Matsunaga, H; Matsushita, T; Mättig, P; Mättig, S; Mattmann, J; Mattravers, C; Maurer, J; Maxfield, S J; Maximov, D A; Mazini, R; Mazzaferro, L; Mazzanti, M; Mc Goldrick, G; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McCubbin, N A; McFarlane, K W; Mcfayden, J A; Mchedlidze, G; Mclaughlan, T; McMahon, S J; McPherson, R A; Meade, A; Mechnich, J; Mechtel, M; Medinnis, M; Meehan, S; Meera-Lebbai, R; Mehlhase, S; Mehta, A; Meier, K; Meineck, C; Meirose, B; Melachrinos, C; Mellado Garcia, B R; Meloni, F; Mendoza Navas, L; Mengarelli, A; Menke, S; Meoni, E; Mercurio, K M; Mergelmeyer, S; Meric, N; Mermod, P; Merola, L; Meroni, C; Merritt, F 
S; Merritt, H; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer, J; Michal, S; Middleton, R P; Migas, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Miller, D W; Mills, C; Milov, A; Milstead, D A; Milstein, D; Minaenko, A A; Miñano Moya, M; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mirabelli, G; Mitani, T; Mitrevski, J; Mitsou, V A; Mitsui, S; Miyagawa, P S; Mjörnmark, J U; Moa, T; Moeller, V; Mohapatra, S; Mohr, W; Molander, S; Moles-Valls, R; Molfetas, A; Mönig, K; Monini, C; Monk, J; Monnier, E; Montejo Berlingen, J; Monticelli, F; Monzani, S; Moore, R W; Mora Herrera, C; Moraes, A; Morange, N; Morel, J; Moreno, D; Moreno Llácer, M; Morettini, P; Morgenstern, M; Morii, M; Moritz, S; Morley, A K; Mornacchi, G; Morris, J D; Morvaj, L; Moser, H G; Mosidze, M; Moss, J; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Mudd, R D; Mueller, F; Mueller, J; Mueller, K; Mueller, T; Mueller, T; Muenstermann, D; Munwes, Y; Murillo Quijada, J A; Murray, W J; Mussche, I; Musto, E; Myagkov, A G; Myska, M; Nackenhorst, O; Nadal, J; Nagai, K; Nagai, R; Nagai, Y; Nagano, K; Nagarkar, A; Nagasaka, Y; Nagel, M; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Namasivayam, H; Nanava, G; Napier, A; Narayan, R; Nash, M; Nattermann, T; Naumann, T; Navarro, G; Neal, H A; Nechaeva, P Yu; Neep, T J; Negri, A; Negri, G; Negrini, M; Nektarijevic, S; Nelson, A; Nelson, T K; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neumann, M; Neusiedl, A; Neves, R M; Nevski, P; Newcomer, F M; Newman, P R; Nguyen, D H; Nguyen Thi Hong, V; Nickerson, R B; Nicolaidou, R; Nicquevert, B; Nielsen, J; Nikiforou, N; Nikiforov, A; Nikolaenko, V; Nikolic-Audit, I; Nikolics, K; Nikolopoulos, K; Nilsson, P; Ninomiya, Y; Nisati, A; Nisius, R; Nobe, T; Nodulman, L; Nomachi, M; Nomidis, I; Norberg, S; Nordberg, M; Novakova, J; Nozaki, M; Nozka, L; Ntekas, K; Nuncio-Quiroz, A-E; Nunes Hanninger, G; Nunnemann, 
T; Nurse, E; O'Brien, B J; O'Grady, F; O'Neil, D C; O'Shea, V; Oakes, L B; Oakham, F G; Oberlack, H; Ocariz, J; Ochi, A; Ochoa, M I; Oda, S; Odaka, S; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohshima, T; Okamura, W; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Olchevski, A G; Olivares Pino, S A; Oliveira, M; Oliveira Damazio, D; Oliver Garcia, E; Olivito, D; Olszewski, A; Olszowska, J; Onofre, A; Onyisi, P U E; Oram, C J; Oreglia, M J; Oren, Y; Orestano, D; Orlando, N; Oropeza Barrera, C; Orr, R S; Osculati, B; Ospanov, R; Otero Y Garzon, G; Otono, H; Ouchrif, M; Ouellette, E A; Ould-Saada, F; Ouraou, A; Oussoren, K P; Ouyang, Q; Ovcharova, A; Owen, M; Owen, S; Ozcan, V E; Ozturk, N; Pachal, K; Pacheco Pages, A; Padilla Aranda, C; Pagan Griso, S; Paganis, E; Pahl, C; Paige, F; Pais, P; Pajchel, K; Palacino, G; Palestini, S; Pallin, D; Palma, A; Palmer, J D; Pan, Y B; Panagiotopoulou, E; Panduro Vazquez, J G; Pani, P; Panikashvili, N; Panitkin, S; Pantea, D; Papadopoulou, Th D; Papageorgiou, K; Paramonov, A; Paredes Hernandez, D; Parker, M A; Parodi, F; Parsons, J A; Parzefall, U; Pashapour, S; Pasqualucci, E; Passaggio, S; Passeri, A; Pastore, F; Pastore, Fr; Pásztor, G; Pataraia, S; Patel, N D; Pater, J R; Patricelli, S; Pauly, T; Pearce, J; Pedersen, M; Pedraza Lopez, S; Pedro, R; Peleganchuk, S V; Pelikan, D; Peng, H; Penning, B; Penwell, J; Perepelitsa, D V; Perez Cavalcanti, T; Perez Codina, E; Pérez García-Estañ, M T; Perez Reale, V; Perini, L; Pernegger, H; Perrino, R; Peschke, R; Peshekhonov, V D; Peters, K; Peters, R F Y; Petersen, B A; Petersen, J; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petrolo, E; Petrucci, F; Petteni, M; Pezoa, R; Phillips, P W; Piacquadio, G; Pianori, E; Picazio, A; Piccaro, E; Piccinini, M; Piec, S M; Piegaia, R; Pignotti, D T; Pilcher, J E; Pilkington, A D; Pina, J; Pinamonti, M; Pinder, A; Pinfold, J L; Pingel, A; Pinto, B; Pizio, C; Pleier, M-A; Pleskot, V; Plotnikova, E; Plucinski, P; Poddar, S; Podlyski, F; Poettgen, R; 
Poggioli, L; Pohl, D; Pohl, M; Polesello, G; Policicchio, A; Polifka, R; Polini, A; Pollard, C S; Polychronakos, V; Pomeroy, D; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Portell Bueso, X; Pospelov, G E; Pospisil, S; Potamianos, K; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Pozdnyakov, V; Prabhu, R; Pralavorio, P; Pranko, A; Prasad, S; Pravahan, R; Prell, S; Price, D; Price, J; Price, L E; Prieur, D; Primavera, M; Proissl, M; Prokofiev, K; Prokoshin, F; Protopapadaki, E; Protopopescu, S; Proudfoot, J; Prudent, X; Przybycien, M; Przysiezniak, H; Psoroulas, S; Ptacek, E; Pueschel, E; Puldon, D; Purohit, M; Puzo, P; Pylypchenko, Y; Qian, J; Quadt, A; Quarrie, D R; Quayle, W B; Quilty, D; Radeka, V; Radescu, V; Radhakrishnan, S K; Radloff, P; Ragusa, F; Rahal, G; Rajagopalan, S; Rammensee, M; Rammes, M; Randle-Conde, A S; Rangel-Smith, C; Rao, K; Rauscher, F; Rave, T C; Ravenscroft, T; Raymond, M; Read, A L; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Reinsch, A; Reisin, H; Reisinger, I; Relich, M; Rembser, C; Ren, Z L; Renaud, A; Rescigno, M; Resconi, S; Resende, B; Reznicek, P; Rezvani, R; Richter, R; Ridel, M; Rieck, P; Rijssenbeek, M; Rimoldi, A; Rinaldi, L; Ritsch, E; Riu, I; Rivoltella, G; Rizatdinova, F; Rizvi, E; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robson, A; Rocha de Lima, J G; Roda, C; Roda Dos Santos, D; Rodrigues, L; Roe, S; Røhne, O; Rolli, S; Romaniouk, A; Romano, M; Romeo, G; Romero Adam, E; Rompotis, N; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, A; Rose, M; Rosendahl, P L; Rosenthal, O; Rossetti, V; Rossi, E; Rossi, L P; Rosten, R; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubbo, F; Rubinskiy, I; Rud, V I; Rudolph, C; Rudolph, M S; Rühr, F; Ruiz-Martinez, A; Rumyantsev, L; Rurikova, Z; Rusakovich, N A; Ruschke, A; Rutherfoord, J P; Ruthmann, N; Ruzicka, P; Ryabov, Y F; Rybar, M; Rybkin, G; Ryder, 
N C; Saavedra, A F; Sacerdoti, S; Saddique, A; Sadeh, I; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Sakamoto, H; Sakurai, Y; Salamanna, G; Salamon, A; Saleem, M; Salek, D; Sales De Bruin, P H; Salihagic, D; Salnikov, A; Salt, J; Salvachua Ferrando, B M; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sampsonidis, D; Sanchez, A; Sánchez, J; Sanchez Martinez, V; Sandaker, H; Sander, H G; Sanders, M P; Sandhoff, M; Sandoval, T; Sandoval, C; Sandstroem, R; Sankey, D P C; Sansoni, A; Santoni, C; Santonico, R; Santos, H; Santoyo Castillo, I; Sapp, K; Sapronov, A; Saraiva, J G; Sarkisyan-Grinbaum, E; Sarrazin, B; Sartisohn, G; Sasaki, O; Sasaki, Y; Sasao, N; Satsounkevitch, I; Sauvage, G; Sauvan, E; Sauvan, J B; Savard, P; Savinov, V; Savu, D O; Sawyer, C; Sawyer, L; Saxon, D H; Saxon, J; Sbarra, C; Sbrizzi, A; Scanlon, T; Scannicchio, D A; Scarcella, M; Schaarschmidt, J; Schacht, P; Schaefer, D; Schaelicke, A; Schaepe, S; Schaetzel, S; Schäfer, U; Schaffer, A C; Schaile, D; Schamberger, R D; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Scherzer, M I; Schiavi, C; Schieck, J; Schillo, C; Schioppa, M; Schlenker, S; Schmidt, E; Schmieden, K; Schmitt, C; Schmitt, C; Schmitt, S; Schneider, B; Schnellbach, Y J; Schnoor, U; Schoeffel, L; Schoening, A; Schoenrock, B D; Schorlemmer, A L S; Schott, M; Schouten, D; Schovancova, J; Schram, M; Schramm, S; Schreyer, M; Schroeder, C; Schroer, N; Schuh, N; Schultens, M J; Schultz-Coulon, H-C; Schulz, H; Schumacher, M; Schumm, B A; Schune, Ph; Schwartzman, A; Schwegler, Ph; Schwemling, Ph; Schwienhorst, R; Schwindling, J; Schwindt, T; Schwoerer, M; Sciacca, F G; Scifo, E; Sciolla, G; Scott, W G; Scutti, F; Searcy, J; Sedov, G; Sedykh, E; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Sekula, S J; Selbach, K E; Seliverstov, D M; Sellers, G; Seman, M; Semprini-Cesari, N; Serfon, C; Serin, L; Serkin, L; Serre, T; Seuster, R; Severini, H; Sforza, F; Sfyrla, A; Shabalina, E; Shamim, M; Shan, L Y; 
Shank, J T; Shao, Q T; Shapiro, M; Shatalov, P B; Shaw, K; Sherwood, P; Shimizu, S; Shimojima, M; Shin, T; Shiyakova, M; Shmeleva, A; Shochet, M J; Short, D; Shrestha, S; Shulga, E; Shupe, M A; Shushkevich, S; Sicho, P; Sidorov, D; Sidoti, A; Siegert, F; Sijacki, Dj; Silbert, O; Silva, J; Silver, Y; Silverstein, D; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simioni, E; Simmons, B; Simoniello, R; Simonyan, M; Sinervo, P; Sinev, N B; Sipica, V; Siragusa, G; Sircar, A; Sisakyan, A N; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skinnari, L A; Skottowe, H P; Skovpen, K Yu; Skubic, P; Slater, M; Slavicek, T; Sliwa, K; Smakhtin, V; Smart, B H; Smestad, L; Smirnov, S Yu; Smirnov, Y; Smirnova, L N; Smirnova, O; Smith, K M; Smizanska, M; Smolek, K; Snesarev, A A; Snidero, G; Snow, J; Snyder, S; Sobie, R; Socher, F; Sodomka, J; Soffer, A; Soh, D A; Solans, C A; Solar, M; Solc, J; Soldatov, E Yu; Soldevila, U; Solfaroli Camillocci, E; Solodkov, A A; Solovyanov, O V; Solovyev, V; Soni, N; Sood, A; Sopko, V; Sopko, B; Sosebee, M; Soualah, R; Soueid, P; Soukharev, A M; South, D; Spagnolo, S; Spanò, F; Spearman, W R; Spighi, R; Spigo, G; Spousta, M; Spreitzer, T; Spurlock, B; St Denis, R D; Stahlman, J; Stamen, R; Stanecka, E; Stanek, R W; Stanescu, C; Stanescu-Bellu, M; Stanitzki, M M; Stapnes, S; Starchenko, E A; Stark, J; Staroba, P; Starovoitov, P; Staszewski, R; Stavina, P; Steele, G; Steinbach, P; Steinberg, P; Stekl, I; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stern, S; Stewart, G A; Stillings, J A; Stockton, M C; Stoebe, M; Stoerig, K; Stoicea, G; Stonjek, S; Stradling, A R; Straessner, A; Strandberg, J; Strandberg, S; Strandlie, A; Strauss, E; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Stroynowski, R; Stucci, S A; Stugu, B; Stumer, I; Stupak, J; Sturm, P; Styles, N A; Su, D; Su, J; Subramania, Hs; Subramaniam, R; Succurro, A; Sugaya, Y; Suhr, C; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, X; Sundermann, J E; Suruliz, K; 
Susinno, G; Sutton, M R; Suzuki, Y; Svatos, M; Swedish, S; Swiatlowski, M; Sykora, I; Sykora, T; Ta, D; Tackmann, K; Taenzer, J; Taffard, A; Tafirout, R; Taiblum, N; Takahashi, Y; Takai, H; Takashima, R; Takeda, H; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A A; Tam, J Y C; Tamsett, M C; Tan, K G; Tanaka, J; Tanaka, R; Tanaka, S; Tanaka, S; Tanasijczuk, A J; Tani, K; Tannoury, N; Tapprogge, S; Tarem, S; Tarrade, F; Tartarelli, G F; Tas, P; Tasevsky, M; Tashiro, T; Tassi, E; Tavares Delgado, A; Tayalati, Y; Taylor, C; Taylor, F E; Taylor, G N; Taylor, W; Teischinger, F A; Teixeira Dias Castanheira, M; Teixeira-Dias, P; Temming, K K; Ten Kate, H; Teng, P K; Terada, S; Terashi, K; Terron, J; Terzo, S; Testa, M; Teuscher, R J; Therhaag, J; Theveneaux-Pelzer, T; Thoma, S; Thomas, J P; Thompson, E N; Thompson, P D; Thompson, P D; Thompson, A S; Thomsen, L A; Thomson, E; Thomson, M; Thong, W M; Thun, R P; Tian, F; Tibbetts, M J; Tic, T; Tikhomirov, V O; Tikhonov, Yu A; Timoshenko, S; Tiouchichine, E; Tipton, P; Tisserant, S; Todorov, T; Todorova-Nova, S; Toggerson, B; Tojo, J; Tokár, S; Tokushuku, K; Tollefson, K; Tomlinson, L; Tomoto, M; Tompkins, L; Toms, K; Topilin, N D; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Tran, H L; Trefzger, T; Tremblet, L; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Tripiana, M F; Triplett, N; Trischuk, W; Trocmé, B; Troncon, C; Trottier-McDonald, M; Trovatelli, M; True, P; Trzebinski, M; Trzupek, A; Tsarouchas, C; Tseng, J C-L; Tsiareshka, P V; Tsionou, D; Tsipolitis, G; Tsirintanis, N; Tsiskaridze, S; Tsiskaridze, V; Tskhadadze, E G; Tsukerman, I I; Tsulaia, V; Tsung, J-W; Tsuno, S; Tsybychev, D; Tua, A; Tudorache, A; Tudorache, V; Tuggle, J M; Tuna, A N; Tupputi, S A; Turchikhin, S; Turecek, D; Turk Cakir, I; Turra, R; Tuts, P M; Tykhonov, A; Tylmad, M; Tyndel, M; Uchida, K; Ueda, I; Ueno, R; Ughetto, M; Ugland, M; Uhlenbrock, M; Ukegawa, F; Unal, G; Undrus, A; Unel, G; Ungaro, F C; Unno, Y; Urbaniec, D; 
Urquijo, P; Usai, G; Usanova, A; Vacavant, L; Vacek, V; Vachon, B; Valencic, N; Valentinetti, S; Valero, A; Valery, L; Valkar, S; Valladolid Gallego, E; Vallecorsa, S; Valls Ferrer, J A; Van Berg, R; Van Der Deijl, P C; van der Geer, R; van der Graaf, H; Van Der Leeuw, R; van der Ster, D; van Eldik, N; van Gemmeren, P; Van Nieuwkoop, J; van Vulpen, I; van Woerden, M C; Vanadia, M; Vandelli, W; Vaniachine, A; Vankov, P; Vannucci, F; Vardanyan, G; Vari, R; Varnes, E W; Varol, T; Varouchas, D; Vartapetian, A; Varvell, K E; Vassilakopoulos, V I; Vazeille, F; Vazquez Schroeder, T; Veatch, J; Veloso, F; Veneziano, S; Ventura, A; Ventura, D; Venturi, M; Venturi, N; Venturini, A; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Viazlo, O; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Vigne, R; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinogradov, V B; Virzi, J; Vitells, O; Viti, M; Vivarelli, I; Vives Vaque, F; Vlachos, S; Vladoiu, D; Vlasak, M; Vogel, A; Vokac, P; Volpi, G; Volpi, M; Volpini, G; von der Schmitt, H; von Radziewski, H; von Toerne, E; Vorobel, V; Vos, M; Voss, R; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vu Anh, T; Vuillermet, R; Vukotic, I; Vykydal, Z; Wagner, W; Wagner, P; Wahrmund, S; Wakabayashi, J; Walch, S; Walder, J; Walker, R; Walkowiak, W; Wall, R; Waller, P; Walsh, B; Wang, C; Wang, H; Wang, H; Wang, J; Wang, J; Wang, K; Wang, R; Wang, S M; Wang, T; Wang, X; Warburton, A; Ward, C P; Wardrope, D R; Warsinsky, M; Washbrook, A; Wasicki, C; Watanabe, I; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, A T; Waugh, B M; Webb, S; Weber, M S; Weber, S W; Webster, J S; Weidberg, A R; Weigell, P; Weingarten, J; Weiser, C; Weits, H; Wells, P S; Wenaus, T; Wendland, D; Weng, Z; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Wessels, M; Wetter, J; Whalen, K; White, A; White, M J; White, R; White, S; Whiteson, D; 
Whittington, D; Wicke, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wijeratne, P A; Wildauer, A; Wildt, M A; Wilhelm, I; Wilkens, H G; Will, J Z; Williams, H H; Williams, S; Willis, W; Willocq, S; Wilson, J A; Wilson, A; Wingerter-Seez, I; Winkelmann, S; Winklmeier, F; Wittgen, M; Wittig, T; Wittkowski, J; Wollstadt, S J; Wolter, M W; Wolters, H; Wong, W C; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wraight, K; Wright, M; Wu, S L; Wu, X; Wu, Y; Wulf, E; Wyatt, T R; Wynne, B M; Xella, S; Xiao, M; Xu, C; Xu, D; Xu, L; Yabsley, B; Yacoob, S; Yamada, M; Yamaguchi, H; Yamaguchi, Y; Yamamoto, A; Yamamoto, K; Yamamoto, S; Yamamura, T; Yamanaka, T; Yamauchi, K; Yamazaki, Y; Yan, Z; Yang, H; Yang, H; Yang, U K; Yang, Y; Yanush, S; Yao, L; Yasu, Y; Yatsenko, E; Yau Wong, K H; Ye, J; Ye, S; Yen, A L; Yildirim, E; Yilmaz, M; Yoosoofmiya, R; Yorita, K; Yoshida, R; Yoshihara, K; Young, C; Young, C J S; Youssef, S; Yu, D R; Yu, J; Yu, J; Yuan, L; Yurkewicz, A; Zabinski, B; Zaidan, R; Zaitsev, A M; Zaman, A; Zambito, S; Zanello, L; Zanzi, D; Zaytsev, A; Zeitnitz, C; Zeman, M; Zemla, A; Zengel, K; Zenin, O; Ženiš, T; Zerwas, D; Zevi Della Porta, G; Zhang, D; Zhang, H; Zhang, J; Zhang, L; Zhang, X; Zhang, Z; Zhao, Z; Zhemchugov, A; Zhong, J; Zhou, B; Zhou, L; Zhou, N; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zibell, A; Zieminska, D; Zimine, N I; Zimmermann, C; Zimmermann, R; Zimmermann, S; Zimmermann, S; Zinonos, Z; Ziolkowski, M; Zitoun, R; Zobernig, G; Zoccoli, A; Zur Nedden, M; Zurzolo, G; Zutshi, V; Zwalinski, L

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton-proton collision data with a centre-of-mass energy of \(\sqrt{s}=7\) TeV corresponding to an integrated luminosity of \(4.7\ \text{fb}^{-1}\). Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-\(k_{t}\) algorithm with distance parameters \(R=0.4\) or \(R=0.6\), and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a \(Z\) boson, for \(20 \le p_{\mathrm{T}}^{\mathrm{jet}} < 1000\ \mathrm{GeV}\) and pseudorapidities \(|\eta| < 4.5\). The effect of multiple proton-proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region (\(|\eta| < 1.2\)) for jets with \(55 \le p_{\mathrm{T}}^{\mathrm{jet}} < 500\ \mathrm{GeV}\). For central jets at lower \(p_{\mathrm{T}}\), the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton-proton collisions and test-beam data, which also provide the estimate for \(p_{\mathrm{T}}^{\mathrm{jet}} > 1\) TeV. The calibration of forward jets is derived from dijet \(p_{\mathrm{T}}\) balance measurements. The resulting uncertainty reaches its largest value of 6 % for low-\(p_{\mathrm{T}}\) jets at \(|\eta| = 4.5\). Additional JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5-3 %.

  9. Jet energy measurement and its systematic uncertainty in proton–proton collisions at √s = 7 TeV with the ATLAS detector

    DOE PAGES

    Aad, G.

    2015-01-15

    The jet energy scale (JES) and its systematic uncertainty are determined for jets measured with the ATLAS detector using proton–proton collision data with a centre-of-mass energy of \(\sqrt{s}=7\) TeV corresponding to an integrated luminosity of \(4.7\,\text{fb}^{-1}\). Jets are reconstructed from energy deposits forming topological clusters of calorimeter cells using the anti-\(k_{t}\) algorithm with distance parameters \(R=0.4\) or \(R=0.6\), and are calibrated using MC simulations. A residual JES correction is applied to account for differences between data and MC simulations. This correction and its systematic uncertainty are estimated using a combination of in situ techniques exploiting the transverse momentum balance between a jet and a reference object such as a photon or a \(Z\) boson, for \(20 \le p_{\mathrm{T}}^{\mathrm{jet}} < 1000~\mathrm{GeV}\) and pseudorapidities \(|\eta| < 4.5\). The effect of multiple proton–proton interactions is corrected for, and an uncertainty is evaluated using in situ techniques. The smallest JES uncertainty of less than 1 % is found in the central calorimeter region (\(|\eta| < 1.2\)) for jets with \(55 \le p_{\mathrm{T}}^{\mathrm{jet}} < 500~\mathrm{GeV}\). For central jets at lower \(p_{\mathrm{T}}\), the uncertainty is about 3 %. A consistent JES estimate is found using measurements of the calorimeter response of single hadrons in proton–proton collisions and test-beam data, which also provide the estimate for \(p_{\mathrm{T}}^{\mathrm{jet}} > 1\) TeV. The calibration of forward jets is derived from dijet \(p_{\mathrm{T}}\) balance measurements. The resulting uncertainty reaches its largest value of 6 % for low-\(p_{\mathrm{T}}\) jets at \(|\eta| = 4.5\). In addition, JES uncertainties due to specific event topologies, such as close-by jets or selections of event samples with an enhanced content of jets originating from light quarks or gluons, are also discussed. The magnitude of these uncertainties depends on the event sample used in a given physics analysis, but typically amounts to 0.5–3 %.

  10. Decision analysis for conservation breeding: Maximizing production for reintroduction of whooping cranes

    USGS Publications Warehouse

    Smith, Des H.V.; Converse, Sarah J.; Gibson, Keith; Moehrenschlager, Axel; Link, William A.; Olsen, Glenn H.; Maguire, Kelly

    2011-01-01

    Captive breeding is key to management of severely endangered species, but maximizing captive production can be challenging because of poor knowledge of species breeding biology and the complexity of evaluating different management options. In the face of uncertainty and complexity, decision-analytic approaches can be used to identify optimal management options for maximizing captive production. Building decision-analytic models requires iterations of model conception, data analysis, model building and evaluation, identification of remaining uncertainty, further research and monitoring to reduce uncertainty, and integration of new data into the model. We initiated such a process to maximize captive production of the whooping crane (Grus americana), the world's most endangered crane, which is managed through captive breeding and reintroduction. We collected 15 years of captive breeding data from 3 institutions and used Bayesian analysis and model selection to identify predictors of whooping crane hatching success. The strongest predictor, and that with clear management relevance, was incubation environment. The incubation period of whooping crane eggs is split across two environments: crane nests and artificial incubators. Although artificial incubators are useful for allowing breeding pairs to produce multiple clutches, our results indicate that crane incubation is most effective at promoting hatching success. Hatching probability increased the longer an egg spent in a crane nest, from 40% hatching probability for eggs receiving 1 day of crane incubation to 95% for those receiving 30 days (time incubated in each environment varied independently of total incubation period). Because birds will lay fewer eggs when they are incubating longer, a tradeoff exists between the number of clutches produced and egg hatching probability. 
We developed a decision-analytic model that estimated 16 to be the optimal number of days of crane incubation needed to maximize the number of offspring produced. These results show that using decision-analytic tools to account for uncertainty in captive breeding can improve the rate at which such programs contribute to wildlife reintroductions. 
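The tradeoff described above can be sketched numerically. This is a minimal illustration, not the authors' decision-analytic model: it assumes (hypothetically) that hatching probability follows a logistic curve through the two reported endpoints (40% at 1 day, 95% at 30 days of crane incubation) and that the number of clutches laid declines exponentially with incubation days; the decay rate and eggs-per-clutch values are invented for illustration.

```python
import math

# Logistic interpolation through the two reported endpoints (assumption).
def hatch_prob(days):
    logit = lambda p: math.log(p / (1 - p))
    b = (logit(0.95) - logit(0.40)) / (30 - 1)   # slope from (1, 0.40) to (30, 0.95)
    a = logit(0.40) - b * 1
    return 1 / (1 + math.exp(-(a + b * days)))

# Hypothetical tradeoff: pairs lay fewer clutches when they incubate longer.
def clutches(days, max_clutches=3.0, decay=0.03):
    return max_clutches * math.exp(-decay * days)  # assumed form and rate

def expected_chicks(days, eggs_per_clutch=2):
    return clutches(days) * eggs_per_clutch * hatch_prob(days)

# Search the feasible incubation durations for the maximum.
best_days = max(range(1, 31), key=expected_chicks)
```

Even under these made-up numbers, the expected number of offspring peaks at an interior number of incubation days, mirroring the paper's conclusion that an intermediate duration (16 days in their analysis) is optimal.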

  11. Probabilistic Integrated Assessment of ``Dangerous'' Climate Change

    NASA Astrophysics Data System (ADS)

    Mastrandrea, Michael D.; Schneider, Stephen H.

    2004-04-01

    Climate policy decisions are being made despite layers of uncertainty. Such decisions directly influence the potential for ``dangerous anthropogenic interference with the climate system.'' We mapped a metric for this concept, based on Intergovernmental Panel on Climate Change assessment of climate impacts, onto probability distributions of future climate change produced from uncertainty in key parameters of the coupled social-natural system: climate sensitivity, climate damages, and discount rate. Analyses with a simple integrated assessment model found that, under midrange assumptions, endogenously calculated, optimal climate policy controls can reduce the probability of dangerous anthropogenic interference from ~45% under minimal controls to near zero.

  12. Conflation and integration of archived geologic maps and associated uncertainties

    USGS Publications Warehouse

    Shoberg, Thomas G.

    2016-01-01

    Old, archived geologic maps are often available with little or no associated metadata. This creates special problems in terms of extracting their data to use with a modern database. This research focuses on some problems and uncertainties associated with conflating older geologic maps in regions where modern geologic maps are, as yet, non-existent as well as vertically integrating the conflated maps with layers of modern GIS data (in this case, The National Map of the U.S. Geological Survey). Ste. Genevieve County, Missouri was chosen as the test area. It is covered by six archived geologic maps constructed in the years between 1928 and 1994. Conflating these maps results in a map that is internally consistent with these six maps, is digitally integrated with hydrography, elevation and orthoimagery data, and has a 95% confidence interval useful for further data set integration.

  13. Joint analysis of input and parametric uncertainties in watershed water quality modeling: A formal Bayesian approach

    NASA Astrophysics Data System (ADS)

    Han, Feng; Zheng, Yi

    2018-06-01

    Input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
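A formal likelihood of the kind described above can be sketched as follows. This is a hedged simplification: the paper uses a Skew Exponential Power (SEP) density, for which a Gaussian kernel is substituted here for brevity, and the heteroscedasticity model (standard deviation linear in the simulated value) is an illustrative assumption, not the paper's specification.

```python
import numpy as np

def log_likelihood(obs, sim, phi, sigma_a, sigma_b):
    """Gaussian stand-in for an SEP likelihood with lag-1 autocorrelated,
    heteroscedastic residuals. phi: AR(1) coefficient; sigma_a, sigma_b:
    parameters of the assumed error model sigma = sigma_a + sigma_b*|sim|."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    res = obs - sim
    eta = res[1:] - phi * res[:-1]                # decorrelated AR(1) innovations
    sigma = sigma_a + sigma_b * np.abs(sim[1:])   # heteroscedastic scale (assumed form)
    z = eta / sigma
    return float(np.sum(-0.5 * z**2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)))
```

In an MCMC sampler such as DREAM(ZS), a function like this would be evaluated at each proposed parameter set, with the error-model parameters (phi, sigma_a, sigma_b) inferred jointly with the watershed model parameters.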

  14. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    NASA Astrophysics Data System (ADS)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
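The core idea, comparing projected image coordinates between an aligned and a misaligned system, can be illustrated with a toy 2D pinhole model. This is a hedged sketch, not the paper's model: only a source-to-detector distance (SDD) error is perturbed here, whereas the paper treats the full set of position and orientation parameters for source, rotation stage, and detector.

```python
import numpy as np

def project(points_xz, sdd):
    """Image coordinate u = sdd * x / z for a 2D cone-beam geometry with
    the source at the origin and the detector at distance z = sdd."""
    p = np.asarray(points_xz, float)
    return sdd * p[:, 0] / p[:, 1]

def misalignment_discrepancy(points_xz, sdd, sdd_error):
    """Difference in image coordinates between the perturbed and ideal geometry."""
    return project(points_xz, sdd + sdd_error) - project(points_xz, sdd)
```

Evaluating such discrepancies over surface points extracted from a CAD model, as the paper does, avoids running a full reconstruction for every perturbed geometry, which is what makes the approach computationally inexpensive.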

  15. Reducing risk and increasing confidence of decision making at a lower cost: In-situ pXRF assessment of metal-contaminated sites.

    PubMed

    Rouillon, Marek; Taylor, Mark P; Dong, Chenyin

    2017-10-01

    This study evaluates the in-situ use of field portable X-ray Fluorescence (pXRF) for metal-contaminated site assessments, and assesses the advantages of increased sampling to reduce risk, and increase confidence of decision making at a lower cost. Five metal-contaminated sites were assessed using both in-situ pXRF and ex-situ inductively coupled plasma mass spectrometry (ICP-MS) analyses at various sampling resolutions. Twenty-second in-situ pXRF measurements of Mn, Zn and Pb were corrected using a subset of parallel ICP-MS measurements taken at each site. Field and analytical duplicates revealed sampling as the major contributor (>95% variation) to measurement uncertainties. This study shows that increased sampling led to several benefits including more representative site characterisation, higher soil-metal mapping resolution, reduced uncertainty around the site mean, and reduced sampling uncertainty. Real time pXRF data enabled efficient, on-site decision making for further judgemental sampling, without the need to return to the site. Additionally, in-situ pXRF was more cost effective than the current approach of ex-situ sampling and ICP-MS analysis, even with higher sampling at each site. Lastly, a probabilistic site assessment approach was applied to demonstrate the advantages of integrating estimated measurement uncertainties into site reporting.
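The correction step described above, mapping field pXRF readings onto the laboratory ICP-MS scale using a paired subset, can be sketched as an ordinary least-squares fit. The study's exact correction model is not specified here; a linear fit is a common choice and is assumed purely for illustration.

```python
import numpy as np

def fit_correction(pxrf_subset, icpms_subset):
    """Fit a linear map from pXRF readings to the ICP-MS scale (assumed model)."""
    slope, intercept = np.polyfit(pxrf_subset, icpms_subset, 1)
    return slope, intercept

def apply_correction(pxrf_readings, slope, intercept):
    """Apply the fitted correction to the full set of in-situ readings."""
    return slope * np.asarray(pxrf_readings, float) + intercept
```

Because only a subset of samples needs laboratory confirmation, the remaining in-situ measurements can be corrected in the field, which is what enables the real-time decision making the study emphasises.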

  16. Hydraulic Conductivity Estimation using Bayesian Model Averaging and Generalized Parameterization

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Li, X.

    2006-12-01

    Non-uniqueness in parameterization scheme is an inherent problem in groundwater inverse modeling due to limited data. To cope with the non-uniqueness problem of parameterization, we introduce a Bayesian Model Averaging (BMA) method to integrate a set of selected parameterization methods. The estimation uncertainty in BMA includes the uncertainty in individual parameterization methods as the within-parameterization variance and the uncertainty from using different parameterization methods as the between-parameterization variance. Moreover, the generalized parameterization (GP) method is considered in the geostatistical framework in this study. The GP method aims at increasing the flexibility of parameterization through the combination of a zonation structure and an interpolation method. The use of BMA with GP avoids over-confidence in a single parameterization method. A normalized least-squares estimation (NLSE) is adopted to calculate the posterior probability for each GP. We employ the adjoint state method for the sensitivity analysis on the weighting coefficients in the GP method. The adjoint state method is also applied to the NLSE problem. The proposed methodology is applied to the Alamitos Barrier Project (ABP) in California, where the spatially distributed hydraulic conductivity is estimated. The optimal weighting coefficients embedded in GP are identified through the maximum likelihood estimation (MLE) where the misfits between the observed and calculated groundwater heads are minimized. The conditional mean and conditional variance of the estimated hydraulic conductivity distribution using BMA are obtained to assess the estimation uncertainty.
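The within/between variance decomposition described above is the law of total variance applied across parameterization methods, and can be sketched directly. The weights, means, and variances in the usage below are illustrative values, not results from the study.

```python
import numpy as np

def bma_moments(weights, means, variances):
    """BMA mean and variance decomposition across candidate parameterizations.
    weights: posterior model probabilities (normalized internally);
    means/variances: conditional moments from each parameterization."""
    w = np.asarray(weights, float)
    w = w / w.sum()
    mu = np.asarray(means, float)
    mean = float(np.sum(w * mu))                       # BMA conditional mean
    within = float(np.sum(w * np.asarray(variances, float)))   # within-parameterization
    between = float(np.sum(w * (mu - mean) ** 2))      # between-parameterization
    return mean, within, between
```

The total BMA estimation variance is the sum of the within and between terms; a large between-parameterization term signals that the choice of parameterization itself, not just its calibration, dominates the uncertainty.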

  17. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    PubMed

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that most influence results are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.

  18. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    PubMed

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
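The simplest of the computerized strategies listed above, string search, can be sketched as a hedging-term counter. The term list below is illustrative only, not a validated uncertainty lexicon, and a production system would likely use the NLP or machine-learning approaches the abstract also mentions.

```python
import re

# Illustrative hedging lexicon (assumption, not a validated instrument).
HEDGE_TERMS = ["possible", "probable", "cannot exclude", "cannot rule out",
               "may represent", "suspicious for"]

def uncertainty_score(report_text):
    """Count occurrences of hedging phrases as a crude per-report score."""
    text = report_text.lower()
    return sum(len(re.findall(re.escape(term), text)) for term in HEDGE_TERMS)
```

Scores like this could be aggregated per author or per report type to provide the kind of standardized, objective uncertainty characterization the article argues for.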

  19. Patients' and partners' perspectives of chronic illness and its management.

    PubMed

    Checton, Maria G; Greene, Kathryn; Magsamen-Conrad, Kate; Venetis, Maria K

    2012-06-01

    This study is framed in theories of illness uncertainty (Babrow, A. S., 2007, Problematic integration theory. In B. B. Whaley & W. Samter (Eds.), Explaining communication: Contemporary theories and exemplars (pp. 181-200). Mahwah, NJ: Erlbaum; Babrow & Matthias, 2009; Brashers, D. E., 2007, A theory of communication and uncertainty management. In B. B. Whaley & W. Samter (Eds.), Explaining communication: Contemporary theories and exemplars (pp. 201-218). Mahwah, NJ: Erlbaum; Hogan, T. P., & Brashers, D. E. (2009). The theory of communication and uncertainty management: Implications for the wider realm of information behavior. In T. D. Afifi & W. A. Afifi (Eds.), Uncertainty and information regulation in interpersonal contexts: Theories and applications, (pp. 45-66). New York, NY: Routledge; Mishel, M. H. (1999). Uncertainty in chronic illness. Annual Review of Nursing Research, 17, 269-294; Mishel, M. H., & Clayton, M. F., 2003, Theories of uncertainty. In M. J. Smith & P. R. Liehr (Eds.), Middle range theory for nursing (pp. 25-48). New York, NY: Springer) and health information management (Afifi, W. A., & Weiner, J. L., 2004, Toward a theory of motivated information management. Communication Theory, 14, 167-190. doi:10.1111/j.1468-2885.2004.tb00310.x; Greene, K., 2009, An integrated model of health disclosure decision-making. In T. D. Afifi & W. A. Afifi (Eds.), Uncertainty and information regulation in interpersonal contexts: Theories and applications (pp. 226-253). New York, NY: Routledge) and examines how couples experience uncertainty and interference related to one partner's chronic health condition. Specifically, a model is hypothesized in which illness uncertainty (i.e., stigma, prognosis, and symptom) and illness interference predict communication efficacy and health condition management. Participants include 308 dyads in which one partner has a chronic health condition. Data were analyzed using structural equation modeling. 
Results indicate that there are significant differences in (a) how patients and partners experience illness uncertainty and illness interference and (b) how appraisals of illness uncertainty and illness interference influence communication efficacy and health condition management. We discuss the findings and implications of the study.

  20. Advanced Computational Framework for Environmental Management ZEM, Version 1.x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vesselinov, Velimir V.; O'Malley, Daniel; Pandey, Sachin

    2016-11-04

    Typically environmental management problems require analysis of large and complex data sets originating from concurrent data streams with different data collection frequencies and pedigree. These big data sets require on-the-fly integration into a series of models with different complexity for various types of model analyses where the data are applied as soft and hard model constraints. This is needed to provide fast iterative model analyses based on the latest available data to guide decision-making. Furthermore, the data and model are associated with uncertainties. The uncertainties are probabilistic (e.g. measurement errors) and non-probabilistic (unknowns, e.g. alternative conceptual models characterizing site conditions). To address all of these issues, we have developed an integrated framework for real-time data and model analyses for environmental decision-making called ZEM. The framework allows for seamless and on-the-fly integration of data and modeling results for robust and scientifically defensible decision-making applying advanced decision analysis tools such as Bayesian Information-Gap Decision Theory (BIG-DT). The framework also includes advanced methods for optimization that are capable of dealing with a large number of unknown model parameters, and surrogate (reduced-order) modeling capabilities based on support vector regression techniques. The framework is coded in Julia, a state-of-the-art high-performance programming language (http://julialang.org). The ZEM framework is open-source, released under the GPL v3 license, and can be applied to any environmental management site.
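The robustness question at the heart of info-gap decision theory (the BIG-DT component mentioned above) can be sketched numerically: how large can the uncertainty horizon grow before a decision's worst-case performance falls below a requirement? This is a hedged illustration, not ZEM's implementation: it assumes a simple interval uncertainty model and that the worst case sits at an interval endpoint, which holds for monotone performance functions.

```python
def robustness(performance, nominal, requirement, alphas):
    """Largest uncertainty horizon alpha (from candidates `alphas`) such that
    the worst-case performance over |u - nominal| <= alpha still meets the
    requirement. Endpoint evaluation assumes monotone performance."""
    best = 0.0
    for a in sorted(alphas):
        worst = min(performance(nominal - a), performance(nominal + a))
        if worst >= requirement:
            best = a
    return best
```

A decision with a larger robustness horizon tolerates more deviation between the model and reality before failing its requirement, which is the sense in which BIG-DT supports "scientifically defensible" choices under non-probabilistic uncertainty.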
