Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods, used in pharmaceutical analysis, consists of several components. The analysis of the most important sources of the quantitative microbiological methods variability demonstrated no effect of culture media and plate-count techniques in the estimation of microbial count while the highly significant effect of other factors (type of microorganism, pharmaceutical product and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically the combined relative uncertainty of the test results was estimated both for method of quantitative examination of non-sterile pharmaceuticals and microbial count technique without any product. These data did not exceed 35%, appropriated for a traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ciurean, R. L.; Glade, T.
2012-04-01
Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex and the "degree of loss" estimates imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulted observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, development of visualization tools, but also address basic questions like what is uncertainty and how uncertainty can be quantified or treated in a reliable and reproducible way.
Relating Data and Models to Characterize Parameter and Prediction Uncertainty
Applying PBPK models in risk analysis requires that we realistically assess the uncertainty of relevant model predictions in as quantitative a way as possible. The reality of human variability may add a confusing feature to the overall uncertainty assessment, as uncertainty and v...
Identifying influences on model uncertainty: an application using a forest carbon budget model
James E. Smith; Linda S. Heath
2001-01-01
Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...
Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.
Smith, Anne E; Gans, Will
2015-03-01
The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5 ). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
Ledford, Christy J W; Cafferty, Lauren A; Seehusen, Dean A
2015-01-01
Uncertainty is a central theme in the practice of medicine and particularly primary care. This study explored how family medicine resident physicians react to uncertainty in their practice. This study incorporated a two-phase mixed methods approach, including semi-structured personal interviews (n=21) and longitudinal self-report surveys (n=21) with family medicine residents. Qualitative analysis showed that though residents described uncertainty as an implicit part of their identity, they still developed tactics to minimize or manage uncertainty in their practice. Residents described increasing comfort with uncertainty the longer they practiced and anticipated that growth continuing throughout their careers. Quantitative surveys showed that reactions to uncertainty were more positive over time; however, the difference was not statistically significant. Qualitative and quantitative results show that as family medicine residents practice medicine their perception of uncertainty changes. To reduce uncertainty, residents use relational information-seeking strategies. From a broader view of practice, residents describe uncertainty neutrally, asserting that uncertainty is simply part of the practice of family medicine.
Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
Lognormal Uncertainty Estimation for Failure Rates
NASA Technical Reports Server (NTRS)
Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.
2017-01-01
"Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Lippa, Katrice A.
2016-10-01
Chemical purity assessment using quantitative 1H-nuclear magnetic resonance spectroscopy is a method based on ratio references of mass and signal intensity of the analyte species to that of chemical standards of known purity. As such, it is an example of a calculation using a known measurement equation with multiple inputs. Though multiple samples are often analyzed during purity evaluations in order to assess measurement repeatability, the uncertainty evaluation must also account for contributions from inputs to the measurement equation. Furthermore, there may be other uncertainty components inherent in the experimental design, such as independent implementation of multiple calibration standards. As such, the uncertainty evaluation is not purely bottom up (based on the measurement equation) or top down (based on the experimental design), but inherently contains elements of both. This hybrid form of uncertainty analysis is readily implemented with Bayesian statistical analysis. In this article we describe this type of analysis in detail and illustrate it using data from an evaluation of chemical purity and its uncertainty for a folic acid material.
A subagging regression method for estimating the qualitative and quantitative state of groundwater
NASA Astrophysics Data System (ADS)
Jeong, J.; Park, E.; Choi, J.; Han, W. S.; Yun, S. T.
2016-12-01
A subagging regression (SBR) method for the analysis of groundwater data pertaining to the estimation of trend and the associated uncertainty is proposed. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracies of the SBR method are consistent and superior to those of the other methods and the uncertainties are reasonably estimated where the others have no uncertainty analysis option. To validate further, real quantitative and qualitative data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by SBR, whereas the GPR has limitations in representing the variability of non-Gaussian skewed data. From the implementations, it is determined that the SBR method has potential to be further developed as an effective tool of anomaly detection or outlier identification in groundwater state data.
Reiner, Bruce I
2018-04-01
Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
Hamui-Sutton, Alicia; Vives-Varela, Tania; Gutiérrez-Barreto, Samuel; Leenen, Iwin; Sánchez-Mendiola, Melchor
2015-11-04
Medical uncertainty is inherently related to the practice of the physician and generally affects his or her patient care, job satisfaction, continuing education, as well as the overall goals of the health care system. In this paper, some new types of uncertainty, which extend existing typologies, are identified and the contexts and strategies to deal with them are studied. We carried out a mixed-methods study, consisting of a qualitative and a quantitative phase. For the qualitative study, 128 residents reported critical incidents in their clinical practice and described how they coped with the uncertainty in the situation. Each critical incident was analyzed and the most salient situations, 45 in total, were retained. In the quantitative phase, a distinct group of 120 medical residents indicated for each of these situations whether they have been involved in the described situations and, if so, which coping strategy they applied. The analysis examines the relation between characteristics of the situation and the coping strategies. From the qualitative study, a new typology of uncertainty was derived which distinguishes between technical, conceptual, communicational, systemic, and ethical uncertainty. The quantitative analysis showed that, independently of the type of uncertainty, critical incidents are most frequently resolved by consulting senior physicians (49 % overall), which underscores the importance of the hierarchical relationships in the hospital. The insights gained by this study are combined into an integrative model of uncertainty in medical residencies, which combines the type and perceived level of uncertainty, the strategies employed to deal with it, and context elements such as the actors present in the situation. The model considers the final resolution at each of three levels: the patient, the health system, and the physician's personal level. This study gives insight into how medical residents make decisions under different types of uncertainty, giving account of the context in which the interactions take place and of the strategies used to resolve the incidents. These insights may guide the development of organizational policies that reduce uncertainty and stress in residents during their clinical training.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, D.W.; Yambert, M.W.; Kocher, D.C.
1994-12-31
A performance assessment of the operating Solid Waste Storage Area 6 (SWSA 6) facility for the disposal of low-level radioactive waste at the Oak Ridge National Laboratory has been prepared to provide the technical basis for demonstrating compliance with the performance objectives of DOE Order 5820.2A, Chapter 111.2 An analysis of the uncertainty incorporated into the assessment was performed which addressed the quantitative uncertainty in the data used by the models, the subjective uncertainty associated with the models used for assessing performance of the disposal facility and site, and the uncertainty in the models used for estimating dose and humanmore » exposure. The results of the uncertainty analysis were used to interpret results and to formulate conclusions about the performance assessment. This paper discusses the approach taken in analyzing the uncertainty in the performance assessment and the role of uncertainty in performance assessment.« less
Measurement uncertainty of liquid chromatographic analyses visualized by Ishikawa diagrams.
Meyer, Veronika R
2003-09-01
Ishikawa, or cause-and-effect diagrams, help to visualize the parameters that influence a chromatographic analysis. Therefore, they facilitate the set up of the uncertainty budget of the analysis, which can then be expressed in mathematical form. If the uncertainty is calculated as the Gaussian sum of all uncertainty parameters, it is necessary to quantitate them all, a task that is usually not practical. The other possible approach is to use the intermediate precision as a base for the uncertainty calculation. In this case, it is at least necessary to consider the uncertainty of the purity of the reference material in addition to the precision data. The Ishikawa diagram is then very simple, and so is the uncertainty calculation. This advantage is given by the loss of information about the parameters that influence the measurement uncertainty.
Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment
NASA Technical Reports Server (NTRS)
Britton, Paul; Al Hassan, Mohammad; Ring, Robert
2017-01-01
Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.
Error and Uncertainty Analysis for Ecological Modeling and Simulation
2001-12-01
management (LRAM) accounting for environmental, training, and economic factors. In the ELVS methodology, soil erosion status is used as a quantitative...Monte-Carlo approach. The optimization is realized through economic functions or on decision constraints, such as, unit sample cost, number of samples... nitrate flux to the Gulf of Mexico. Nature (Brief Communication) 414: 166-167. (Uncertainty analysis done with SERDP software) Gertner, G., G
Chardon, Jurgen; Swart, Arno
2016-07-01
In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kurtz, Sarah; Repins, Ingrid L; Hacke, Peter L
Continued growth of PV system deployment would be enhanced by quantitative, low-uncertainty predictions of the degradation and failure rates of PV modules and systems. The intended product lifetime (decades) far exceeds the product development cycle (months), limiting our ability to reduce the uncertainty of the predictions for this rapidly changing technology. Yet, business decisions (setting insurance rates, analyzing return on investment, etc.) require quantitative risk assessment. Moving toward more quantitative assessments requires consideration of many factors, including the intended application, consequence of a possible failure, variability in the manufacturing, installation, and operation, as well as uncertainty in the measured accelerationmore » factors, which provide the basis for predictions based on accelerated tests. As the industry matures, it is useful to periodically assess the overall strategy for standards development and prioritization of research to provide a technical basis both for the standards and the analysis related to the application of those. To this end, this paper suggests a tiered approach to creating risk assessments. Recent and planned potential improvements in international standards are also summarized.« less
Uncertainty Analysis for Angle Calibrations Using Circle Closure
Estler, W. Tyler
1998-01-01
We analyze two types of full-circle angle calibrations: a simple closure in which a single set of unknown angular segments is sequentially compared with an unknown reference angle, and a dual closure in which two divided circles are simultaneously calibrated by intercomparison. In each case, the constraint of circle closure provides auxiliary information that (1) enables a complete calibration process without reference to separately calibrated reference artifacts, and (2) serves to reduce measurement uncertainty. We derive closed-form expressions for the combined standard uncertainties of angle calibrations, following guidelines published by the International Organization for Standardization (ISO) and NIST. The analysis includes methods for the quantitative evaluation of the standard uncertainty of small angle measurement using electronic autocollimators, including the effects of calibration uncertainty and air turbulence. PMID:28009359
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schmittroth, F.
1979-09-01
A documentation of the FERRET data analysis code is given. The code provides a way to combine related measurements and calculations in a consistent evaluation. Basically a very general least-squares code, it is oriented towards problems frequently encountered in nuclear data and reactor physics. A strong emphasis is on the proper treatment of uncertainties and correlations and in providing quantitative uncertainty estimates. Documentation includes a review of the method, structure of the code, input formats, and examples.
NASA Technical Reports Server (NTRS)
Karandikar, Harsh M.
1997-01-01
An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
NASA Astrophysics Data System (ADS)
Cabalín, L. M.; González, A.; Ruiz, J.; Laserna, J. J.
2010-08-01
Statistical uncertainty in the quantitative analysis of solid samples in motion by laser-induced breakdown spectroscopy (LIBS) has been assessed. For this purpose, a LIBS demonstrator was designed and constructed in our laboratory. The LIBS system consisted of a laboratory-scale conveyor belt, a compact optical module and a Nd:YAG laser operating at 532 nm. The speed of the conveyor belt was variable and could be adjusted up to a maximum speed of 2 m s - 1 . Statistical uncertainty in the analytical measurements was estimated in terms of precision (reproducibility and repeatability) and accuracy. The results obtained by LIBS on shredded scrap samples under real conditions have demonstrated that the analytical precision and accuracy of LIBS is dependent on the sample geometry, position on the conveyor belt and surface cleanliness. Flat, relatively clean scrap samples exhibited acceptable reproducibility and repeatability; by contrast, samples with an irregular shape or a dirty surface exhibited a poor relative standard deviation.
NASA Astrophysics Data System (ADS)
Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.
2017-07-01
The uncertainties and the stability of a neutron sensitive MCP/Timepix detector when operating in the event timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant component to the uncertainty arises from the counting statistics. The contribution of the overlap correction to the uncertainty was concluded to be negligible from considerations based on the error propagation even if a pixel occupation probability is more than 50%. We, additionally, have taken into account the multiple counting effect in consideration of the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to the ageing effects of current Microchannel Plates. Since this efficiency change is position-dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using the rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.
NASA Astrophysics Data System (ADS)
Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2017-04-01
Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically just accounted for implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties are possibly reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and effectivity of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gauntt, Randall O.; Mattie, Patrick D.; Bixler, Nathan E.
2014-02-01
This paper describes the knowledge advancements from the uncertainty analysis for the State-of- the-Art Reactor Consequence Analyses (SOARCA) unmitigated long-term station blackout accident scenario at the Peach Bottom Atomic Power Station. This work assessed key MELCOR and MELCOR Accident Consequence Code System, Version 2 (MACCS2) modeling uncertainties in an integrated fashion to quantify the relative importance of each uncertain input on potential accident progression, radiological releases, and off-site consequences. This quantitative uncertainty analysis provides measures of the effects on consequences, of each of the selected uncertain parameters both individually and in interaction with other parameters. The results measure the modelmore » response (e.g., variance in the output) to uncertainty in the selected input. Investigation into the important uncertain parameters in turn yields insights into important phenomena for accident progression and off-site consequences. This uncertainty analysis confirmed the known importance of some parameters, such as failure rate of the Safety Relief Valve in accident progression modeling and the dry deposition velocity in off-site consequence modeling. The analysis also revealed some new insights, such as dependent effect of cesium chemical form for different accident progressions. (auth)« less
Advancing the Fork detector for quantitative spent nuclear fuel verification
Vaccaro, S.; Gauld, I. C.; Hu, J.; ...
2018-01-31
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations.more » A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.« less
Advancing the Fork detector for quantitative spent nuclear fuel verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaccaro, S.; Gauld, I. C.; Hu, J.
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations.more » A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. Finally, the results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.« less
Advancing the Fork detector for quantitative spent nuclear fuel verification
NASA Astrophysics Data System (ADS)
Vaccaro, S.; Gauld, I. C.; Hu, J.; De Baere, P.; Peterson, J.; Schwalbach, P.; Smejkal, A.; Tomanin, A.; Sjöland, A.; Tobin, S.; Wiarda, D.
2018-04-01
The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This paper describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false positive alarms. The results are summarized, and sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
Good practices for quantitative bias analysis.
Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander
2014-12-01
Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?. We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
Extrapolation, uncertainty factors, and the precautionary principle.
Steel, Daniel
2011-09-01
This essay examines the relationship between the precautionary principle and uncertainty factors used by toxicologists to estimate acceptable exposure levels for toxic chemicals from animal experiments. It shows that the adoption of uncertainty factors in the United States in the 1950s can be understood by reference to the precautionary principle, but not by cost-benefit analysis because of a lack of relevant quantitative data at that time. In addition, it argues that uncertainty factors continue to be relevant to efforts to implement the precautionary principle and that the precautionary principle should not be restricted to cases involving unquantifiable hazards. Copyright © 2011 Elsevier Ltd. All rights reserved.
Overall uncertainty measurement for near infrared analysis of cryptotanshinone in tanshinone extract
NASA Astrophysics Data System (ADS)
Xue, Zhong; Xu, Bing; Shi, Xinyuan; Yang, Chan; Cui, Xianglong; Luo, Gan; Qiao, Yanjiang
2017-01-01
This study presented a new strategy of overall uncertainty measurement for near infrared (NIR) quantitative analysis of cryptotanshinone in tanshinone extract powders. The overall uncertainty of NIR analysis from validation data of precision, trueness and robustness study was fully investigated and discussed. Quality by design (QbD) elements, such as risk assessment and design of experiment (DOE) were utilized to organize the validation data. An "I × J × K" (series I, the number of repetitions J and level of concentrations K) full factorial design was used to calculate uncertainty from the precision and trueness data. And a 27-4 Plackett-Burmann matrix with four different influence factors resulted from the failure mode and effect analysis (FMEA) analysis was adapted for the robustness study. The overall uncertainty profile was introduced as a graphical decision making tool to evaluate the validity of NIR method over the predefined concentration range. In comparison with the T. Saffaj's method (Analyst, 2013, 138, 4677.) for overall uncertainty assessment, the proposed approach gave almost the same results, demonstrating that the proposed method was reasonable and valid. Moreover, the proposed method can help identify critical factors that influence the NIR prediction performance, which could be used for further optimization of the NIR analytical procedures in routine use.
NASA Technical Reports Server (NTRS)
Fehrman, A. L.; Masek, R. V.
1972-01-01
Quantitative estimates of the uncertainty in predicting aerodynamic heating rates for a fully reusable space shuttle system are developed and the impact of these uncertainties on Thermal Protection System (TPS) weight are discussed. The study approach consisted of statistical evaluations of the scatter of heating data on shuttle configurations about state-of-the-art heating prediction methods to define the uncertainty in these heating predictions. The uncertainties were then applied as heating rate increments to the nominal predicted heating rate to define the uncertainty in TPS weight. Separate evaluations were made for the booster and orbiter, for trajectories which included boost through reentry and touchdown. For purposes of analysis, the vehicle configuration is divided into areas in which a given prediction method is expected to apply, and separate uncertainty factors and corresponding uncertainty in TPS weight derived for each area.
Ji, Xiaoliang; Xie, Runting; Hao, Yun; Lu, Jun
2017-10-01
Quantitative identification of nitrate (NO 3 - -N) sources is critical to the control of nonpoint source nitrogen pollution in an agricultural watershed. Combined with water quality monitoring, we adopted the environmental isotope (δD-H 2 O, δ 18 O-H 2 O, δ 15 N-NO 3 - , and δ 18 O-NO 3 - ) analysis and the Markov Chain Monte Carlo (MCMC) mixing model to determine the proportions of riverine NO 3 - -N inputs from four potential NO 3 - -N sources, namely, atmospheric deposition (AD), chemical nitrogen fertilizer (NF), soil nitrogen (SN), and manure and sewage (M&S), in the ChangLe River watershed of eastern China. Results showed that NO 3 - -N was the main form of nitrogen in this watershed, accounting for approximately 74% of the total nitrogen concentration. A strong hydraulic interaction existed between the surface and groundwater for NO 3 - -N pollution. The variations of the isotopic composition in NO 3 - -N suggested that microbial nitrification was the dominant nitrogen transformation process in surface water, whereas significant denitrification was observed in groundwater. MCMC mixing model outputs revealed that M&S was the predominant contributor to riverine NO 3 - -N pollution (contributing 41.8% on average), followed by SN (34.0%), NF (21.9%), and AD (2.3%) sources. Finally, we constructed an uncertainty index, UI 90 , to quantitatively characterize the uncertainties inherent in NO 3 - -N source apportionment and discussed the reasons behind the uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Huda, J.; Kauneckis, D. L.
2013-12-01
Climate change adaptation represents a number of unique policy-making challenges. Foremost among these is dealing with the range of future climate impacts to a wide scope of inter-related natural systems, their interaction with social and economic systems, and uncertainty resulting from the variety of downscaled climate model scenarios and climate science projections. These cascades of uncertainty have led to a number of new approaches as well as a reexamination of traditional methods for evaluating risk and uncertainty in policy-making. Policy makers are required to make decisions and formulate policy irrespective of the level of uncertainty involved and while a debate continues regarding the level of scientific certainty required in order to make a decision, incremental change in the climate policy continues at multiple governance levels. This project conducts a comparative analysis of the range of methodological approaches that are evolving to address uncertainty in climate change policy. It defines 'methodologies' to include a variety of quantitative and qualitative approaches involving both top-down and bottom-up policy processes that attempt to enable policymakers to synthesize climate information into the policy process. The analysis examines methodological approaches to decision-making in climate policy based on criteria such as sources of policy choice information, sectors to which the methodology has been applied, sources from which climate projections were derived, quantitative and qualitative methods used to deal with uncertainty, and the benefits and limitations of each. A typology is developed to better categorize the variety of approaches and methods, examine the scope of policy activities they are best suited for, and highlight areas for future research and development.
Macarthur, Roy; Feinberg, Max; Bertheau, Yves
2010-01-01
A method is presented for estimating the size of uncertainty associated with the measurement of products derived from genetically modified organisms (GMOs). The method is based on the uncertainty profile, which is an extension, for the estimation of uncertainty, of a recent graphical statistical tool called an accuracy profile that was developed for the validation of quantitative analytical methods. The application of uncertainty profiles as an aid to decision making and assessment of fitness for purpose is also presented. Results of the measurement of the quantity of GMOs in flour by PCR-based methods collected through a number of interlaboratory studies followed the log-normal distribution. Uncertainty profiles built using the results generally give an expected range for measurement results of 50-200% of reference concentrations for materials that contain at least 1% GMO. This range is consistent with European Network of GM Laboratories and the European Union (EU) Community Reference Laboratory validation criteria and can be used as a fitness for purpose criterion for measurement methods. The effect on the enforcement of EU labeling regulations is that, in general, an individual analytical result needs to be < 0.45% to demonstrate compliance, and > 1.8% to demonstrate noncompliance with a labeling threshold of 0.9%.
A methodology to estimate uncertainty for emission projections through sensitivity analysis.
Lumbreras, Julio; de Andrés, Juan Manuel; Pérez, Javier; Borge, Rafael; de la Paz, David; Rodríguez, María Encarnación
2015-04-01
Air pollution abatement policies must be based on quantitative information on current and future emissions of pollutants. As emission projections uncertainties are inevitable and traditional statistical treatments of uncertainty are highly time/resources consuming, a simplified methodology for nonstatistical uncertainty estimation based on sensitivity analysis is presented in this work. The methodology was applied to the "with measures" scenario for Spain, concretely over the 12 highest emitting sectors regarding greenhouse gas and air pollutants emissions. Examples of methodology application for two important sectors (power plants, and agriculture and livestock) are shown and explained in depth. Uncertainty bands were obtained up to 2020 by modifying the driving factors of the 12 selected sectors and the methodology was tested against a recomputed emission trend in a low economic-growth perspective and official figures for 2010, showing a very good performance. A solid understanding and quantification of uncertainties related to atmospheric emission inventories and projections provide useful information for policy negotiations. However, as many of those uncertainties are irreducible, there is an interest on how they could be managed in order to derive robust policy conclusions. Taking this into account, a method developed to use sensitivity analysis as a source of information to derive nonstatistical uncertainty bands for emission projections is presented and applied to Spain. This method simplifies uncertainty assessment and allows other countries to take advantage of their sensitivity analyses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gigase, Yves
2007-07-01
Available in abstract form only. Full text of publication follows: The uncertainty on characteristics of radioactive LILW waste packages is difficult to determine and often very large. This results from a lack of knowledge of the constitution of the waste package and of the composition of the radioactive sources inside. To calculate a quantitative estimate of the uncertainty on a characteristic of a waste package one has to combine these various uncertainties. This paper discusses an approach to this problem, based on the use of the log-normal distribution, which is both elegant and easy to use. It can provide asmore » example quantitative estimates of uncertainty intervals that 'make sense'. The purpose is to develop a pragmatic approach that can be integrated into existing characterization methods. In this paper we show how our method can be applied to the scaling factor method. We also explain how it can be used when estimating other more complex characteristics such as the total uncertainty of a collection of waste packages. This method could have applications in radioactive waste management, more in particular in those decision processes where the uncertainty on the amount of activity is considered to be important such as in probability risk assessment or the definition of criteria for acceptance or categorization. (author)« less
Framing of Uncertainty in Scientific Publications: Towards Recommendations for Decision Support
NASA Astrophysics Data System (ADS)
Guillaume, J. H. A.; Helgeson, C.; Elsawah, S.; Jakeman, A. J.; Kummu, M.
2016-12-01
Uncertainty is recognised as an essential issue in environmental decision making and decision support. As modellers, we notably use a variety of tools and techniques within an analysis, for example related to uncertainty quantification and model validation. We also address uncertainty by how we present results. For example, experienced modellers are careful to distinguish robust conclusions from those that need further work, and the precision of quantitative results is tailored to their accuracy. In doing so, the modeller frames how uncertainty should be interpreted by their audience. This is an area which extends beyond modelling to fields such as philosophy of science, semantics, discourse analysis, intercultural communication and rhetoric. We propose that framing of uncertainty deserves greater attention in the context of decision support, and that there are opportunities in this area for fundamental research, synthesis and knowledge transfer, development of teaching curricula, and significant advances in managing uncertainty in decision making. This presentation reports preliminary results of a study of framing practices. Specifically, we analyse the framing of uncertainty that is visible in the abstracts from a corpus of scientific articles. We do this through textual analysis of the content and structure of those abstracts. Each finding that appears in an abstract is classified according to the uncertainty framing approach used, using a classification scheme that was iteratively revised based on reflection and comparison amongst three coders. This analysis indicates how frequently the different framing approaches are used, and provides initial insights into relationships between frames, how the frames relate to interpretation of uncertainty, and how rhetorical devices are used by modellers to communicate uncertainty in their work. We propose initial hypotheses for how the resulting insights might influence decision support, and help advance decision making to better address uncertainty.
Puhan, Milo A; Yu, Tsung; Boyd, Cynthia M; Ter Riet, Gerben
2015-07-02
When faced with uncertainties about the effects of medical interventions regulatory agencies, guideline developers, clinicians, and researchers commonly ask for more research, and in particular for more randomized trials. The conduct of additional randomized trials is, however, sometimes not the most efficient way to reduce uncertainty. Instead, approaches such as value of information analysis or other approaches should be used to prioritize research that will most likely reduce uncertainty and inform decisions. In situations where additional research for specific interventions needs to be prioritized, we propose the use of quantitative benefit-harm assessments that illustrate how the benefit-harm balance may change as a consequence of additional research. The example of roflumilast for patients with chronic obstructive pulmonary disease shows that additional research on patient preferences (e.g., how important are exacerbations relative to psychiatric harms?) or outcome risks (e.g., what is the incidence of psychiatric outcomes in patients with chronic obstructive pulmonary disease without treatment?) is sometimes more valuable than additional randomized trials. We propose that quantitative benefit-harm assessments have the potential to explore the impact of additional research and to identify research priorities Our approach may be seen as another type of value of information analysis and as a useful approach to stimulate specific new research that has the potential to change current estimates of the benefit-harm balance and decision making.
[Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].
Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie
2013-11-01
In order to improve the quantitative analysis accuracy of AES, We associated XPS with AES and studied the method to reduce the error of AES quantitative analysis, selected Pt-Co, Cu-Au and Cu-Ag binary alloy thin-films as the samples, used XPS to correct AES quantitative analysis results by changing the auger sensitivity factors to make their quantitative analysis results more similar. Then we verified the accuracy of the quantitative analysis of AES when using the revised sensitivity factors by other samples with different composition ratio, and the results showed that the corrected relative sensitivity factors can reduce the error in quantitative analysis of AES to less than 10%. Peak defining is difficult in the form of the integral spectrum of AES analysis since choosing the starting point and ending point when determining the characteristic auger peak intensity area with great uncertainty, and to make analysis easier, we also processed data in the form of the differential spectrum, made quantitative analysis on the basis of peak to peak height instead of peak area, corrected the relative sensitivity factors, and verified the accuracy of quantitative analysis by the other samples with different composition ratio. The result showed that the analytical error in quantitative analysis of AES reduced to less than 9%. It showed that the accuracy of AES quantitative analysis can be highly improved by the way of associating XPS with AES to correct the auger sensitivity factors since the matrix effects are taken into account. Good consistency was presented, proving the feasibility of this method.
Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin
NASA Astrophysics Data System (ADS)
Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.
2013-12-01
Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
NASA Astrophysics Data System (ADS)
Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong
2018-02-01
Quantitative analysis methods of multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray Photoelectron Spectroscopy (XPS). Intensity-to-composition conversion factor (ICF) was used as a mean to convert the intensities to compositions instead of the relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, average intensity (AI) method and total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured by oxygen ion beams were not quantitative due to severe matrix effect, the results by cesium ion beam were very quantitative. The quantitative analysis results by SIMS using MCs2+ ions are comparable to the results by XPS. In the case of XPS, the measurement uncertainty was highly improved by the AI method and TNC method.
NASA Astrophysics Data System (ADS)
Takatsuka, Toshiko; Hirata, Kouichi; Kobayashi, Yoshinori; Kuroiwa, Takayoshi; Miura, Tsutomu; Matsue, Hideaki
2008-11-01
Certified reference materials (CRMs) of shallow arsenic implants in silicon are now under development at the National Metrology Institute of Japan (NMIJ). The amount of ion-implanted arsenic atoms is quantified by Instrumental Neutron Activation Analysis (INAA) using research reactor JRR-3 in Japan Atomic Energy Agency (JAEA). It is found that this method can evaluate arsenic amounts of 1015 atoms/cm2 with small uncertainties, and is adaptable to shallower dopants. The estimated uncertainties can satisfy the industrial demands for reference materials to calibrate the implanted dose of arsenic at shallow junctions.
Not Normal: the uncertainties of scientific measurements
NASA Astrophysics Data System (ADS)
Bailey, David C.
2017-01-01
Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
Not Normal: the uncertainties of scientific measurements
2017-01-01
Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student’s t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply. PMID:28280557
Prioritizing Risks and Uncertainties from Intentional Release of Selected Category A Pathogens
Hong, Tao; Gurian, Patrick L.; Huang, Yin; Haas, Charles N.
2012-01-01
This paper synthesizes available information on five Category A pathogens (Bacillus anthracis, Yersinia pestis, Francisella tularensis, Variola major and Lassa) to develop quantitative guidelines for how environmental pathogen concentrations may be related to human health risk in an indoor environment. An integrated model of environmental transport and human health exposure to biological pathogens is constructed which 1) includes the effects of environmental attenuation, 2) considers fomite contact exposure as well as inhalational exposure, and 3) includes an uncertainty analysis to identify key input uncertainties, which may inform future research directions. The findings provide a framework for developing the many different environmental standards that are needed for making risk-informed response decisions, such as when prophylactic antibiotics should be distributed, and whether or not a contaminated area should be cleaned up. The approach is based on the assumption of uniform mixing in environmental compartments and is thus applicable to areas sufficiently removed in time and space from the initial release that mixing has produced relatively uniform concentrations. Results indicate that when pathogens are released into the air, risk from inhalation is the main component of the overall risk, while risk from ingestion (dermal contact for B. anthracis) is the main component of the overall risk when pathogens are present on surfaces. Concentrations sampled from untracked floor, walls and the filter of heating ventilation and air conditioning (HVAC) system are proposed as indicators of previous exposure risk, while samples taken from touched surfaces are proposed as indicators of future risk if the building is reoccupied. A Monte Carlo uncertainty analysis is conducted and input-output correlations used to identify important parameter uncertainties. An approach is proposed for integrating these quantitative assessments of parameter uncertainty with broader, qualitative considerations to identify future research priorities. PMID:22412915
Imprecise Probability Methods for Weapons UQ
DOE Office of Scientific and Technical Information (OSTI.GOV)
Picard, Richard Roy; Vander Wiel, Scott Alan
Building on recent work in uncertainty quanti cation, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certi cation are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.
NASA Astrophysics Data System (ADS)
Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.
2017-11-01
Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasting of temporal evolution of two key risk indices, volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify key flow and transport parameters affecting the risk indices, to elucidate texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the correspondent remediation costs.
Development of probabilistic internal dosimetry computer code
NASA Astrophysics Data System (ADS)
Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki
2017-02-01
Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated in the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs the Bayesian and Monte Carlo methods. Based on the developed system, we developed a probabilistic internal-dose-assessment code by using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values ( e.g. the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of severe internal exposure, the causation probability of a deterministic health effect can be derived from the dose distribution, and a high statistical value ( e.g., the 95th percentile of the distribution) can be used to determine the appropriate intervention. The distribution-based sensitivity analysis can also be used to quantify the contribution of each factor to the dose uncertainty, which is essential information for reducing and optimizing the uncertainty in the internal dose assessment. Therefore, the present study can contribute to retrospective dose assessment for accidental internal exposure scenarios, as well as to internal dose monitoring optimization and uncertainty reduction.
Some suggested future directions of quantitative resource assessments
Singer, D.A.
2001-01-01
Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors because the dominance of tonnage-deposit models are the best known predictor of tonnage. Much of the surface is covered with apparently barren rocks and sediments in many large regions. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can be considered training tracts. Cover has a profound effect on uncertainty and on methods and procedures of assessments because geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed on the surface-these will need to be relearned based on covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral occurrences, geophysics, and geochemistry used in resource assessments and mineral exploration. Grade and tonnage models and development of quantitative descriptive, economic, and deposit density models will help reduce the uncertainty of these new assessments.
Parameter sensitivity analysis of a 1-D cold region lake model for land-surface schemes
NASA Astrophysics Data System (ADS)
Guerrero, José-Luis; Pernica, Patricia; Wheater, Howard; Mackay, Murray; Spence, Chris
2017-12-01
Lakes might be sentinels of climate change, but the uncertainty in their main feedback to the atmosphere - heat-exchange fluxes - is often not considered within climate models. Additionally, these fluxes are seldom measured, hindering critical evaluation of model output. Analysis of the Canadian Small Lake Model (CSLM), a one-dimensional integral lake model, was performed to assess its ability to reproduce diurnal and seasonal variations in heat fluxes and the sensitivity of simulated fluxes to changes in model parameters, i.e., turbulent transport parameters and the light extinction coefficient (Kd). A C++ open-source software package, Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), was used to perform sensitivity analysis (SA) and identify the parameters that dominate model behavior. The generalized likelihood uncertainty estimation (GLUE) was applied to quantify the fluxes' uncertainty, comparing daily-averaged eddy-covariance observations to the output of CSLM. Seven qualitative and two quantitative SA methods were tested, and the posterior likelihoods of the modeled parameters, obtained from the GLUE analysis, were used to determine the dominant parameters and the uncertainty in the modeled fluxes. Despite the ubiquity of the equifinality issue - different parameter-value combinations yielding equivalent results - the answer to the question was unequivocal: Kd, a measure of how much light penetrates the lake, dominates sensible and latent heat fluxes, and the uncertainty in their estimates is strongly related to the accuracy with which Kd is determined. This is important since accurate and continuous measurements of Kd could reduce modeling uncertainty.
Communication Behaviors and Trust in Collaborative Online Teams
ERIC Educational Resources Information Center
Bulu, Saniye Tugba; Yildirim, Zahide
2008-01-01
This study investigates preservice teachers' trust levels and collaborative communication behaviors namely leadership, feedback, social interaction, enthusiasm, task and technical uncertainties, and task-oriented interactions in online learning environment. A case study design involving qualitative and quantitative data collection and analysis was…
Remans, Tony; Keunen, Els; Bex, Geert Jan; Smeets, Karen; Vangronsveld, Jaco; Cuypers, Ann
2014-10-01
Reverse transcription-quantitative PCR (RT-qPCR) has been widely adopted to measure differences in mRNA levels; however, biological and technical variation strongly affects the accuracy of the reported differences. RT-qPCR specialists have warned that, unless researchers minimize this variability, they may report inaccurate differences and draw incorrect biological conclusions. The Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) guidelines describe procedures for conducting and reporting RT-qPCR experiments. The MIQE guidelines enable others to judge the reliability of reported results; however, a recent literature survey found low adherence to these guidelines. Additionally, even experiments that use appropriate procedures remain subject to individual variation that statistical methods cannot correct. For example, since ideal reference genes do not exist, the widely used method of normalizing RT-qPCR data to reference genes generates background noise that affects the accuracy of measured changes in mRNA levels. However, current RT-qPCR data reporting styles ignore this source of variation. In this commentary, we direct researchers to appropriate procedures, outline a method to present the remaining uncertainty in data accuracy, and propose an intuitive way to select reference genes to minimize uncertainty. Reporting the uncertainty in data accuracy also serves for quality assessment, enabling researchers and peer reviewers to confidently evaluate the reliability of gene expression data. © 2014 American Society of Plant Biologists. All rights reserved.
Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.
Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S
2016-04-07
Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate if heteroskedasticity had a significant quantitative impact-naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
Wallace, Jack
2010-05-01
While forensic laboratories will soon be required to estimate uncertainties of measurement for those quantitations reported to the end users of the information, the procedures for estimating this have been little discussed in the forensic literature. This article illustrates how proficiency test results provide the basis for estimating uncertainties in three instances: (i) For breath alcohol analyzers the interlaboratory precision is taken as a direct measure of uncertainty. This approach applies when the number of proficiency tests is small. (ii) For blood alcohol, the uncertainty is calculated from the differences between the laboratory's proficiency testing results and the mean quantitations determined by the participants; this approach applies when the laboratory has participated in a large number of tests. (iii) For toxicology, either of these approaches is useful for estimating comparability between laboratories, but not for estimating absolute accuracy. It is seen that data from proficiency tests enable estimates of uncertainty that are empirical, simple, thorough, and applicable to a wide range of concentrations.
Siebert, Uwe; Rochau, Ursula; Claxton, Karl
2013-01-01
Decision analysis (DA) and value-of-information (VOI) analysis provide a systematic, quantitative methodological framework that explicitly considers the uncertainty surrounding the currently available evidence to guide healthcare decisions. In medical decision making under uncertainty, there are two fundamental questions: 1) What decision should be made now given the best available evidence (and its uncertainty)?; 2) Subsequent to the current decision and given the magnitude of the remaining uncertainty, should we gather further evidence (i.e., perform additional studies), and if yes, which studies should be undertaken (e.g., efficacy, side effects, quality of life, costs), and what sample sizes are needed? Using the currently best available evidence, VoI analysis focuses on the likelihood of making a wrong decision if the new intervention is adopted. The value of performing further studies and gathering additional evidence is based on the extent to which the additional information will reduce this uncertainty. A quantitative framework allows for the valuation of the additional information that is generated by further research, and considers the decision maker's objectives and resource constraints. Claxton et al. summarise: "Value of information analysis can be used to inform a range of policy questions including whether a new technology should be approved based on existing evidence, whether it should be approved but additional research conducted or whether approval should be withheld until the additional evidence becomes available." [Claxton K. Value of information entry in Encyclopaedia of Health Economics, Elsevier, forthcoming 2014.] The purpose of this tutorial is to introduce the framework of systematic VoI analysis to guide further research. In our tutorial article, we explain the theoretical foundations and practical methods of decision analysis and value-of-information analysis. To illustrate, we use a simple case example of a foot ulcer (e.g., with diabetes) as well as key references from the literature, including examples for the use of the decision-analytic VoI framework by health technology assessment agencies to guide further research. These concepts may guide stakeholders involved or interested in how to determine whether or not and, if so, which additional evidence is needed to make decisions. Copyright © 2013. Published by Elsevier GmbH.
Bayesian uncertainty quantification in linear models for diffusion MRI.
Sjölund, Jens; Eklund, Anders; Özarslan, Evren; Herberthson, Magnus; Bånkestad, Maria; Knutsson, Hans
2018-03-29
Diffusion MRI (dMRI) is a valuable tool in the assessment of tissue microstructure. By fitting a model to the dMRI signal it is possible to derive various quantitative features. Several of the most popular dMRI signal models are expansions in an appropriately chosen basis, where the coefficients are determined using some variation of least-squares. However, such approaches lack any notion of uncertainty, which could be valuable in e.g. group analyses. In this work, we use a probabilistic interpretation of linear least-squares methods to recast popular dMRI models as Bayesian ones. This makes it possible to quantify the uncertainty of any derived quantity. In particular, for quantities that are affine functions of the coefficients, the posterior distribution can be expressed in closed-form. We simulated measurements from single- and double-tensor models where the correct values of several quantities are known, to validate that the theoretically derived quantiles agree with those observed empirically. We included results from residual bootstrap for comparison and found good agreement. The validation employed several different models: Diffusion Tensor Imaging (DTI), Mean Apparent Propagator MRI (MAP-MRI) and Constrained Spherical Deconvolution (CSD). We also used in vivo data to visualize maps of quantitative features and corresponding uncertainties, and to show how our approach can be used in a group analysis to downweight subjects with high uncertainty. In summary, we convert successful linear models for dMRI signal estimation to probabilistic models, capable of accurate uncertainty quantification. Copyright © 2018 Elsevier Inc. All rights reserved.
[The metrology of uncertainty: a study of vital statistics from Chile and Brazil].
Carvajal, Yuri; Kottow, Miguel
2012-11-01
This paper addresses the issue of uncertainty in the measurements used in public health analysis and decision-making. The Shannon-Wiener entropy measure was adapted to express the uncertainty contained in counting causes of death in official vital statistics from Chile. Based on the findings, the authors conclude that metrological requirements in public health are as important as the measurements themselves. The study also considers and argues for the existence of uncertainty associated with the statistics' performative properties, both by the way the data are structured as a sort of syntax of reality and by exclusion of what remains beyond the quantitative modeling used in each case. Following the legacy of pragmatic thinking and using conceptual tools from the sociology of translation, the authors emphasize that by taking uncertainty into account, public health can contribute to a discussion on the relationship between technology, democracy, and formation of a participatory public.
Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy
NASA Astrophysics Data System (ADS)
Sugiyama, Naruhisa; Shirakawa, Tomohiro
2017-07-01
The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.
Experimental uncertainty and drag measurements in the national transonic facility
NASA Technical Reports Server (NTRS)
Batill, Stephen M.
1994-01-01
This report documents the results of a study which was conducted in order to establish a framework for the quantitative description of the uncertainty in measurements conducted in the National Transonic Facility (NTF). The importance of uncertainty analysis in both experiment planning and reporting results has grown significantly in the past few years. Various methodologies have been proposed and the engineering community appears to be 'converging' on certain accepted practices. The practical application of these methods to the complex wind tunnel testing environment at the NASA Langley Research Center was based upon terminology and methods established in the American National Standards Institute (ANSI) and the American Society of Mechanical Engineers (ASME) standards. The report overviews this methodology.
Uncertainty in the use of MAMA software to measure particle morphological parameters from SEM images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, Daniel S.; Tandon, Lav
The MAMA software package developed at LANL is designed to make morphological measurements on a wide variety of digital images of objects. At LANL, we have focused on using MAMA to measure scanning electron microscope (SEM) images of particles, as this is a critical part of our forensic analysis of interdicted radiologic materials. In order to successfully use MAMA to make such measurements, we must understand the level of uncertainty involved in the process, so that we can rigorously support our quantitative conclusions.
Using the Nobel Laureates in Economics to Teach Quantitative Methods
ERIC Educational Resources Information Center
Becker, William E.; Greene, William H.
2005-01-01
The authors show how the work of Nobel Laureates in economics can enhance student understanding and bring them up to date on topics such as probability, uncertainty and decision theory, hypothesis testing, regression to the mean, instrumental variable techniques, discrete choice modeling, and time-series analysis. (Contains 2 notes.)
Developing and Assessing E-Learning Techniques for Teaching Forecasting
ERIC Educational Resources Information Center
Gel, Yulia R.; O'Hara Hines, R. Jeanette; Chen, He; Noguchi, Kimihiro; Schoner, Vivian
2014-01-01
In the modern business environment, managers are increasingly required to perform decision making and evaluate related risks based on quantitative information in the face of uncertainty, which in turn increases demand for business professionals with sound skills and hands-on experience with statistical data analysis. Computer-based training…
Motion compensation using origin ensembles in awake small animal positron emission tomography
NASA Astrophysics Data System (ADS)
Gillam, John E.; Angelis, Georgios I.; Kyme, Andre Z.; Meikle, Steven R.
2017-02-01
In emission tomographic imaging, the stochastic origin ensembles algorithm provides unique information regarding the detected counts given the measured data. Precision in both voxel and region-wise parameters may be determined for a single data set based on the posterior distribution of the count density allowing uncertainty estimates to be allocated to quantitative measures. Uncertainty estimates are of particular importance in awake animal neurological and behavioral studies for which head motion, unique for each acquired data set, perturbs the measured data. Motion compensation can be conducted when rigid head pose is measured during the scan. However, errors in pose measurements used for compensation can degrade the data and hence quantitative outcomes. In this investigation motion compensation and detector resolution models were incorporated into the basic origin ensembles algorithm and an efficient approach to computation was developed. The approach was validated against maximum liklihood—expectation maximisation and tested using simulated data. The resultant algorithm was then used to analyse quantitative uncertainty in regional activity estimates arising from changes in pose measurement precision. Finally, the posterior covariance acquired from a single data set was used to describe correlations between regions of interest providing information about pose measurement precision that may be useful in system analysis and design. The investigation demonstrates the use of origin ensembles as a powerful framework for evaluating statistical uncertainty of voxel and regional estimates. While in this investigation rigid motion was considered in the context of awake animal PET, the extension to arbitrary motion may provide clinical utility where respiratory or cardiac motion perturb the measured data.
Hattis, Dale; Goble, Robert; Chu, Margaret
2005-01-01
In an earlier report we developed a quantitative likelihood-based analysis of the differences in sensitivity of rodents to mutagenic carcinogens across three life stages (fetal, birth to weaning, and weaning to 60 days) relative to exposures in adult life. Here we draw implications for assessing human risks for full lifetime exposures, taking into account three types of uncertainties in making projections from the rodent data: uncertainty in the central estimates of the life-stage–specific sensitivity factors estimated earlier, uncertainty from chemical-to-chemical differences in life-stage–specific sensitivities for carcinogenesis, and uncertainty in the mapping of rodent life stages to human ages/exposure periods. Among the uncertainties analyzed, the mapping of rodent life stages to human ages/exposure periods is most important quantitatively (a range of several-fold in estimates of the duration of the human equivalent of the highest sensitivity “birth to weaning” period in rodents). The combined effects of these uncertainties are estimated with Monte Carlo analyses. Overall, the estimated population arithmetic mean risk from lifetime exposures at a constant milligrams per kilogram body weight level to a generic mutagenic carcinogen is about 2.8-fold larger than expected from adult-only exposure with 5–95% confidence limits of 1.5-to 6-fold. The mean estimates for the 0- to 2-year and 2- to 15-year periods are about 35–55% larger than the 10- and 3-fold sensitivity factor adjustments recently proposed by the U.S. Environmental Protection Agency. The present results are based on data for only nine chemicals, including five mutagens. Risk inferences will be altered as data become available for other chemicals. PMID:15811844
NASA Astrophysics Data System (ADS)
Wilbert, Stefan; Kleindiek, Stefan; Nouri, Bijan; Geuder, Norbert; Habte, Aron; Schwandt, Marko; Vignola, Frank
2016-05-01
Concentrating solar power projects require accurate direct normal irradiance (DNI) data including uncertainty specifications for plant layout and cost calculations. Ground measured data are necessary to obtain the required level of accuracy and are often obtained with Rotating Shadowband Irradiometers (RSI) that use photodiode pyranometers and correction functions to account for systematic effects. The uncertainty of Si-pyranometers has been investigated, but so far basically empirical studies were published or decisive uncertainty influences had to be estimated based on experience in analytical studies. One of the most crucial estimated influences is the spectral irradiance error because Si-photodiode-pyranometers only detect visible and color infrared radiation and have a spectral response that varies strongly within this wavelength interval. Furthermore, analytic studies did not discuss the role of correction functions and the uncertainty introduced by imperfect shading. In order to further improve the bankability of RSI and Si-pyranometer data, a detailed uncertainty analysis following the Guide to the Expression of Uncertainty in Measurement (GUM) has been carried out. The study defines a method for the derivation of the spectral error and spectral uncertainties and presents quantitative values of the spectral and overall uncertainties. Data from the PSA station in southern Spain was selected for the analysis. Average standard uncertainties for corrected 10 min data of 2 % for global horizontal irradiance (GHI), and 2.9 % for DNI (for GHI and DNI over 300 W/m²) were found for the 2012 yearly dataset when separate GHI and DHI calibration constants were used. Also the uncertainty in 1 min resolution was analyzed. The effect of correction functions is significant. The uncertainties found in this study are consistent with results of previous empirical studies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilbert, Stefan; Kleindiek, Stefan; Nouri, Bijan
2016-05-31
Concentrating solar power projects require accurate direct normal irradiance (DNI) data including uncertainty specifications for plant layout and cost calculations. Ground measured data are necessary to obtain the required level of accuracy and are often obtained with Rotating Shadowband Irradiometers (RSI) that use photodiode pyranometers and correction functions to account for systematic effects. The uncertainty of Si-pyranometers has been investigated, but so far basically empirical studies were published or decisive uncertainty influences had to be estimated based on experience in analytical studies. One of the most crucial estimated influences is the spectral irradiance error because Si-photodiode-pyranometers only detect visible andmore » color infrared radiation and have a spectral response that varies strongly within this wavelength interval. Furthermore, analytic studies did not discuss the role of correction functions and the uncertainty introduced by imperfect shading. In order to further improve the bankability of RSI and Si-pyranometer data, a detailed uncertainty analysis following the Guide to the Expression of Uncertainty in Measurement (GUM) has been carried out. The study defines a method for the derivation of the spectral error and spectral uncertainties and presents quantitative values of the spectral and overall uncertainties. Data from the PSA station in southern Spain was selected for the analysis. Average standard uncertainties for corrected 10 min data of 2% for global horizontal irradiance (GHI), and 2.9% for DNI (for GHI and DNI over 300 W/m2) were found for the 2012 yearly dataset when separate GHI and DHI calibration constants were used. Also the uncertainty in 1 min resolution was analyzed. The effect of correction functions is significant. The uncertainties found in this study are consistent with results of previous empirical studies.« less
Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H
2007-09-07
Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its superseded document, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach in establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in a "Bottom-Up" approach adopted by us are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at 99% one-tailed confidence level by multiplying the standard uncertainty with an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, will be presented.
Fault and event tree analyses for process systems risk analysis: uncertainty handling formulations.
Ferdous, Refaul; Khan, Faisal; Sadiq, Rehan; Amyotte, Paul; Veitch, Brian
2011-01-01
Quantitative risk analysis (QRA) is a systematic approach for evaluating likelihood, consequences, and risk of adverse events. QRA based on event (ETA) and fault tree analyses (FTA) employs two basic assumptions. The first assumption is related to likelihood values of input events, and the second assumption is regarding interdependence among the events (for ETA) or basic events (for FTA). Traditionally, FTA and ETA both use crisp probabilities; however, to deal with uncertainties, the probability distributions of input event likelihoods are assumed. These probability distributions are often hard to come by and even if available, they are subject to incompleteness (partial ignorance) and imprecision. Furthermore, both FTA and ETA assume that events (or basic events) are independent. In practice, these two assumptions are often unrealistic. This article focuses on handling uncertainty in a QRA framework of a process system. Fuzzy set theory and evidence theory are used to describe the uncertainties in the input event likelihoods. A method based on a dependency coefficient is used to express interdependencies of events (or basic events) in ETA and FTA. To demonstrate the approach, two case studies are discussed. © 2010 Society for Risk Analysis.
Uncertainties in Air Exchange using Continuous-Injection, Long-Term Sampling Tracer-Gas Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sherman, Max H.; Walker, Iain S.; Lunden, Melissa M.
2013-12-01
The PerFluorocarbon Tracer (PFT) method is a low-cost approach commonly used for measuring air exchange in buildings using tracer gases. It is a specific application of the more general Continuous-Injection, Long-Term Sampling (CILTS) method. The technique is widely used but there has been little work on understanding the uncertainties (both precision and bias) associated with its use, particularly given that it is typically deployed by untrained or lightly trained people to minimize experimental costs. In this article we will conduct a first-principles error analysis to estimate the uncertainties and then compare that analysis to CILTS measurements that were over-sampled, throughmore » the use of multiple tracers and emitter and sampler distribution patterns, in three houses. We find that the CILTS method can have an overall uncertainty of 10-15percent in ideal circumstances, but that even in highly controlled field experiments done by trained experimenters expected uncertainties are about 20percent. In addition, there are many field conditions (such as open windows) where CILTS is not likely to provide any quantitative data. Even avoiding the worst situations of assumption violations CILTS should be considered as having a something like a ?factor of two? uncertainty for the broad field trials that it is typically used in. We provide guidance on how to deploy CILTS and design the experiment to minimize uncertainties.« less
Peng, Jiangtao; Peng, Silong; Xie, Qiong; Wei, Jiping
2011-04-01
In order to eliminate the lower order polynomial interferences, a new quantitative calibration algorithm "Baseline Correction Combined Partial Least Squares (BCC-PLS)", which combines baseline correction and conventional PLS, is proposed. By embedding baseline correction constraints into PLS weights selection, the proposed calibration algorithm overcomes the uncertainty in baseline correction and can meet the requirement of on-line attenuated total reflectance Fourier transform infrared (ATR-FTIR) quantitative analysis. The effectiveness of the algorithm is evaluated by the analysis of glucose and marzipan ATR-FTIR spectra. BCC-PLS algorithm shows improved prediction performance over PLS. The root mean square error of cross-validation (RMSECV) on marzipan spectra for the prediction of the moisture is found to be 0.53%, w/w (range 7-19%). The sugar content is predicted with a RMSECV of 2.04%, w/w (range 33-68%). Copyright © 2011 Elsevier B.V. All rights reserved.
Tennant, David; Bánáti, Diána; Kennedy, Marc; König, Jürgen; O'Mahony, Cian; Kettler, Susanne
2017-11-01
A previous publication described methods for assessing and reporting uncertainty in dietary exposure assessments. This follow-up publication uses a case study to develop proposals for representing and communicating uncertainty to risk managers. The food ingredient aspartame is used as the case study in a simple deterministic model (the EFSA FAIM template) and with more sophisticated probabilistic exposure assessment software (FACET). Parameter and model uncertainties are identified for each modelling approach and tabulated. The relative importance of each source of uncertainty is then evaluated using a semi-quantitative scale and the results expressed using two different forms of graphical summary. The value of this approach in expressing uncertainties in a manner that is relevant to the exposure assessment and useful to risk managers is then discussed. It was observed that the majority of uncertainties are often associated with data sources rather than the model itself. However, differences in modelling methods can have the greatest impact on uncertainties overall, particularly when the underlying data are the same. It was concluded that improved methods for communicating uncertainties for risk management is the research area where the greatest amount of effort is suggested to be placed in future. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.
Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M
2014-01-01
Quantitative health impact assessment (HIA) is increasingly being used to assess the health impacts attributable to an environmental policy or intervention. As a consequence, there is a need to assess uncertainties in the assessments because of the uncertainty in the HIA models. In this paper, a framework is developed to quantify the uncertainty in the health impacts of environmental interventions and is applied to evaluate the impacts of poor housing ventilation. The paper describes the development of the framework through three steps: (i) selecting the relevant exposure metric and quantifying the evidence of potential health effects of the exposure; (ii) estimating the size of the population affected by the exposure and selecting the associated outcome measure; (iii) quantifying the health impact and its uncertainty. The framework introduces a novel application for the propagation of uncertainty in HIA, based on fuzzy set theory. Fuzzy sets are used to propagate parametric uncertainty in a non-probabilistic space and are applied to calculate the uncertainty in the morbidity burdens associated with three indoor ventilation exposure scenarios: poor, fair and adequate. The case-study example demonstrates how the framework can be used in practice, to quantify the uncertainty in health impact assessment where there is insufficient information to carry out a probabilistic uncertainty analysis. © 2013.
Visualizing Uncertainty of Point Phenomena by Redesigned Error Ellipses
NASA Astrophysics Data System (ADS)
Murphy, Christian E.
2018-05-01
Visualizing uncertainty remains one of the great challenges in modern cartography. There is no overarching strategy to display the nature of uncertainty, as an effective and efficient visualization depends, besides on the spatial data feature type, heavily on the type of uncertainty. This work presents a design strategy to visualize uncertainty con-nected to point features. The error ellipse, well-known from mathematical statistics, is adapted to display the uncer-tainty of point information originating from spatial generalization. Modified designs of the error ellipse show the po-tential of quantitative and qualitative symbolization and simultaneous point based uncertainty symbolization. The user can intuitively depict the centers of gravity, the major orientation of the point arrays as well as estimate the ex-tents and possible spatial distributions of multiple point phenomena. The error ellipse represents uncertainty in an intuitive way, particularly suitable for laymen. Furthermore it is shown how applicable an adapted design of the er-ror ellipse is to display the uncertainty of point features originating from incomplete data. The suitability of the error ellipse to display the uncertainty of point information is demonstrated within two showcases: (1) the analysis of formations of association football players, and (2) uncertain positioning of events on maps for the media.
NASA Astrophysics Data System (ADS)
Chou, Shuo-Ju
2011-12-01
In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenario involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise" or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies and the noise variables are tactics and scenarios that will be associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause to variations in system effectiveness and program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment process currently used by acquisition program managers and decision-makers to measure technology uncertainty during critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe program technology elements uncertainties can only provide a qualitative and non-descript estimation of the technology uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should provide acquisition program managers and decision-makers with the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and subsequent impacts on capability, budget, and schedule requirements resulted in the conclusion that an analysis process that coupled a probabilistic analysis technique such as Monte Carlo Simulations with quantitative and parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints of these requirements to be established. These probabilities of requirements success metrics can then be used as a quantitative and probabilistic measure of program requirements robustness against technology uncertainties. Combined with a Multi-Objective Genetic Algorithm optimization process and computer-based Decision Support System, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition as well as formulation of program development and risk management strategies. To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques to provide a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. In order to demonstrate the capabilities of the ENTERPRISE method and test the research Hypotheses, an demonstration application of this method was performed on a notional program for acquiring the Carrier-based Suppression of Enemy Air Defenses (SEAD) using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of this methodology as well as its limitations that should be addressed in the future to narrow the gap between current state and the desired state.
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques allows to determine the content and distribution of trace elements on the surface of the examined sample. In order to obtain reliable results, the developed procedure should be based not only on the properly prepared sample and performed calibration. It is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results and the estimation of the uncertainty. This review paper discusses aspects related to sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles for chemical measurement with emphasis to the LA-ICP-MS which is the comparative method that requires studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
A subagging regression method for estimating the qualitative and quantitative state of groundwater
NASA Astrophysics Data System (ADS)
Jeong, Jina; Park, Eungyu; Han, Weon Shik; Kim, Kue-Young
2017-08-01
A subsample aggregating (subagging) regression (SBR) method for the analysis of groundwater data pertaining to trend-estimation-associated uncertainty is proposed. The SBR method is validated against synthetic data competitively with other conventional robust and non-robust methods. From the results, it is verified that the estimation accuracies of the SBR method are consistent and superior to those of other methods, and the uncertainties are reasonably estimated; the others have no uncertainty analysis option. To validate further, actual groundwater data are employed and analyzed comparatively with Gaussian process regression (GPR). For all cases, the trend and the associated uncertainties are reasonably estimated by both SBR and GPR regardless of Gaussian or non-Gaussian skewed data. However, it is expected that GPR has a limitation in applications to severely corrupted data by outliers owing to its non-robustness. From the implementations, it is determined that the SBR method has the potential to be further developed as an effective tool of anomaly detection or outlier identification in groundwater state data such as the groundwater level and contaminant concentration.
REVIEW OF DRAFT REVISED BLUE BOOK ON ESTIMATING CANCER RISKS FROM EXPOSURE TO IONIZING RADIATION
In 1994, EPA published a report, referred to as the “Blue Book,” which lays out EPA’s current methodology for quantitatively estimating radiogenic cancer risks. A follow-on report made minor adjustments to the previous estimates and presented a partial analysis of the uncertainti...
An earlier paper (Hattis et al., 2003) developed a quantitative likelihood-based statistical analysis of the differences in apparent sensitivity of rodents to mutagenic carcinogens across three life stages (fetal, birth-weaning, and weaning-60 days) relative to exposures in adult...
Quantitative body DW-MRI biomarkers uncertainty estimation using unscented wild-bootstrap.
Freiman, M; Voss, S D; Mulkern, R V; Perez-Rossello, J M; Warfield, S K
2011-01-01
We present a new method for the uncertainty estimation of diffusion parameters for quantitative body DW-MRI assessment. Diffusion parameters uncertainty estimation from DW-MRI is necessary for clinical applications that use these parameters to assess pathology. However, uncertainty estimation using traditional techniques requires repeated acquisitions, which is undesirable in routine clinical use. Model-based bootstrap techniques, for example, assume an underlying linear model for residuals rescaling and cannot be utilized directly for body diffusion parameters uncertainty estimation due to the non-linearity of the body diffusion model. To offset this limitation, our method uses the Unscented transform to compute the residuals rescaling parameters from the non-linear body diffusion model, and then applies the wild-bootstrap method to infer the body diffusion parameters uncertainty. Validation through phantom and human subject experiments shows that our method identify the regions with higher uncertainty in body DWI-MRI model parameters correctly with realtive error of -36% in the uncertainty values.
Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M
2017-05-01
Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
Meier, D.C.; Benkstein, K.D.; Hurst, W.S.; Chu, P.M.
2016-01-01
Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, −5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals. PMID:28090126
NASA Astrophysics Data System (ADS)
Li, Y.; Kinzelbach, W.; Zhou, J.; Cheng, G. D.; Li, X.
2012-05-01
The hydrologic model HYDRUS-1-D and the crop growth model WOFOST are coupled to efficiently manage water resources in agriculture and improve the prediction of crop production. The results of the coupled model are validated by experimental studies of irrigated-maize done in the middle reaches of northwest China's Heihe River, a semi-arid to arid region. Good agreement is achieved between the simulated evapotranspiration, soil moisture and crop production and their respective field measurements made under current maize irrigation and fertilization. Based on the calibrated model, the scenario analysis reveals that the most optimal amount of irrigation is 500-600 mm in this region. However, for regions without detailed observation, the results of the numerical simulation can be unreliable for irrigation decision making owing to the shortage of calibrated model boundary conditions and parameters. So, we develop a method of combining model ensemble simulations and uncertainty/sensitivity analysis to speculate the probability of crop production. In our studies, the uncertainty analysis is used to reveal the risk of facing a loss of crop production as irrigation decreases. The global sensitivity analysis is used to test the coupled model and further quantitatively analyse the impact of the uncertainty of coupled model parameters and environmental scenarios on crop production. This method can be used for estimation in regions with no or reduced data availability.
Uncertainty and sensitivity analysis of fission gas behavior in engineering-scale fuel modeling
Pastore, Giovanni; Swiler, L. P.; Hales, Jason D.; ...
2014-10-12
The role of uncertainties in fission gas behavior calculations as part of engineering-scale nuclear fuel modeling is investigated using the BISON fuel performance code and a recently implemented physics-based model for the coupled fission gas release and swelling. Through the integration of BISON with the DAKOTA software, a sensitivity analysis of the results to selected model parameters is carried out based on UO2 single-pellet simulations covering different power regimes. The parameters are varied within ranges representative of the relative uncertainties and consistent with the information from the open literature. The study leads to an initial quantitative assessment of the uncertaintymore » in fission gas behavior modeling with the parameter characterization presently available. Also, the relative importance of the single parameters is evaluated. Moreover, a sensitivity analysis is carried out based on simulations of a fuel rod irradiation experiment, pointing out a significant impact of the considered uncertainties on the calculated fission gas release and cladding diametral strain. The results of the study indicate that the commonly accepted deviation between calculated and measured fission gas release by a factor of 2 approximately corresponds to the inherent modeling uncertainty at high fission gas release. Nevertheless, higher deviations may be expected for values around 10% and lower. Implications are discussed in terms of directions of research for the improved modeling of fission gas behavior for engineering purposes.« less
NASA Astrophysics Data System (ADS)
Ginn, T. R.; Scheibe, T. D.
2006-12-01
Hydrogeology is among the most data-limited of the earth sciences, so that uncertainty arises in every aspect of subsurface flow and transport modeling, from conceptual model to spatial discretization to parameter values. Thus treatment of uncertainty is unavoidable, and the literature and conference proceedings are replete with approaches, templates, paradigms and such for doing so. However, such tools remain not well used, especially those of the stochastic analytic sort, leading recently to explicit inquiries about why this is the case, in response to which entire journal issues have been dedicated. In an effort to continue this discussion in a constructive way we report on an informal yet extensive survey of hydrogeology practitioners, as the "marketplace" for techniques to deal with uncertainty. We include scientists, engineers, regulators, and others in the survey, that reports on quantitative (or not) methods for uncertainty characterization and analysis, frequency and level of usage, and reasons behind the selection or avoidance of available methods. Results shed light on fruitful directions for future research in uncertainty quantification in hydrogeology.
QuASAR: quantitative allele-specific analysis of reads
Harvey, Chris T.; Moyerbrailean, Gregory A.; Davis, Gordon O.; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger
2015-01-01
Motivation: Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. Results: We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. Availability and implementation: http://github.com/piquelab/QuASAR. Contact: fluca@wayne.edu or rpique@wayne.edu Supplementary information: Supplementary Material is available at Bioinformatics online. PMID:25480375
Rastogi, L.; Dash, K.; Arunachalam, J.
2013-01-01
The quantitative analysis of glutathione (GSH) is important in different fields like medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometrically existing sulfur content in purified GSH offers an approach for its quantitation and calibration through an appropriate characterized reference material (CRM) for sulfur would provide a methodology for the certification of GSH quantity, that is traceable to SI (International system of units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the “high performance” methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U) expressed at 95% confidence interval for ICP-OES analyses varied from 0.1% to 0.3%, while in the case of IC, they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH, than for routine measurements. PMID:29403814
Traceable measurements of the electrical parameters of solid-state lighting products
NASA Astrophysics Data System (ADS)
Zhao, D.; Rietveld, G.; Braun, J.-P.; Overney, F.; Lippert, T.; Christensen, A.
2016-12-01
In order to perform traceable measurements of the electrical parameters of solid-state lighting (SSL) products, it is necessary to technically adequately define the measurement procedures and to identify the relevant uncertainty sources. The present published written standard for SSL products specifies test conditions, but it lacks an explanation of how adequate these test conditions are. More specifically, both an identification of uncertainty sources and a quantitative uncertainty analysis are absent. This paper fills the related gap in the present written standard. New uncertainty sources with respect to conventional lighting sources are determined and their effects are quantified. It shows that for power measurements, the main uncertainty sources are temperature deviation, power supply voltage distortion, and instability of the SSL product. For current RMS measurements, the influence of bandwidth, shunt resistor, power supply source impedance and ac frequency flatness are significant as well. The measurement uncertainty depends not only on the test equipment but is also a function of the characteristics of the device under test (DUT), for example, current harmonics spectrum and input impedance. Therefore, an online calculation tool is provided to help non-electrical experts. Following our procedures, unrealistic uncertainty estimations, unnecessary procedures and expensive equipment can be prevented.
Koornneef, Joris; Spruijt, Mark; Molag, Menso; Ramírez, Andrea; Turkenburg, Wim; Faaij, André
2010-05-15
A systematic assessment, based on an extensive literature review, of the impact of gaps and uncertainties on the results of quantitative risk assessments (QRAs) for CO(2) pipelines is presented. Sources of uncertainties that have been assessed are: failure rates, pipeline pressure, temperature, section length, diameter, orifice size, type and direction of release, meteorological conditions, jet diameter, vapour mass fraction in the release and the dose-effect relationship for CO(2). A sensitivity analysis with these parameters is performed using release, dispersion and impact models. The results show that the knowledge gaps and uncertainties have a large effect on the accuracy of the assessed risks of CO(2) pipelines. In this study it is found that the individual risk contour can vary between 0 and 204 m from the pipeline depending on assumptions made. In existing studies this range is found to be between <1m and 7.2 km. Mitigating the relevant risks is part of current practice, making them controllable. It is concluded that QRA for CO(2) pipelines can be improved by validation of release and dispersion models for high-pressure CO(2) releases, definition and adoption of a universal dose-effect relationship and development of a good practice guide for QRAs for CO(2) pipelines. Copyright (c) 2009 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Liu, P.
2013-12-01
Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurate description of inflow uncertainties. The ensemble-based hydrologic forecasts directly depict the inflows not only the marginal distributions but also their persistence via scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasts as inputs. A method is developed by using the forecast horizon point to divide the future time into two stages, the forecast lead-time and the unpredicted time. The risk within the forecast lead-time is computed based on counting the failure number of forecast scenarios, and the risk in the unpredicted time is estimated using reservoir routing with the design floods and the reservoir water levels of forecast horizon point. As a result, a two-stage risk analysis method is set up to quantify the entire flood risks by defining the ratio of the number of scenarios that excessive the critical value to the total number of scenarios. The China's Three Gorges Reservoir (TGR) is selected as a case study, where the parameter and precipitation uncertainties are implemented to produce ensemble-based hydrologic forecasts. The Bayesian inference, Markov Chain Monte Carlo, is used to account for the parameter uncertainty. Two reservoir operation schemes, the real operated and scenario optimization, are evaluated for the flood risks and hydropower profits analysis. With the 2010 flood, it is found that the improvement of the hydrologic forecast accuracy is unnecessary to decrease the reservoir real-time operation risk, and most risks are from the forecast lead-time. It is therefore valuable to decrease the avarice of ensemble-based hydrologic forecasts with less bias for a reservoir operational purpose.
Performance Analysis of Transposition Models Simulating Solar Radiation on Inclined Surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit
2016-06-02
Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic panels. Following numerous studies comparing the performance of transposition models, this work aims to understand the quantitative uncertainty in state-of-the-art transposition models and the sources leading to the uncertainty. Our results show significant differences between two highly used isotropic transposition models, with one substantially underestimating the diffuse plane-of-array irradiances when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of the empirical coefficients and land surface albedo can both result in uncertainty in the output. This study can bemore » used as a guide for the future development of physics-based transposition models and evaluations of system performance.« less
Milton, Martin J T; Wang, Jian
2003-01-01
A new isotope dilution mass spectrometry (IDMS) method for high-accuracy quantitative analysis of gases has been developed and validated by the analysis of standard mixtures of carbon dioxide in nitrogen. The method does not require certified isotopic reference materials and does not require direct measurements of the highly enriched spike. The relative uncertainty of the method is shown to be 0.2%. Reproduced with the permission of Her Majesty's Stationery Office. Copyright Crown copyright 2003.
Quantitative assessment of human exposures and health effects due to air pollution involve detailed characterization of impacts of air quality on exposure and dose. A key challenge is to integrate these three components on a consistent spatial and temporal basis taking into acco...
Quantifying and managing uncertainty in operational modal analysis
NASA Astrophysics Data System (ADS)
Au, Siu-Kui; Brownjohn, James M. W.; Mottershead, John E.
2018-03-01
Operational modal analysis aims at identifying the modal properties (natural frequency, damping, etc.) of a structure using only the (output) vibration response measured under ambient conditions. Highly economical and feasible, it is becoming a common practice in full-scale vibration testing. In the absence of (input) loading information, however, the modal properties have significantly higher uncertainty than their counterparts identified from free or forced vibration (known input) tests. Mastering the relationship between identification uncertainty and test configuration is of great interest to both scientists and engineers, e.g., for achievable precision limits and test planning/budgeting. Addressing this challenge beyond the current state-of-the-art that are mostly concerned with identification algorithms, this work obtains closed form analytical expressions for the identification uncertainty (variance) of modal parameters that fundamentally explains the effect of test configuration. Collectively referred as 'uncertainty laws', these expressions are asymptotically correct for well-separated modes, small damping and long data; and are applicable under non-asymptotic situations. They provide a scientific basis for planning and standardization of ambient vibration tests, where factors such as channel noise, sensor number and location can be quantitatively accounted for. The work is reported comprehensively with verification through synthetic and experimental data (laboratory and field), scientific implications and practical guidelines for planning ambient vibration tests.
Quantification of Emission Factor Uncertainty
Emissions factors are important for estimating and characterizing emissions from sources of air pollution. There is no quantitative indication of uncertainty for these emission factors, most factors do not have an adequate data set to compute uncertainty, and it is very difficult...
The Second SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-2)
NASA Technical Reports Server (NTRS)
2005-01-01
Eight international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a variety of laboratory standards. The field samples were collected primarily from eutrophic waters, although mesotrophic waters were also sampled to create a dynamic range in chlorophyll concentration spanning approximately two orders of magnitude (0.3 25.8 mg m-3). The intercomparisons were used to establish the following: a) the uncertainties in quantitating individual pigments and higher-order variables (sums, ratios, and indices); b) an evaluation of spectrophotometric versus HPLC uncertainties in the determination of total chlorophyll a; and c) the reduction in uncertainties as a result of applying quality assurance (QA) procedures associated with extraction, separation, injection, degradation, detection, calibration, and reporting (particularly limits of detection and quantitation). In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied. The culmination of the activity was a validation of the round-robin methodology plus the development of the requirements for validating an individual HPLC method. The validation process includes the measurements required to initially demonstrate a pigment is validated, and the measurements that must be made during sample analysis to confirm a method remains validated. The so-called performance-based metrics developed here describe a set of thresholds for a variety of easily-measured parameters with a corresponding set of performance categories. The aggregate set of performance parameters and categories establish a) the overall performance capability of the method, and b) whether or not the capability is consistent with the required accuracy objectives.
Assessing model uncertainty using hexavalent chromium and ...
Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective of this analysis is to characterize model uncertainty by evaluating the variance in estimates across several epidemiologic analyses.Methods: This analysis compared 7 publications analyzing two different chromate production sites in Ohio and Maryland. The Ohio cohort consisted of 482 workers employed from 1940-72, while the Maryland site employed 2,357 workers from 1950-74. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag, however other latency periods and model covariates such as age and race were considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability in estimates across and within model forms. A total of 7 similarly parameterized analyses were considered across model forms, and 23 analyses with alternative parameterizations were considered within model form (14 Cox; 9 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients for 7 similar analyses ranged from 2.47
Welch, Lisa C; Lutfey, Karen E; Gerstenberger, Eric; Grace, Matthew
2012-09-01
Nonmedical factors and diagnostic certainty contribute to variation in clinical decision making, but the process by which this occurs remains unclear. We examine how physicians' interpretations of patient sex-gender affect diagnostic certainty and, in turn, decision making for coronary heart disease. Data are from a factorial experiment of 256 physicians who viewed 1 of 16 video vignettes with different patient-actors presenting the same symptoms of coronary heart disease. Physician participants completed a structured interview and provided a narrative about their decision-making processes. Quantitative analysis showed that diagnostic uncertainty reduces the likelihood that physicians will order tests and medications appropriate for an urgent cardiac condition in particular. Qualitative analysis revealed that a subset of physicians applied knowledge that women have "atypical symptoms" as a generalization, which engendered uncertainty for some. Findings are discussed in relation to social-psychological processes that underlie clinical decision making and the social framing of medical knowledge.
Welch, Lisa C.; Lutfey, Karen E.; Gerstenberger, Eric; Grace, Matthew
2013-01-01
Nonmedical factors and diagnostic certainty contribute to variation in clinical decision making, but the process by which this occurs remains unclear. We examine how physicians’ interpretations of patient sex/gender affect diagnostic certainty and, in turn, decision making for coronary heart disease (CHD). Data are from a factorial experiment of 256 physicians who viewed one of 16 video vignettes with different patient-actors presenting the same CHD symptoms. Physician participants completed a structured interview and provided a narrative about their decision-making processes. Quantitative analysis showed that diagnostic uncertainty reduces the likelihood that physicians will order tests and medications appropriate for an urgent cardiac condition in particular. Qualitative analysis revealed that a subset of physicians applied knowledge that women have “atypical symptoms” as a generalization, which engendered uncertainty for some. Findings are discussed in relation to social-psychological processes that underlie clinical decision making and the social framing of medical knowledge. PMID:22933590
Assessment of uncertainties of the models used in thermal-hydraulic computer codes
NASA Astrophysics Data System (ADS)
Gricay, A. S.; Migrov, Yu. A.
2015-09-01
The article deals with matters concerned with the problem of determining the statistical characteristics of variable parameters (the variation range and distribution law) in analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular, in the closing correlations of the loop thermal hydraulics block, is shown. Such a method shall feature the minimal degree of subjectivism and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law in the above-mentioned range provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated taking as an example the problem of estimating the uncertainty of a parameter appearing in the model describing transition to post-burnout heat transfer that is used in the thermal-hydraulic computer code KORSAR. The performed study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law in the above-mentioned range by the Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes. In some cases, application of the method can make it possible to achieve a smaller degree of conservatism in the expert estimates of uncertainties pertinent to the model parameters used in computer codes.
Dynamic Modelling under Uncertainty: The Case of Trypanosoma brucei Energy Metabolism
Achcar, Fiona; Kerkhoven, Eduard J.; Bakker, Barbara M.; Barrett, Michael P.; Breitling, Rainer
2012-01-01
Kinetic models of metabolism require detailed knowledge of kinetic parameters. However, due to measurement errors or lack of data this knowledge is often uncertain. The model of glycolysis in the parasitic protozoan Trypanosoma brucei is a particularly well analysed example of a quantitative metabolic model, but so far it has been studied with a fixed set of parameters only. Here we evaluate the effect of parameter uncertainty. In order to define probability distributions for each parameter, information about the experimental sources and confidence intervals for all parameters were collected. We created a wiki-based website dedicated to the detailed documentation of this information: the SilicoTryp wiki (http://silicotryp.ibls.gla.ac.uk/wiki/Glycolysis). Using information collected in the wiki, we then assigned probability distributions to all parameters of the model. This allowed us to sample sets of alternative models, accurately representing our degree of uncertainty. Some properties of the model, such as the repartition of the glycolytic flux between the glycerol and pyruvate producing branches, are robust to these uncertainties. However, our analysis also allowed us to identify fragilities of the model leading to the accumulation of 3-phosphoglycerate and/or pyruvate. The analysis of the control coefficients revealed the importance of taking into account the uncertainties about the parameters, as the ranking of the reactions can be greatly affected. This work will now form the basis for a comprehensive Bayesian analysis and extension of the model considering alternative topologies. PMID:22379410
Radomyski, Artur; Giubilato, Elisa; Ciffroy, Philippe; Critto, Andrea; Brochot, Céline; Marcomini, Antonio
2016-11-01
The study is focused on applying uncertainty and sensitivity analysis to support the application and evaluation of large exposure models where a significant number of parameters and complex exposure scenarios might be involved. The recently developed MERLIN-Expo exposure modelling tool was applied to probabilistically assess the ecological and human exposure to PCB 126 and 2,3,7,8-TCDD in the Venice lagoon (Italy). The 'Phytoplankton', 'Aquatic Invertebrate', 'Fish', 'Human intake' and PBPK models available in MERLIN-Expo library were integrated to create a specific food web to dynamically simulate bioaccumulation in various aquatic species and in the human body over individual lifetimes from 1932 until 1998. MERLIN-Expo is a high tier exposure modelling tool allowing propagation of uncertainty on the model predictions through Monte Carlo simulation. Uncertainty in model output can be further apportioned between parameters by applying built-in sensitivity analysis tools. In this study, uncertainty has been extensively addressed in the distribution functions to describe the data input and the effect on model results by applying sensitivity analysis techniques (screening Morris method, regression analysis, and variance-based method EFAST). In the exposure scenario developed for the Lagoon of Venice, the concentrations of 2,3,7,8-TCDD and PCB 126 in human blood turned out to be mainly influenced by a combination of parameters (half-lives of the chemicals, body weight variability, lipid fraction, food assimilation efficiency), physiological processes (uptake/elimination rates), environmental exposure concentrations (sediment, water, food) and eating behaviours (amount of food eaten). In conclusion, this case study demonstrated feasibility of MERLIN-Expo to be successfully employed in integrated, high tier exposure assessment. Copyright © 2016 Elsevier B.V. All rights reserved.
QuASAR: quantitative allele-specific analysis of reads.
Harvey, Chris T; Moyerbrailean, Gregory A; Davis, Gordon O; Wen, Xiaoquan; Luca, Francesca; Pique-Regi, Roger
2015-04-15
Expression quantitative trait loci (eQTL) studies have discovered thousands of genetic variants that regulate gene expression, enabling a better understanding of the functional role of non-coding sequences. However, eQTL studies are costly, requiring large sample sizes and genome-wide genotyping of each sample. In contrast, analysis of allele-specific expression (ASE) is becoming a popular approach to detect the effect of genetic variation on gene expression, even within a single individual. This is typically achieved by counting the number of RNA-seq reads matching each allele at heterozygous sites and testing the null hypothesis of a 1:1 allelic ratio. In principle, when genotype information is not readily available, it could be inferred from the RNA-seq reads directly. However, there are currently no existing methods that jointly infer genotypes and conduct ASE inference, while considering uncertainty in the genotype calls. We present QuASAR, quantitative allele-specific analysis of reads, a novel statistical learning method for jointly detecting heterozygous genotypes and inferring ASE. The proposed ASE inference step takes into consideration the uncertainty in the genotype calls, while including parameters that model base-call errors in sequencing and allelic over-dispersion. We validated our method with experimental data for which high-quality genotypes are available. Results for an additional dataset with multiple replicates at different sequencing depths demonstrate that QuASAR is a powerful tool for ASE analysis when genotypes are not available. http://github.com/piquelab/QuASAR. fluca@wayne.edu or rpique@wayne.edu Supplementary Material is available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Ilieva, T.; Iliev, I.; Pashov, A.
2016-12-01
In the traditional description of electronic states of diatomic molecules by means of molecular constants or Dunham coefficients, one of the important fitting parameters is the value of the zero point energy - the minimum of the potential curve or the energy of the lowest vibrational-rotational level - E00 . Their values are almost always the result of an extrapolation and it may be difficult to estimate their uncertainties, because they are connected not only with the uncertainty of the experimental data, but also with the distribution of experimentally observed energy levels and the particular realization of set of Dunham coefficients. This paper presents a comprehensive analysis based on Monte Carlo simulations, which aims to demonstrate the influence of all these factors on the uncertainty of the extrapolated minimum of the potential energy curve U (Re) and the value of E00 . The very good extrapolation properties of the Dunham coefficients are quantitatively confirmed and it is shown that for a proper estimate of the uncertainties, the ambiguity in the composition of the Dunham coefficients should be taken into account.
NASA Astrophysics Data System (ADS)
Arnold, J.; Gutmann, E. D.; Clark, M. P.; Nijssen, B.; Vano, J. A.; Addor, N.; Wood, A.; Newman, A. J.; Mizukami, N.; Brekke, L. D.; Rasmussen, R.; Mendoza, P. A.
2016-12-01
Climate change narratives for water-resource applications must represent the change signals contextualized by hydroclimatic process variability and uncertainty at multiple scales. Building narratives of plausible change includes assessing uncertainties across GCM structure, internal climate variability, climate downscaling methods, and hydrologic models. Work with this linked modeling chain has dealt mostly with GCM sampling directed separately to either model fidelity (does the model correctly reproduce the physical processes in the world?) or sensitivity (of different model responses to CO2 forcings) or diversity (of model type, structure, and complexity). This leaves unaddressed any interactions among those measures and with other components in the modeling chain used to identify water-resource vulnerabilities to specific climate threats. However, time-sensitive, real-world vulnerability studies typically cannot accommodate a full uncertainty ensemble across the whole modeling chain, so a gap has opened between current scientific knowledge and most routine applications for climate-changed hydrology. To close that gap, the US Army Corps of Engineers, the Bureau of Reclamation, and the National Center for Atmospheric Research are working on techniques to subsample uncertainties objectively across modeling chain components and to integrate results into quantitative hydrologic storylines of climate-changed futures. Importantly, these quantitative storylines are not drawn from a small sample of models or components. Rather, they stem from the more comprehensive characterization of the full uncertainty space for each component. Equally important from the perspective of water-resource practitioners, these quantitative hydrologic storylines are anchored in actual design and operations decisions potentially affected by climate change. This talk will describe part of our work characterizing variability and uncertainty across modeling chain components and their interactions using newly developed observational data, models and model outputs, and post-processing tools for making the resulting quantitative storylines most useful in practical hydrology applications.
Sixth international radiopharmaceutical dosimetry symposium: Proceedings. Volume 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
S.-Stelson, A.T.; Stabin, M.G.; Sparks, R.B.
1999-01-01
This conference was held May 7--10 in Gatlinburg, Tennessee. The purpose of this conference was to provide a multidisciplinary forum for exchange of state-of-the-art information on radiopharmaceutical dosimetry. Attention is focused on the following: quantitative analysis and treatment planning; cellular and small-scale dosimetry; dosimetric models; radiopharmaceutical kinetics and dosimetry; and animal models, extrapolation, and uncertainty.
The Third SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-3)
NASA Technical Reports Server (NTRS)
Hooker, Stanford B.; VanHeukelem, Laurei; Thomas, Crystal S.; Claustre, Herve; Ras, Josephine; Schluter, Louise; Clementson, Lesley; vanderLinde, Dirk; Eker-Develi, Elif; Berthon, Jean-Francois;
2009-01-01
Seven international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a mixed pigment sample. The field samples were collected primarily from oligotrophic waters, although mesotrophic and eutrophic waters were also sampled to create a dynamic range in chlorophyll concentration spanning approximately two orders of magnitude (0.020 1.366 mg m^{-3}) The intercomparisons were used to establish the following: a) the uncertainties in quantitating individual pigments and higher-order variables (sums, ratios, and indices); b) the reduction in uncertainties as a result of applying quality assurance (QA) procedures; c) the importance of establishing a properly defined referencing system in the computation of uncertainties; d) the analytical benefits of performance metrics, and e) the utility of a laboratory mix in understanding method performance. In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied.
Savel'eva, N B; Bykovskaia, N Iu; Dikunets, M A; Bolotov, S L; Rodchenkov, G M
2010-01-01
The objective of this study was to demonstrate the possibility to use deuterated compounds as internal standards for the quantitative analysis of morphine by gas chromatography with mass-selective detection for the purpose of doping control. The paper is focused on the problems associated with the use of deuterated morphine-D3 as the internal standard. Quantitative characteristics of the calibration dependence thus documented are presented along with uncertainty values obtained in the measurements with the use of deuterated morphine-D6. An approach to the assessment of method bias associated with the application of morphine-D6 as the deuterated internal standard is described.
Hill, Shannon B; Faradzhev, Nadir S; Powell, Cedric J
2017-12-01
We discuss the problem of quantifying common sources of statistical uncertainties for analyses of trace levels of surface contamination using X-ray photoelectron spectroscopy. We examine the propagation of error for peak-area measurements using common forms of linear and polynomial background subtraction including the correlation of points used to determine both background and peak areas. This correlation has been neglected in previous analyses, but we show that it contributes significantly to the peak-area uncertainty near the detection limit. We introduce the concept of relative background subtraction variance (RBSV) which quantifies the uncertainty introduced by the method of background determination relative to the uncertainty of the background area itself. The uncertainties of the peak area and atomic concentration and of the detection limit are expressed using the RBSV, which separates the contributions from the acquisition parameters, the background-determination method, and the properties of the measured spectrum. These results are then combined to find acquisition strategies that minimize the total measurement time needed to achieve a desired detection limit or atomic-percentage uncertainty for a particular trace element. Minimization of data-acquisition time is important for samples that are sensitive to x-ray dose and also for laboratories that need to optimize throughput.
NASA Technical Reports Server (NTRS)
Russell, John M.
1993-01-01
Many helium mass spectrometer leak detectors at KSC employ sampling systems that feature hand held sniffer probes. Authors of general leakage-testing literature recommend sniffer probes for leak location but not for quantitative leakage measurement. Their use in the latter application at KSC involves assumptions that may be subtle. The purpose of the research effort reported herein was to establish the significance of indicated leak rates displayed by sniffer-probe equipped leak detectors and to determine whether the use of alternative hardware or testing procedures may reduce the uncertainty of leakage measurements made with them. The report classifies probe-type sampling systems for helium leak detectors according to their internal plumbing (direct or branched), presents a basic analysis of the fluid dynamics in the sampling system in the branched-conduit case, describes the usual test method for measuring the internal supply-to-sample flowrate ratio (a.k.a permeation ratio), and describes a concept for a sponge-tipped probe whose external supply-to-sample flowrate ratio promises to be lower than that of a simple-ended probe. One conclusion is that the main source of uncertainty in the use of probe-type sampling systems for leakage measurement is uncertainty in the external supply-to-sample flowrate ratio. In contrast, the present method for measuring the internal supply-to-sample flowrate ratio is quantitative and satisfactory. The implication is that probes of lower external supply-to-sample flowrate ratio must be developed before this uncertainty may be reduced significantly.
Comparison of the uncertainties of several European low-dose calibration facilities
NASA Astrophysics Data System (ADS)
Dombrowski, H.; Cornejo Díaz, N. A.; Toni, M. P.; Mihelic, M.; Röttger, A.
2018-04-01
The typical uncertainty of a low-dose rate calibration of a detector, which is calibrated in a dedicated secondary national calibration laboratory, is investigated, including measurements in the photon field of metrology institutes. Calibrations at low ambient dose equivalent rates (at the level of the natural ambient radiation) are needed when environmental radiation monitors are to be characterised. The uncertainties of calibration measurements in conventional irradiation facilities above ground are compared with those obtained in a low-dose rate irradiation facility located deep underground. Four laboratories quantitatively evaluated the uncertainties of their calibration facilities, in particular for calibrations at low dose rates (250 nSv/h and 1 μSv/h). For the first time, typical uncertainties of European calibration facilities are documented in a comparison and the main sources of uncertainty are revealed. All sources of uncertainties are analysed, including the irradiation geometry, scattering, deviations of real spectra from standardised spectra, etc. As a fundamental metrological consequence, no instrument calibrated in such a facility can have a lower total uncertainty in subsequent measurements. For the first time, the need to perform calibrations at very low dose rates (< 100 nSv/h) deep underground is underpinned on the basis of quantitative data.
Characterizing spatial uncertainty when integrating social data in conservation planning.
Lechner, A M; Raymond, C M; Adams, V M; Polyakov, M; Gordon, A; Rhodes, J R; Mills, M; Stein, A; Ives, C D; Lefroy, E C
2014-12-01
Recent conservation planning studies have presented approaches for integrating spatially referenced social (SRS) data with a view to improving the feasibility of conservation action. We reviewed the growing conservation literature on SRS data, focusing on elicited or stated preferences derived through social survey methods such as choice experiments and public participation geographic information systems. Elicited SRS data includes the spatial distribution of willingness to sell, willingness to pay, willingness to act, and assessments of social and cultural values. We developed a typology for assessing elicited SRS data uncertainty which describes how social survey uncertainty propagates when projected spatially and the importance of accounting for spatial uncertainty such as scale effects and data quality. These uncertainties will propagate when elicited SRS data is integrated with biophysical data for conservation planning and may have important consequences for assessing the feasibility of conservation actions. To explore this issue further, we conducted a systematic review of the elicited SRS data literature. We found that social survey uncertainty was commonly tested for, but that these uncertainties were ignored when projected spatially. Based on these results we developed a framework which will help researchers and practitioners estimate social survey uncertainty and use these quantitative estimates to systematically address uncertainty within an analysis. This is important when using SRS data in conservation applications because decisions need to be made irrespective of data quality and well characterized uncertainty can be incorporated into decision theoretic approaches. © 2014 Society for Conservation Biology.
Different methodologies to quantify uncertainties of air emissions.
Romano, Daniela; Bernetti, Antonella; De Lauretis, Riccardo
2004-10-01
Characterization of the uncertainty associated with air emission estimates is of critical importance especially in the compilation of air emission inventories. In this paper, two different theories are discussed and applied to evaluate air emissions uncertainty. In addition to numerical analysis, which is also recommended in the framework of the United Nation Convention on Climate Change guidelines with reference to Monte Carlo and Bootstrap simulation models, fuzzy analysis is also proposed. The methodologies are discussed and applied to an Italian example case study. Air concentration values are measured from two electric power plants: a coal plant, consisting of two boilers and a fuel oil plant, of four boilers; the pollutants considered are sulphur dioxide (SO(2)), nitrogen oxides (NO(X)), carbon monoxide (CO) and particulate matter (PM). Monte Carlo, Bootstrap and fuzzy methods have been applied to estimate uncertainty of these data. Regarding Monte Carlo, the most accurate results apply to Gaussian distributions; a good approximation is also observed for other distributions with almost regular features either positive asymmetrical or negative asymmetrical. Bootstrap, on the other hand, gives a good uncertainty estimation for irregular and asymmetrical distributions. The logic of fuzzy analysis, where data are represented as vague and indefinite in opposition to the traditional conception of neatness, certain classification and exactness of the data, follows a different description. In addition to randomness (stochastic variability) only, fuzzy theory deals with imprecision (vagueness) of data. Fuzzy variance of the data set was calculated; the results cannot be directly compared with empirical data but the overall performance of the theory is analysed. Fuzzy theory may appear more suitable for qualitative reasoning than for a quantitative estimation of uncertainty, but it suits well when little information and few measurements are available and when distributions of data are not properly known.
Olea, R.A.; Luppens, J.A.; Tewalt, S.J.
2011-01-01
A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. ?? 2010.
Uncertainty Optimization Applied to the Monte Carlo Analysis of Planetary Entry Trajectories
NASA Technical Reports Server (NTRS)
Olds, John; Way, David
2001-01-01
Recently, strong evidence of liquid water under the surface of Mars and a meteorite that might contain ancient microbes have renewed interest in Mars exploration. With this renewed interest, NASA plans to send spacecraft to Mars approx. every 26 months. These future spacecraft will return higher-resolution images, make precision landings, engage in longer-ranging surface maneuvers, and even return Martian soil and rock samples to Earth. Future robotic missions and any human missions to Mars will require precise entries to ensure safe landings near science objective and pre-employed assets. Potential sources of water and other interesting geographic features are often located near hazards, such as within craters or along canyon walls. In order for more accurate landings to be made, spacecraft entering the Martian atmosphere need to use lift to actively control the entry. This active guidance results in much smaller landing footprints. Planning for these missions will depend heavily on Monte Carlo analysis. Monte Carlo trajectory simulations have been used with a high degree of success in recent planetary exploration missions. These analyses ascertain the impact of off-nominal conditions during a flight and account for uncertainty. Uncertainties generally stem from limitations in manufacturing tolerances, measurement capabilities, analysis accuracies, and environmental unknowns. Thousands of off-nominal trajectories are simulated by randomly dispersing uncertainty variables and collecting statistics on forecast variables. The dependability of Monte Carlo forecasts, however, is limited by the accuracy and completeness of the assumed uncertainties. This is because Monte Carlo analysis is a forward driven problem; beginning with the input uncertainties and proceeding to the forecasts outputs. It lacks a mechanism to affect or alter the uncertainties based on the forecast results. If the results are unacceptable, the current practice is to use an iterative, trial-and-error approach to reconcile discrepancies. Therefore, an improvement to the Monte Carlo analysis is needed that will allow the problem to be worked in reverse. In this way, the largest allowable dispersions that achieve the required mission objectives can be determined quantitatively.
Probabilistic Mass Growth Uncertainties
NASA Technical Reports Server (NTRS)
Plumer, Eric; Elliott, Darren
2013-01-01
Mass has been widely used as a variable input parameter for Cost Estimating Relationships (CER) for space systems. As these space systems progress from early concept studies and drawing boards to the launch pad, their masses tend to grow substantially, hence adversely affecting a primary input to most modeling CERs. Modeling and predicting mass uncertainty, based on historical and analogous data, is therefore critical and is an integral part of modeling cost risk. This paper presents the results of a NASA on-going effort to publish mass growth datasheet for adjusting single-point Technical Baseline Estimates (TBE) of masses of space instruments as well as spacecraft, for both earth orbiting and deep space missions at various stages of a project's lifecycle. This paper will also discusses the long term strategy of NASA Headquarters in publishing similar results, using a variety of cost driving metrics, on an annual basis. This paper provides quantitative results that show decreasing mass growth uncertainties as mass estimate maturity increases. This paper's analysis is based on historical data obtained from the NASA Cost Analysis Data Requirements (CADRe) database.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greg Ruskauff
2009-02-01
As prescribed in the Pahute Mesa Corrective Action Investigation Plan (CAIP) (DOE/NV, 1999) and Appendix VI of the Federal Facility Agreement and Consent Order (FFACO) (1996, as amended February 2008), the ultimate goal of transport analysis is to develop stochastic predictions of a contaminant boundary at a specified level of uncertainty. However, because of the significant uncertainty of the model results, the primary goal of this report was modified through mutual agreement between the DOE and the State of Nevada to assess the primary model components that contribute to this uncertainty and to postpone defining the contaminant boundary until additionalmore » model refinement is completed. Therefore, the role of this analysis has been to understand the behavior of radionuclide migration in the Pahute Mesa (PM) Corrective Action Unit (CAU) model and to define, both qualitatively and quantitatively, the sensitivity of such behavior to (flow) model conceptualization and (flow and transport) parameterization.« less
Bennett, Erin R; Clausen, Jay; Linkov, Eugene; Linkov, Igor
2009-11-01
Reliable, up-front information on physical and biological properties of emerging materials is essential before making a decision and investment to formulate, synthesize, scale-up, test, and manufacture a new material for use in both military and civilian applications. Multiple quantitative structure-activity relationships (QSARs) software tools are available for predicting a material's physical/chemical properties and environmental effects. Even though information on emerging materials is often limited, QSAR software output is treated without sufficient uncertainty analysis. We hypothesize that uncertainty and variability in material properties and uncertainty in model prediction can be too large to provide meaningful results. To test this hypothesis, we predicted octanol water partitioning coefficients (logP) for multiple, similar compounds with limited physical-chemical properties using six different commercial logP calculators (KOWWIN, MarvinSketch, ACD/Labs, ALogP, CLogP, SPARC). Analysis was done for materials with largely uncertain properties that were similar, based on molecular formula, to military compounds (RDX, BTTN, TNT) and pharmaceuticals (Carbamazepine, Gemfibrizol). We have also compared QSAR modeling results for a well-studied pesticide and pesticide breakdown product (Atrazine, DDE). Our analysis shows variability due to structural variations of the emerging chemicals may be several orders of magnitude. The model uncertainty across six software packages was very high (10 orders of magnitude) for emerging materials while it was low for traditional chemicals (e.g. Atrazine). Thus the use of QSAR models for emerging materials screening requires extensive model validation and coupling QSAR output with available empirical data and other relevant information.
Hyperspectral imaging spectro radiometer improves radiometric accuracy
NASA Astrophysics Data System (ADS)
Prel, Florent; Moreau, Louis; Bouchard, Robert; Bullis, Ritchie D.; Roy, Claude; Vallières, Christian; Levesque, Luc
2013-06-01
Reliable and accurate infrared characterization is necessary to measure the specific spectral signatures of aircrafts and associated infrared counter-measures protections (i.e. flares). Infrared characterization is essential to improve counter measures efficiency, improve friend-foe identification and reduce the risk of friendly fire. Typical infrared characterization measurement setups include a variety of panchromatic cameras and spectroradiometers. Each instrument brings essential information; cameras measure the spatial distribution of targets and spectroradiometers provide the spectral distribution of the emitted energy. However, the combination of separate instruments brings out possible radiometric errors and uncertainties that can be reduced with Hyperspectral imagers. These instruments combine both spectral and spatial information into the same data. These instruments measure both the spectral and spatial distribution of the energy at the same time ensuring the temporal and spatial cohesion of collected information. This paper presents a quantitative analysis of the main contributors of radiometric uncertainties and shows how a hyperspectral imager can reduce these uncertainties.
Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education
ERIC Educational Resources Information Center
Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen
2011-01-01
This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…
NASA Astrophysics Data System (ADS)
Foster, L. K.; Clark, B. R.; Duncan, L. L.; Tebo, D. T.; White, J.
2017-12-01
Several historical groundwater models exist within the Coastal Lowlands Aquifer System (CLAS), which spans the Gulf Coastal Plain in Texas, Louisiana, Mississippi, Alabama, and Florida. The largest of these models, called the Gulf Coast Regional Aquifer System Analysis (RASA) model, has been brought into a new framework using the Newton formulation for MODFLOW-2005 (MODFLOW-NWT) and serves as the starting point of a new investigation underway by the U.S. Geological Survey to improve understanding of the CLAS and provide predictions of future groundwater availability within an uncertainty quantification (UQ) framework. The use of an UQ framework will not only provide estimates of water-level observation worth, hydraulic parameter uncertainty, boundary-condition uncertainty, and uncertainty of future potential predictions, but it will also guide the model development process. Traditionally, model development proceeds from dataset construction to the process of deterministic history matching, followed by deterministic predictions using the model. This investigation will combine the use of UQ with existing historical models of the study area to assess in a quantitative framework the effect model package and property improvements have on the ability to represent past-system states, as well as the effect on the model's ability to make certain predictions of water levels, water budgets, and base-flow estimates. Estimates of hydraulic property information and boundary conditions from the existing models and literature, forming the prior, will be used to make initial estimates of model forecasts and their corresponding uncertainty, along with an uncalibrated groundwater model run within an unconstrained Monte Carlo analysis. First-Order Second-Moment (FOSM) analysis will also be used to investigate parameter and predictive uncertainty, and guide next steps in model development prior to rigorous history matching by using PEST++ parameter estimation code.
SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, D; Spaans, J; Kumaraswamy, L
Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty onmore » and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published reports or otherwise compared to the results of other users or vendors should clearly indicate whether the Measurement Uncertainty function is used.« less
Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph
2011-12-01
The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurement performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence interval and distribution of model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. As a result of computer biokinetic modelings, the mean, standard uncertainty, and confidence interval of model prediction calculated based on the model parameter uncertainty were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameter strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a certain extent depending on the strength of the correlation. In the case of model prediction, the qualitative comparison of the model predictions with the measured plasma and urinary data showed the HMGU model to be more reliable than the ICRP model; quantitatively, the uncertainty model prediction by the HMGU systemic biokinetic model is smaller than that of the ICRP model. The uncertainty information on the model parameters analyzed in this study was used in the second part of the paper regarding a sensitivity analysis of the Zr biokinetic models.
Chander, G.; Helder, D.L.; Aaron, David; Mishra, N.; Shrestha, A.K.
2013-01-01
Cross-calibration of satellite sensors permits the quantitative comparison of measurements obtained from different Earth Observing (EO) systems. Cross-calibration studies usually use simultaneous or near-simultaneous observations from several spaceborne sensors to develop band-by-band relationships through regression analysis. The investigation described in this paper focuses on evaluation of the uncertainties inherent in the cross-calibration process, including contributions due to different spectral responses, spectral resolution, spectral filter shift, geometric misregistrations, and spatial resolutions. The hyperspectral data from the Environmental Satellite SCanning Imaging Absorption SpectroMeter for Atmospheric CartograpHY and the EO-1 Hyperion, along with the relative spectral responses (RSRs) from the Landsat 7 Enhanced Thematic Mapper (TM) Plus and the Terra Moderate Resolution Imaging Spectroradiometer sensors, were used for the spectral uncertainty study. The data from Landsat 5 TM over five representative land cover types (desert, rangeland, grassland, deciduous forest, and coniferous forest) were used for the geometric misregistrations and spatial-resolution study. The spectral resolution uncertainty was found to be within 0.25%, spectral filter shift within 2.5%, geometric misregistrations within 0.35%, and spatial-resolution effects within 0.1% for the Libya 4 site. The one-sigma uncertainties presented in this paper are uncorrelated, and therefore, the uncertainties can be summed orthogonally. Furthermore, an overall total uncertainty was developed. In general, the results suggested that the spectral uncertainty is more dominant compared to other uncertainties presented in this paper. Therefore, the effect of the sensor RSR differences needs to be quantified and compensated to avoid large uncertainties in cross-calibration results.
NASA Astrophysics Data System (ADS)
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2016-04-01
Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
The Uncertainty of Local Background Magnetic Field Orientation in Anisotropic Plasma Turbulence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerick, F.; Saur, J.; Papen, M. von, E-mail: felix.gerick@uni-koeln.de
In order to resolve and characterize anisotropy in turbulent plasma flows, a proper estimation of the background magnetic field is crucially important. Various approaches to calculating the background magnetic field, ranging from local to globally averaged fields, are commonly used in the analysis of turbulent data. We investigate how the uncertainty in the orientation of a scale-dependent background magnetic field influences the ability to resolve anisotropy. Therefore, we introduce a quantitative measure, the angle uncertainty, that characterizes the uncertainty of the orientation of the background magnetic field that turbulent structures are exposed to. The angle uncertainty can be used asmore » a condition to estimate the ability to resolve anisotropy with certain accuracy. We apply our description to resolve the spectral anisotropy in fast solar wind data. We show that, if the angle uncertainty grows too large, the power of the turbulent fluctuations is attributed to false local magnetic field angles, which may lead to an incorrect estimation of the spectral indices. In our results, an apparent robustness of the spectral anisotropy to false local magnetic field angles is observed, which can be explained by a stronger increase of power for lower frequencies when the scale of the local magnetic field is increased. The frequency-dependent angle uncertainty is a measure that can be applied to any turbulent system.« less
Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina
2018-04-01
Subsurface energy activities entail the risk of induced seismicity including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand, the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear, less easy to understand and increased concern. Above all, the technology for which risks are communicated and its acceptance mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than in the DGE conditions. They also liked the risk communication overall less. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts in describing LPHC risks with numbers and optionally risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
NASA Astrophysics Data System (ADS)
Shen, Chengcheng; Shi, Honghua; Liu, Yongzhi; Li, Fen; Ding, Dewen
2016-07-01
Marine ecosystem dynamic models (MEDMs) are important tools for the simulation and prediction of marine ecosystems. This article summarizes the methods and strategies used for the improvement and assessment of MEDM skill, and it attempts to establish a technical framework to inspire further ideas concerning MEDM skill improvement. The skill of MEDMs can be improved by parameter optimization (PO), which is an important step in model calibration. An efficient approach to solve the problem of PO constrained by MEDMs is the global treatment of both sensitivity analysis and PO. Model validation is an essential step following PO, which validates the efficiency of model calibration by analyzing and estimating the goodness-of-fit of the optimized model. Additionally, by focusing on the degree of impact of various factors on model skill, model uncertainty analysis can supply model users with a quantitative assessment of model confidence. Research on MEDMs is ongoing; however, improvement in model skill still lacks global treatments and its assessment is not integrated. Thus, the predictive performance of MEDMs is not strong and model uncertainties lack quantitative descriptions, limiting their application. Therefore, a large number of case studies concerning model skill should be performed to promote the development of a scientific and normative technical framework for the improvement of MEDM skill.
NASA Astrophysics Data System (ADS)
Aleksankina, Ksenia; Heal, Mathew R.; Dore, Anthony J.; Van Oijen, Marcel; Reis, Stefan
2018-04-01
Atmospheric chemistry transport models (ACTMs) are widely used to underpin policy decisions associated with the impact of potential changes in emissions on future pollutant concentrations and deposition. It is therefore essential to have a quantitative understanding of the uncertainty in model output arising from uncertainties in the input pollutant emissions. ACTMs incorporate complex and non-linear descriptions of chemical and physical processes which means that interactions and non-linearities in input-output relationships may not be revealed through the local one-at-a-time sensitivity analysis typically used. The aim of this work is to demonstrate a global sensitivity and uncertainty analysis approach for an ACTM, using as an example the FRAME model, which is extensively employed in the UK to generate source-receptor matrices for the UK Integrated Assessment Model and to estimate critical load exceedances. An optimised Latin hypercube sampling design was used to construct model runs within ±40 % variation range for the UK emissions of SO2, NOx, and NH3, from which regression coefficients for each input-output combination and each model grid ( > 10 000 across the UK) were calculated. Surface concentrations of SO2, NOx, and NH3 (and of deposition of S and N) were found to be predominantly sensitive to the emissions of the respective pollutant, while sensitivities of secondary species such as HNO3 and particulate SO42-, NO3-, and NH4+ to pollutant emissions were more complex and geographically variable. The uncertainties in model output variables were propagated from the uncertainty ranges reported by the UK National Atmospheric Emissions Inventory for the emissions of SO2, NOx, and NH3 (±4, ±10, and ±20 % respectively). The uncertainties in the surface concentrations of NH3 and NOx and the depositions of NHx and NOy were dominated by the uncertainties in emissions of NH3, and NOx respectively, whilst concentrations of SO2 and deposition of SOy were affected by the uncertainties in both SO2 and NH3 emissions. Likewise, the relative uncertainties in the modelled surface concentrations of each of the secondary pollutant variables (NH4+, NO3-, SO42-, and HNO3) were due to uncertainties in at least two input variables. In all cases the spatial distribution of relative uncertainty was found to be geographically heterogeneous. The global methods used here can be applied to conduct sensitivity and uncertainty analyses of other ACTMs.
Development of a Probabilistic Tsunami Hazard Analysis in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toshiaki Sakai; Tomoyoshi Takeda; Hiroshi Soraoka
2006-07-01
It is meaningful for tsunami assessment to evaluate phenomena beyond the design basis as well as seismic design. Because once we set the design basis tsunami height, we still have possibilities tsunami height may exceeds the determined design tsunami height due to uncertainties regarding the tsunami phenomena. Probabilistic tsunami risk assessment consists of estimating for tsunami hazard and fragility of structures and executing system analysis. In this report, we apply a method for probabilistic tsunami hazard analysis (PTHA). We introduce a logic tree approach to estimate tsunami hazard curves (relationships between tsunami height and probability of excess) and present anmore » example for Japan. Examples of tsunami hazard curves are illustrated, and uncertainty in the tsunami hazard is displayed by 5-, 16-, 50-, 84- and 95-percentile and mean hazard curves. The result of PTHA will be used for quantitative assessment of the tsunami risk for important facilities located on coastal area. Tsunami hazard curves are the reasonable input data for structures and system analysis. However the evaluation method for estimating fragility of structures and the procedure of system analysis is now being developed. (authors)« less
Prioritizing Chemicals and Data Requirements for Screening-Level Exposure and Risk Assessment
Brown, Trevor N.; Wania, Frank; Breivik, Knut; McLachlan, Michael S.
2012-01-01
Background: Scientists and regulatory agencies strive to identify chemicals that may cause harmful effects to humans and the environment; however, prioritization is challenging because of the large number of chemicals requiring evaluation and limited data and resources. Objectives: We aimed to prioritize chemicals for exposure and exposure potential and obtain a quantitative perspective on research needs to better address uncertainty in screening assessments. Methods: We used a multimedia mass balance model to prioritize > 12,000 organic chemicals using four far-field human exposure metrics. The propagation of variance (uncertainty) in key chemical information used as model input for calculating exposure metrics was quantified. Results: Modeled human concentrations and intake rates span approximately 17 and 15 orders of magnitude, respectively. Estimates of exposure potential using human concentrations and a unit emission rate span approximately 13 orders of magnitude, and intake fractions span 7 orders of magnitude. The actual chemical emission rate contributes the greatest variance (uncertainty) in exposure estimates. The human biotransformation half-life is the second greatest source of uncertainty in estimated concentrations. In general, biotransformation and biodegradation half-lives are greater sources of uncertainty in modeled exposure and exposure potential than chemical partition coefficients. Conclusions: Mechanistic exposure modeling is suitable for screening and prioritizing large numbers of chemicals. By including uncertainty analysis and uncertainty in chemical information in the exposure estimates, these methods can help identify and address the important sources of uncertainty in human exposure and risk assessment in a systematic manner. PMID:23008278
NASA Astrophysics Data System (ADS)
Wang, S.; Huang, G. H.; Veawab, A.
2013-03-01
This study proposes a sequential factorial analysis (SFA) approach for supporting regional air quality management under uncertainty. SFA is capable not only of examining the interactive effects of input parameters, but also of analyzing the effects of constraints. When there are too many factors involved in practical applications, SFA has the advantage of conducting a sequence of factorial analyses for characterizing the effects of factors in a systematic manner. The factor-screening strategy employed in SFA is effective in greatly reducing the computational effort. The proposed SFA approach is applied to a regional air quality management problem for demonstrating its applicability. The results indicate that the effects of factors are evaluated quantitatively, which can help decision makers identify the key factors that have significant influence on system performance and explore the valuable information that may be veiled beneath their interrelationships.
Lidar arc scan uncertainty reduction through scanning geometry optimization
NASA Astrophysics Data System (ADS)
Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.; Brown, Gareth.
2016-04-01
Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.
Lidar arc scan uncertainty reduction through scanning geometry optimization
NASA Astrophysics Data System (ADS)
Wang, H.; Barthelmie, R. J.; Pryor, S. C.; Brown, G.
2015-10-01
Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annual energy production. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30 % of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. Large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation when arc scans are used for wind resource assessment.
NASA Astrophysics Data System (ADS)
Gan, Luping; Li, Yan-Feng; Zhu, Shun-Peng; Yang, Yuan-Jian; Huang, Hong-Zhong
2014-06-01
Failure mode, effects and criticality analysis (FMECA) and Fault tree analysis (FTA) are powerful tools to evaluate reliability of systems. Although single failure mode issue can be efficiently addressed by traditional FMECA, multiple failure modes and component correlations in complex systems cannot be effectively evaluated. In addition, correlated variables and parameters are often assumed to be precisely known in quantitative analysis. In fact, due to the lack of information, epistemic uncertainty commonly exists in engineering design. To solve these problems, the advantages of FMECA, FTA, fuzzy theory, and Copula theory are integrated into a unified hybrid method called fuzzy probability weighted geometric mean (FPWGM) risk priority number (RPN) method. The epistemic uncertainty of risk variables and parameters are characterized by fuzzy number to obtain fuzzy weighted geometric mean (FWGM) RPN for single failure mode. Multiple failure modes are connected using minimum cut sets (MCS), and Boolean logic is used to combine fuzzy risk priority number (FRPN) of each MCS. Moreover, Copula theory is applied to analyze the correlation of multiple failure modes in order to derive the failure probabilities of each MCS. Compared to the case where dependency among multiple failure modes is not considered, the Copula modeling approach eliminates the error of reliability analysis. Furthermore, for purpose of quantitative analysis, probabilities importance weight from failure probabilities are assigned to FWGM RPN to reassess the risk priority, which generalize the definition of probability weight and FRPN, resulting in a more accurate estimation than that of the traditional models. Finally, a basic fatigue analysis case drawn from turbine and compressor blades in aeroengine is used to demonstrate the effectiveness and robustness of the presented method. The result provides some important insights on fatigue reliability analysis and risk priority assessment of structural system under failure correlations.
Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D
2014-02-01
Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, samples in small volume (e.g. 1L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
Moses, Wesley J.; Bowles, Jeffrey H.; Corson, Michael R.
2015-01-01
Using simulated data, we investigated the effect of noise in a spaceborne hyperspectral sensor on the accuracy of the atmospheric correction of at-sensor radiances and the consequent uncertainties in retrieved water quality parameters. Specifically, we investigated the improvement expected as the F-number of the sensor is changed from 3.5, which is the smallest among existing operational spaceborne hyperspectral sensors, to 1.0, which is foreseeable in the near future. With the change in F-number, the uncertainties in the atmospherically corrected reflectance decreased by more than 90% across the visible-near-infrared spectrum, the number of pixels with negative reflectance (caused by over-correction) decreased to almost one-third, and the uncertainties in the retrieved water quality parameters decreased by more than 50% and up to 92%. The analysis was based on the sensor model of the Hyperspectral Imager for the Coastal Ocean (HICO) but using a 30-m spatial resolution instead of HICO’s 96 m. Atmospheric correction was performed using Tafkaa. Water quality parameters were retrieved using a numerical method and a semi-analytical algorithm. The results emphasize the effect of sensor noise on water quality parameter retrieval and the need for sensors with high Signal-to-Noise Ratio for quantitative remote sensing of optically complex waters. PMID:25781507
Feasibility study for the quantitative assessment of mineral resources in asteroids
Keszthelyi, Laszlo; Hagerty, Justin; Bowers, Amanda; Ellefsen, Karl; Ridley, Ian; King, Trude; Trilling, David; Moskovitz, Nicholas; Grundy, Will
2017-04-21
This study was undertaken to determine if the U.S. Geological Survey’s process for conducting mineral resource assessments on Earth can be applied to asteroids. Successful completion of the assessment, using water and iron resources to test the workflow, has resulted in identification of the minimal adjustments required to conduct full resource assessments beyond Earth. We also identify the types of future studies that would greatly reduce uncertainties in an actual future assessment. Whereas this is a feasibility study and does not include a complete and robust analysis of uncertainty, it is clear that the water and metal resources in near-Earth asteroids are sufficient to support humanity should it become a fully space-faring species.
Validating an Air Traffic Management Concept of Operation Using Statistical Modeling
NASA Technical Reports Server (NTRS)
He, Yuning; Davies, Misty Dawn
2013-01-01
Validating a concept of operation for a complex, safety-critical system (like the National Airspace System) is challenging because of the high dimensionality of the controllable parameters and the infinite number of states of the system. In this paper, we use statistical modeling techniques to explore the behavior of a conflict detection and resolution algorithm designed for the terminal airspace. These techniques predict the robustness of the system simulation to both nominal and off-nominal behaviors within the overall airspace. They also can be used to evaluate the output of the simulation against recorded airspace data. Additionally, the techniques carry with them a mathematical value of the worth of each prediction-a statistical uncertainty for any robustness estimate. Uncertainty Quantification (UQ) is the process of quantitative characterization and ultimately a reduction of uncertainties in complex systems. UQ is important for understanding the influence of uncertainties on the behavior of a system and therefore is valuable for design, analysis, and verification and validation. In this paper, we apply advanced statistical modeling methodologies and techniques on an advanced air traffic management system, namely the Terminal Tactical Separation Assured Flight Environment (T-TSAFE). We show initial results for a parameter analysis and safety boundary (envelope) detection in the high-dimensional parameter space. For our boundary analysis, we developed a new sequential approach based upon the design of computer experiments, allowing us to incorporate knowledge from domain experts into our modeling and to determine the most likely boundary shapes and its parameters. We carried out the analysis on system parameters and describe an initial approach that will allow us to include time-series inputs, such as the radar track data, into the analysis
LCA to choose among alternative design solutions: the case study of a new Italian incineration line.
Scipioni, A; Mazzi, A; Niero, M; Boatto, T
2009-09-01
At international level LCA is being increasingly used to objectively evaluate the performances of different Municipal Solid Waste (MSW) management solutions. One of the more important waste management options concerns MSW incineration. LCA is usually applied to existing incineration plants. In this study LCA methodology was applied to a new Italian incineration line, to facilitate the prediction, during the design phase, of its potential environmental impacts in terms of damage to human health, ecosystem quality and consumption of resources. The aim of the study was to analyse three different design alternatives: an incineration system with dry flue gas cleaning (without- and with-energy recovery) and one with wet flue gas cleaning. The last two technological solutions both incorporating facilities for energy recovery were compared. From the results of the study, the system with energy recovery and dry flue gas cleaning revealed lower environmental impacts in relation to the ecosystem quality. As LCA results are greatly affected by uncertainties of different types, the second part of the work provides for an uncertainty analysis aimed at detecting the extent output data from life cycle analysis are influenced by uncertainty of input data, and employs both qualitative (pedigree matrix) and quantitative methods (Monte Carlo analysis).
ERIC Educational Resources Information Center
Kramer, Michael W.; Dougherty, Debbie S.; Pierce, Tamyra A.
2004-01-01
This study examined pilots' (N at T1 = 140; N at T2 = 126; N at T3 = 104) reactions to communication and uncertainty during the acquisition of their airline by another airline. Quantitative results indicate that communication helped to reduce uncertainty and was predictive of affective responses to the acquisition. However, contrary to…
NASA Astrophysics Data System (ADS)
Albano, R.; Sole, A.; Mancusi, L.; Cantisani, A.; Perrone, A.
2017-12-01
The considerable increase of flood damages in the the past decades has shifted in Europe the attention from protection against floods to managing flood risks. In this context, the expected damages assessment represents a crucial information within the overall flood risk management process. The present paper proposes an open source software, called FloodRisk, that is able to operatively support stakeholders in the decision making processes with a what-if approach by carrying out the rapid assessment of the flood consequences, in terms of direct economic damage and loss of human lives. The evaluation of the damage scenarios, trough the use of the GIS software proposed here, is essential for cost-benefit or multi-criteria analysis of risk mitigation alternatives. However, considering that quantitative assessment of flood damages scenarios is characterized by intrinsic uncertainty, a scheme has been developed to identify and quantify the role of the input parameters in the total uncertainty of flood loss model application in urban areas with mild terrain and complex topography. By the concept of parallel models, the contribution of different module and input parameters to the total uncertainty is quantified. The results of the present case study have exhibited a high epistemic uncertainty on the damage estimation module and, in particular, on the type and form of the utilized damage functions, which have been adapted and transferred from different geographic and socio-economic contexts because there aren't depth-damage functions that are specifically developed for Italy. Considering that uncertainty and sensitivity depend considerably on local characteristics, the epistemic uncertainty associated with the risk estimate is reduced by introducing additional information into the risk analysis. In the light of the obtained results, it is evident the need to produce and disseminate (open) data to develop micro-scale vulnerability curves. Moreover, the urgent need to push forward research into the implementation of methods and models for the assimilation of uncertainties in decision-making processes emerges.
Birnbrauer, Kristina; Frohlich, Dennis Owen; Treise, Debbie
2017-09-01
West Nile Virus (WNV) has been reported as one of the worst epidemics in US history. This study sought to understand how WNV news stories were framed and how risk information was portrayed from its 1999 arrival in the US through the year 2012. The authors conducted a quantitative content analysis of online news articles obtained through Google News ( N = 428). The results of this analysis were compared to the CDC's ArboNET surveillance system. The following story frames were identified in this study: action, conflict, consequence, new evidence, reassurance and uncertainty, with the action frame appearing most frequently. Risk was communicated quantitatively without context in the majority of articles, and only in 2006, the year with the third-highest reported deaths, was risk reported with statistical accuracy. The results from the analysis indicated that at-risk communities were potentially under-informed as accurate risks were not communicated. This study offers evidence about how disease outbreaks are covered in relation to actual disease surveillance data.
A Comprehensive Analysis of Uncertainties Affecting the Stellar Mass-Halo Mass Relation for 0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behroozi, Peter S.; Conroy, Charlie; Wechsler, Risa H.
2010-06-07
We conduct a comprehensive analysis of the relationship between central galaxies and their host dark matter halos, as characterized by the stellar mass - halo mass (SM-HM) relation, with rigorous consideration of uncertainties. Our analysis focuses on results from the abundance matching technique, which assumes that every dark matter halo or subhalo above a specific mass threshold hosts one galaxy. We provide a robust estimate of the SM-HM relation for 0 < z < 1 and discuss the quantitative effects of uncertainties in observed galaxy stellar mass functions (GSMFs) (including stellar mass estimates and counting uncertainties), halo mass functions (includingmore » cosmology and uncertainties from substructure), and the abundance matching technique used to link galaxies to halos (including scatter in this connection). Our analysis results in a robust estimate of the SM-HM relation and its evolution from z=0 to z=4. The shape and evolution are well constrained for z < 1. The largest uncertainties at these redshifts are due to stellar mass estimates (0.25 dex uncertainty in normalization); however, failure to account for scatter in stellar masses at fixed halo mass can lead to errors of similar magnitude in the SM-HM relation for central galaxies in massive halos. We also investigate the SM-HM relation to z = 4, although the shape of the relation at higher redshifts remains fairly unconstrained when uncertainties are taken into account. We find that the integrated star formation at a given halo mass peaks at 10-20% of available baryons for all redshifts from 0 to 4. This peak occurs at a halo mass of 7 x 10{sup 11} M{sub {circle_dot}} at z = 0 and this mass increases by a factor of 5 to z = 4. At lower and higher masses, star formation is substantially less efficient, with stellar mass scaling as M{sub *} {approx} M{sub h}{sup 2.3} at low masses and M{sub *} {approx} M{sub h}{sup 0.29} at high masses. The typical stellar mass for halos with mass less than 10{sup 12} M{sub {circle_dot}} has increased by 0.3-0.45 dex for halos since z {approx} 1. These results will provide a powerful tool to inform galaxy evolution models.« less
Trust Measurement using Multimodal Behavioral Analysis and Uncertainty Aware Trust Calibration
2018-01-05
to estimate their performance based on their estimation on all prior trials. In the meanwhile via comparing the decisions of participants with the...it is easier compared with situations when more trials have been done. It should be noted that if a participant is good at memorizing the previous...them. The proposed study, being quantitative and explorative, are expected to reveal a number of findings that benefit interaction system design and
Hock, Sabrina; Hasenauer, Jan; Theis, Fabian J
2013-01-01
Diffusion is a key component of many biological processes such as chemotaxis, developmental differentiation and tissue morphogenesis. Since recently, the spatial gradients caused by diffusion can be assessed in-vitro and in-vivo using microscopy based imaging techniques. The resulting time-series of two dimensional, high-resolutions images in combination with mechanistic models enable the quantitative analysis of the underlying mechanisms. However, such a model-based analysis is still challenging due to measurement noise and sparse observations, which result in uncertainties of the model parameters. We introduce a likelihood function for image-based measurements with log-normal distributed noise. Based upon this likelihood function we formulate the maximum likelihood estimation problem, which is solved using PDE-constrained optimization methods. To assess the uncertainty and practical identifiability of the parameters we introduce profile likelihoods for diffusion processes. As proof of concept, we model certain aspects of the guidance of dendritic cells towards lymphatic vessels, an example for haptotaxis. Using a realistic set of artificial measurement data, we estimate the five kinetic parameters of this model and compute profile likelihoods. Our novel approach for the estimation of model parameters from image data as well as the proposed identifiability analysis approach is widely applicable to diffusion processes. The profile likelihood based method provides more rigorous uncertainty bounds in contrast to local approximation methods.
On the Applications of IBA Techniques to Biological Samples Analysis: PIXE and RBS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Falcon-Gonzalez, J. M.; Bernal-Alvarado, J.; Sosa, M.
2008-08-11
The analytical techniques based on ion beams or IBA techniques give quantitative information on elemental concentration in samples of a wide variety of nature. In this work, we focus on PIXE technique, analyzing thick target biological specimens (TTPIXE), using 3 MeV protons produced by an electrostatic accelerator. A nuclear microprobe was used performing PIXE and RBS simultaneously, in order to solve the uncertainties produced in the absolute PIXE quantifying. The advantages of using both techniques and a nuclear microprobe are discussed. Quantitative results are shown to illustrate the multielemental resolution of the PIXE technique; for this, a blood standard wasmore » used.« less
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-05
.... This draft document describes the quantitative analyses that are being conducted as part of the review... primary (health-based) CO NAAQS, the Agency is conducting qualitative and quantitative assessments... results, observations, and related uncertainties associated with the quantitative analyses performed. An...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-19
... Review Draft. These two draft assessment documents describe the quantitative analyses the EPA is... NAAQS,\\3\\ the Agency is conducting quantitative assessments characterizing the: (1) Health risks... present the initial key results, observations, and related uncertainties associated with the quantitative...
NASA Astrophysics Data System (ADS)
Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.
2016-12-01
This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120 member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multiple-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across different SA methods. We found that 5 out of 20 parameters contribute more than 90% total variance, and first-order effects dominate comparing to the interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for improving the model physics parameterizations.
Bio-physical vs. Economic Uncertainty in the Analysis of Climate Change Impacts on World Agriculture
NASA Astrophysics Data System (ADS)
Hertel, T. W.; Lobell, D. B.
2010-12-01
Accumulating evidence suggests that agricultural production could be greatly affected by climate change, but there remains little quantitative understanding of how these agricultural impacts would affect economic livelihoods in poor countries. The recent paper by Hertel, Burke and Lobell (GEC, 2010) considers three scenarios of agricultural impacts of climate change, corresponding to the fifth, fiftieth, and ninety fifth percentiles of projected yield distributions for the world’s crops in 2030. They evaluate the resulting changes in global commodity prices, national economic welfare, and the incidence of poverty in a set of 15 developing countries. Although the small price changes under the medium scenario are consistent with previous findings, their low productivity scenario reveals the potential for much larger food price changes than reported in recent studies which have hitherto focused on the most likely outcomes. The poverty impacts of price changes under the extremely adverse scenario are quite heterogeneous and very significant in some population strata. They conclude that it is critical to look beyond central case climate shocks and beyond a simple focus on yields and highly aggregated poverty impacts. In this paper, we conduct a more formal, systematic sensitivity analysis (SSA) with respect to uncertainty in the biophysical impacts of climate change on agriculture, by explicitly specifying joint distributions for global yield changes - this time focusing on 2050. This permits us to place confidence intervals on the resulting price impacts and poverty results which reflect the uncertainty inherited from the biophysical side of the analysis. We contrast this with the economic uncertainty inherited from the global general equilibrium model (GTAP), by undertaking SSA with respect to the behavioral parameters in that model. This permits us to assess which type of uncertainty is more important for regional price and poverty outcomes. Finally, we undertake a combined SSA, wherein climate change-induced productivity shocks are permitted to interact with the uncertain economic parameters. This permits us to examine potential interactions between the two sources of uncertainty.
Tarn, Derjung M; Paterniti, Debora A; Wenger, Neil S
2016-08-01
Little is known about how providers communicate recommendations when scientific uncertainty exists. To compare provider recommendations to those in the scientific literature, with a focus on whether uncertainty was communicated. Qualitative (inductive systematic content analysis) and quantitative analysis of previously collected audio-recorded provider-patient office visits. Sixty-one providers and a socio-economically diverse convenience sample of 603 of their patients from outpatient community- and academic-based primary care, integrative medicine, and complementary and alternative medicine provider offices in Southern California. Comparison of provider information-giving about vitamin D to professional guidelines and scientific information for which conflicting recommendations or insufficient scientific evidence exists; certainty with which information was conveyed. Ninety-two (15.3 %) of 603 visit discussions touched upon issues related to vitamin D testing, management and benefits. Vitamin D deficiency screening was discussed with 23 (25 %) patients, the definition of vitamin D deficiency with 21 (22.8 %), the optimal range for vitamin D levels with 26 (28.3 %), vitamin D supplementation dosing with 50 (54.3 %), and benefits of supplementation with 46 (50 %). For each of the professional guidelines/scientific information examined, providers conveyed information that deviated from professional guidelines and the existing scientific evidence. Of 166 statements made about vitamin D in this study, providers conveyed 160 (96.4 %) with certainty, without mention of any equivocal or contradictory evidence in the scientific literature. No uncertainty was mentioned when vitamin D dosing was discussed, even when recommended dosing was higher than guideline recommendations. Providers convey the vast majority of information and recommendations about vitamin D with certainty, even though the scientific literature contains inconsistent recommendations and declarations of inadequate evidence. Not communicating uncertainty blurs the contrast between evidence-based recommendations and those without evidence. Providers should explore best practices for involving patients in decision-making by acknowledging the uncertainty behind their recommendations.
Assessing uncertainty in published risk estimates using ...
Introduction: The National Research Council recommended quantitative evaluation of uncertainty in effect estimates for risk assessment. This analysis considers uncertainty across model forms and model parameterizations with hexavalent chromium [Cr(VI)] and lung cancer mortality as an example. The objective is to characterize model uncertainty by evaluating estimates across published epidemiologic studies of the same cohort.Methods: This analysis was based on 5 studies analyzing a cohort of 2,357 workers employed from 1950-74 in a chromate production plant in Maryland. Cox and Poisson models were the only model forms considered by study authors to assess the effect of Cr(VI) on lung cancer mortality. All models adjusted for smoking and included a 5-year exposure lag, however other latency periods and model covariates such as age and race were considered. Published effect estimates were standardized to the same units and normalized by their variances to produce a standardized metric to compare variability within and between model forms. A total of 5 similarly parameterized analyses were considered across model form, and 16 analyses with alternative parameterizations were considered within model form (10 Cox; 6 Poisson). Results: Across Cox and Poisson model forms, adjusted cumulative exposure coefficients (betas) for 5 similar analyses ranged from 2.47 to 4.33 (mean=2.97, σ2=0.63). Within the 10 Cox models, coefficients ranged from 2.53 to 4.42 (mean=3.29, σ2=0.
Location error uncertainties - an advanced using of probabilistic inverse theory
NASA Astrophysics Data System (ADS)
Debski, Wojciech
2016-04-01
The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analyzed in many branches of physics, including seismology, oceanology, to name a few. Source activity and its spatial variability in time, the geometry of recording network, the complexity and heterogeneity of wave velocity distribution are all factors influencing the performance of location algorithms and accuracy of the achieved results. While estimating of the earthquake foci location is relatively simple a quantitative estimation of the location accuracy is really a challenging task even if the probabilistic inverse method is used because it requires knowledge of statistics of observational, modelling, and apriori uncertainties. In this presentation we addressed this task when statistics of observational and/or modeling errors are unknown. This common situation requires introduction of apriori constraints on the likelihood (misfit) function which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland we illustrate an approach based on an analysis of Shanon's entropy calculated for the aposteriori distribution. We show that this meta-characteristic of the aposteriori distribution carries some information on uncertainties of the solution found.
Assessment of the magnitude of ammonia emissions in the United Kingdom
NASA Astrophysics Data System (ADS)
Sutton, M. A.; Place, C. J.; Eager, M.; Fowler, D.; Smith, R. I.
Estimates of ammonia emission in the U.K. have been critically reviewed with the aim of establishing the magnitude and uncertainty of each of the sources. European studies are also reviewed, with the U.K. providing a useful case study to highlight the uncertainties common to all ammonia emission inventories. This analysis of the emission factors and their application to U.K. sources supports an emission of 450 (231-715) Gg NH 3 yr -1. Agricultural activities are confirmed as the major source, providing 406 (215-630) Gg NH 3yr -1 (90% of the total), and therefore dominate uncertainties. Non-agricultural sources include sewage, pets, horses, humans, combustion and wild animals, though these contribute only 44 (16-85) Gg yr -1. Cattle represent the largest single uncertainty, accounting for 245 (119-389) Gg yr -1. The major uncertainties for cattle derive from estimation of the amount of nitrogen (N) excreted, the % N volatilized from land spreading of wastes, and the % N volatilized from stored farm-yard manure. Similar relative uncertainties apply to each of sheep, pigs and poultry, as well as fertilized crops, though these are quantitatively less important. Accounting; for regional differences in livestock demography, emission of 347, 63 and 40 Gg yr -1 are estimated for England & Wales, Scotland, and Northern Ireland, respectively. Though very uncertain, the total is in good agreement with estimates required to balance the U.K. atmospheric NH. budget.
Incorporating uncertainty into medical decision making: an approach to unexpected test results.
Bianchi, Matt T; Alexander, Brian M; Cash, Sydney S
2009-01-01
The utility of diagnostic tests derives from the ability to translate the population concepts of sensitivity and specificity into information that will be useful for the individual patient: the predictive value of the result. As the array of available diagnostic testing broadens, there is a temptation to de-emphasize history and physical findings and defer to the objective rigor of technology. However, diagnostic test interpretation is not always straightforward. One significant barrier to routine use of probability-based test interpretation is the uncertainty inherent in pretest probability estimation, the critical first step of Bayesian reasoning. The context in which this uncertainty presents the greatest challenge is when test results oppose clinical judgment. It is this situation when decision support would be most helpful. The authors propose a simple graphical approach that incorporates uncertainty in pretest probability and has specific application to the interpretation of unexpected results. This method quantitatively demonstrates how uncertainty in disease probability may be amplified when test results are unexpected (opposing clinical judgment), even for tests with high sensitivity and specificity. The authors provide a simple nomogram for determining whether an unexpected test result suggests that one should "switch diagnostic sides.'' This graphical framework overcomes the limitation of pretest probability uncertainty in Bayesian analysis and guides decision making when it is most challenging: interpretation of unexpected test results.
How to deal with climate change uncertainty in the planning of engineering systems
NASA Astrophysics Data System (ADS)
Spackova, Olga; Dittes, Beatrice; Straub, Daniel
2016-04-01
The effect of extreme events such as floods on the infrastructure and built environment is associated with significant uncertainties: These include the uncertain effect of climate change, uncertainty on extreme event frequency estimation due to limited historic data and imperfect models, and, not least, uncertainty on future socio-economic developments, which determine the damage potential. One option for dealing with these uncertainties is the use of adaptable (flexible) infrastructure that can easily be adjusted in the future without excessive costs. The challenge is in quantifying the value of adaptability and in finding the optimal sequence of decision. Is it worth to build a (potentially more expensive) adaptable system that can be adjusted in the future depending on the future conditions? Or is it more cost-effective to make a conservative design without counting with the possible future changes to the system? What is the optimal timing of the decision to build/adjust the system? We develop a quantitative decision-support framework for evaluation of alternative infrastructure designs under uncertainties, which: • probabilistically models the uncertain future (trough a Bayesian approach) • includes the adaptability of the systems (the costs of future changes) • takes into account the fact that future decisions will be made under uncertainty as well (using pre-posterior decision analysis) • allows to identify the optimal capacity and optimal timing to build/adjust the infrastructure. Application of the decision framework will be demonstrated on an example of flood mitigation planning in Bavaria.
Mueller, David S.
2017-01-01
This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
Fienen, Michael N.; Doherty, John E.; Hunt, Randall J.; Reeves, Howard W.
2010-01-01
The importance of monitoring networks for resource-management decisions is becoming more recognized, in both theory and application. Quantitative computer models provide a science-based framework to evaluate the efficacy and efficiency of existing and possible future monitoring networks. In the study described herein, two suites of tools were used to evaluate the worth of new data for specific predictions, which in turn can support efficient use of resources needed to construct a monitoring network. The approach evaluates the uncertainty of a model prediction and, by using linear propagation of uncertainty, estimates how much uncertainty could be reduced if the model were calibrated with addition information (increased a priori knowledge of parameter values or new observations). The theoretical underpinnings of the two suites of tools addressing this technique are compared, and their application to a hypothetical model based on a local model inset into the Great Lakes Water Availability Pilot model are described. Results show that meaningful guidance for monitoring network design can be obtained by using the methods explored. The validity of this guidance depends substantially on the parameterization as well; hence, parameterization must be considered not only when designing the parameter-estimation paradigm but also-importantly-when designing the prediction-uncertainty paradigm.
Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout
NASA Astrophysics Data System (ADS)
Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner
2014-12-01
During volcanic crises, volcanologists estimate the impact of possible imminent eruptions usually through deterministic modeling of the effects of one or a few preestablished scenarios. Despite such an approach may bring an important information to the decision makers, the sole use of deterministic scenarios does not allow scientists to properly take into consideration all uncertainties, and it cannot be used to assess quantitatively the risk because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term near-real-time probabilistic volcanic hazard analysis formulated for any potential hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting in a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties and enabling rational decision making under uncertainty.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Abstract. Recently, a class of no-gold-standard (NGS) techniques have been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from F-Fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
Schram-Bijkerk, D; van Kempen, E; Knol, A B; Kruize, H; Staatsen, B; van Kamp, I
2009-10-01
Few quantitative health impact assessments (HIAs) of transport policies have been published so far and there is a lack of a common methodology for such assessments. To evaluate the usability of existing HIA methodology to quantify health effects of transport policies at the local level. Health impact of two simulated but realistic transport interventions - speed limit reduction and traffic re-allocation - was quantified by selecting traffic-related exposures and health endpoints, modelling of population exposure, selecting exposure-effect relations and estimating the number of local traffic-related cases and disease burden, expressed in disability-adjusted life-years (DALYs), before and after the intervention. Exposure information was difficult to retrieve because of the local scale of the interventions, and exposure-effect relations for subgroups and combined effects were missing. Given uncertainty in the outcomes originating from this kind of missing information, simulated changes in population health by two local traffic interventions were estimated to be small (<5%), except for the estimated reduction in DALYs by less traffic accidents (60%) due to speed limit reduction. Quantitative HIA of transport policies at a local scale is possible, provided that data on exposures, the exposed population and their baseline health status are available. The interpretation of the HIA information should be carried out in the context of the quality of input data and assumptions and uncertainties of the analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit
2016-06-01
Transposition models are widely used in the solar energy industry to simulate solar radiation on inclined photovoltaic (PV) panels. These transposition models have been developed using various assumptions about the distribution of the diffuse radiation, and most of the parameterizations in these models have been developed using hourly ground data sets. Numerous studies have compared the performance of transposition models, but this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty using high-resolution ground measurements in the plane of array. Our results suggest that the amount of aerosol optical depthmore » can affect the accuracy of isotropic models. The choice of empirical coefficients and the use of decomposition models can both result in uncertainty in the output from the transposition models. It is expected that the results of this study will ultimately lead to improvements of the parameterizations as well as the development of improved physical models.« less
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual modelmore » parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.« less
Lidar arc scan uncertainty reduction through scanning geometry optimization
Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.; ...
2016-04-13
Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annualmore » energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30% of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. As a result, large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.« less
Lidar arc scan uncertainty reduction through scanning geometry optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Hui; Barthelmie, Rebecca J.; Pryor, Sara C.
Doppler lidars are frequently operated in a mode referred to as arc scans, wherein the lidar beam scans across a sector with a fixed elevation angle and the resulting measurements are used to derive an estimate of the n minute horizontal mean wind velocity (speed and direction). Previous studies have shown that the uncertainty in the measured wind speed originates from turbulent wind fluctuations and depends on the scan geometry (the arc span and the arc orientation). This paper is designed to provide guidance on optimal scan geometries for two key applications in the wind energy industry: wind turbine power performance analysis and annualmore » energy production prediction. We present a quantitative analysis of the retrieved wind speed uncertainty derived using a theoretical model with the assumption of isotropic and frozen turbulence, and observations from three sites that are onshore with flat terrain, onshore with complex terrain and offshore, respectively. The results from both the theoretical model and observations show that the uncertainty is scaled with the turbulence intensity such that the relative standard error on the 10 min mean wind speed is about 30% of the turbulence intensity. The uncertainty in both retrieved wind speeds and derived wind energy production estimates can be reduced by aligning lidar beams with the dominant wind direction, increasing the arc span and lowering the number of beams per arc scan. As a result, large arc spans should be used at sites with high turbulence intensity and/or large wind direction variation.« less
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
Urrego-Blanco, Jorge Rolando; Urban, Nathan Mark; Hunke, Elizabeth Clare; ...
2016-04-01
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual modelmore » parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. Lastly, it is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.« less
Uncertainty quantification and global sensitivity analysis of the Los Alamos sea ice model
NASA Astrophysics Data System (ADS)
Urrego-Blanco, Jorge R.; Urban, Nathan M.; Hunke, Elizabeth C.; Turner, Adrian K.; Jeffery, Nicole
2016-04-01
Changes in the high-latitude climate system have the potential to affect global climate through feedbacks with the atmosphere and connections with midlatitudes. Sea ice and climate models used to understand these changes have uncertainties that need to be characterized and quantified. We present a quantitative way to assess uncertainty in complex computer models, which is a new approach in the analysis of sea ice models. We characterize parametric uncertainty in the Los Alamos sea ice model (CICE) in a standalone configuration and quantify the sensitivity of sea ice area, extent, and volume with respect to uncertainty in 39 individual model parameters. Unlike common sensitivity analyses conducted in previous studies where parameters are varied one at a time, this study uses a global variance-based approach in which Sobol' sequences are used to efficiently sample the full 39-dimensional parameter space. We implement a fast emulator of the sea ice model whose predictions of sea ice extent, area, and volume are used to compute the Sobol' sensitivity indices of the 39 parameters. Main effects and interactions among the most influential parameters are also estimated by a nonparametric regression technique based on generalized additive models. A ranking based on the sensitivity indices indicates that model predictions are most sensitive to snow parameters such as snow conductivity and grain size, and the drainage of melt ponds. It is recommended that research be prioritized toward more accurately determining these most influential parameter values by observational studies or by improving parameterizations in the sea ice model.
Neutron multiplicity counting: Confidence intervals for reconstruction parameters
Verbeke, Jerome M.
2016-03-09
From nuclear materials accountability to homeland security, the need for improved nuclear material detection, assay, and authentication has grown over the past decades. Starting in the 1940s, neutron multiplicity counting techniques have enabled quantitative evaluation of masses and multiplications of fissile materials. In this paper, we propose a new method to compute uncertainties on these parameters using a model-based sequential Bayesian processor, resulting in credible regions in the fissile material mass and multiplication space. These uncertainties will enable us to evaluate quantitatively proposed improvements to the theoretical fission chain model. Additionally, because the processor can calculate uncertainties in real time,more » it is a useful tool in applications such as portal monitoring: monitoring can stop as soon as a preset confidence of non-threat is reached.« less
Code of Federal Regulations, 2013 CFR
2013-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2012 CFR
2012-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
Code of Federal Regulations, 2014 CFR
2014-01-01
... robust analytical methods. The Department seeks to use qualitative and quantitative analytical methods... uncertainties will be carried forward in subsequent analyses. The use of quantitative models will be... manufacturers and other interested parties. The use of quantitative models will be supplemented by qualitative...
NASA Astrophysics Data System (ADS)
Stockton, T. B.; Black, P. K.; Catlett, K. M.; Tauxe, J. D.
2002-05-01
Environmental modeling is an essential component in the evaluation of regulatory compliance of radioactive waste management sites (RWMSs) at the Nevada Test Site in southern Nevada, USA. For those sites that are currently operating, further goals are to support integrated decision analysis for the development of acceptance criteria for future wastes, as well as site maintenance, closure, and monitoring. At these RWMSs, the principal pathways for release of contamination to the environment are upward towards the ground surface rather than downwards towards the deep water table. Biotic processes, such as burrow excavation and plant uptake and turnover, dominate this upward transport. A combined multi-pathway contaminant transport and risk assessment model was constructed using the GoldSim modeling platform. This platform facilitates probabilistic analysis of environmental systems, and is especially well suited for assessments involving radionuclide decay chains. The model employs probabilistic definitions of key parameters governing contaminant transport, with the goals of quantifying cumulative uncertainty in the estimation of performance measures and providing information necessary to perform sensitivity analyses. This modeling differs from previous radiological performance assessments (PAs) in that the modeling parameters are intended to be representative of the current knowledge, and the uncertainty in that knowledge, of parameter values rather than reflective of a conservative assessment approach. While a conservative PA may be sufficient to demonstrate regulatory compliance, a parametrically honest PA can also be used for more general site decision-making. In particular, a parametrically honest probabilistic modeling approach allows both uncertainty and sensitivity analyses to be explicitly coupled to the decision framework using a single set of model realizations. For example, sensitivity analysis provides a guide for analyzing the value of collecting more information by quantifying the relative importance of each input parameter in predicting the model response. However, in these complex, high dimensional eco-system models, represented by the RWMS model, the dynamics of the systems can act in a non-linear manner. Quantitatively assessing the importance of input variables becomes more difficult as the dimensionality, the non-linearities, and the non-monotonicities of the model increase. Methods from data mining such as Multivariate Adaptive Regression Splines (MARS) and the Fourier Amplitude Sensitivity Test (FAST) provide tools that can be used in global sensitivity analysis in these high dimensional, non-linear situations. The enhanced interpretability of model output provided by the quantitative measures estimated by these global sensitivity analysis tools will be demonstrated using the RWMS model.
A Quantitative Approach to Analyzing Architectures in the Presence of Uncertainty
2009-07-01
SAR) 18. NUMBER OF PAGES 33 19a. NAME OF RESPONSIBLE PERSON a. REPORT unclassified b . ABSTRACT unclassified c. THIS PAGE unclassified Standard...hence requires appropriate tool support. 3 3.1 Architecture Modeling To facilitate this form of modeling, the modeling language must allow the archi ...can (a) capture the steady-state behavior of the model, ( b ) allow for the analysis of some property in the context of a specific state or condition
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
The impacts of uncertainty and variability in groundwater-driven health risk assessment. (Invited)
NASA Astrophysics Data System (ADS)
Maxwell, R. M.
2010-12-01
Potential human health risk from contaminated groundwater is becoming an important, quantitative measure used in management decisions in a range of applications from Superfund to CO2 sequestration. Quantitatively assessing the potential human health risks from contaminated groundwater is challenging due to the many coupled processes, uncertainty in transport parameters and the variability in individual physiology and behavior. Perspective on human health risk assessment techniques will be presented and a framework used to predict potential, increased human health risk from contaminated groundwater will be discussed. This framework incorporates transport of contaminants through the subsurface from source to receptor and health risks to individuals via household exposure pathways. The subsurface is shown subject to both physical and chemical heterogeneity which affects downstream concentrations at receptors. Cases are presented where hydraulic conductivity can exhibit both uncertainty and spatial variability in addition to situations where hydraulic conductivity is the dominant source of uncertainty in risk assessment. Management implications, such as characterization and remediation will also be discussed.
Andreozzi, Stefano; Miskovic, Ljubisa; Hatzimanikatis, Vassily
2016-01-01
Accurate determination of physiological states of cellular metabolism requires detailed information about metabolic fluxes, metabolite concentrations and distribution of enzyme states. Integration of fluxomics and metabolomics data, and thermodynamics-based metabolic flux analysis contribute to improved understanding of steady-state properties of metabolism. However, knowledge about kinetics and enzyme activities though essential for quantitative understanding of metabolic dynamics remains scarce and involves uncertainty. Here, we present a computational methodology that allow us to determine and quantify the kinetic parameters that correspond to a certain physiology as it is described by a given metabolic flux profile and a given metabolite concentration vector. Though we initially determine kinetic parameters that involve a high degree of uncertainty, through the use of kinetic modeling and machine learning principles we are able to obtain more accurate ranges of kinetic parameters, and hence we are able to reduce the uncertainty in the model analysis. We computed the distribution of kinetic parameters for glucose-fed E. coli producing 1,4-butanediol and we discovered that the observed physiological state corresponds to a narrow range of kinetic parameters of only a few enzymes, whereas the kinetic parameters of other enzymes can vary widely. Furthermore, this analysis suggests which are the enzymes that should be manipulated in order to engineer the reference state of the cell in a desired way. The proposed approach also sets up the foundations of a novel type of approaches for efficient, non-asymptotic, uniform sampling of solution spaces. Copyright © 2015 International Metabolic Engineering Society. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Bloom, A. Anthony; Bowman, Kevin W.; Lee, Meemong; Turner, Alexander J.; Schroeder, Ronny; Worden, John R.; Weidner, Richard; McDonald, Kyle C.; Jacob, Daniel J.
2017-06-01
Wetland emissions remain one of the principal sources of uncertainty in the global atmospheric methane (CH4) budget, largely due to poorly constrained process controls on CH4 production in waterlogged soils. Process-based estimates of global wetland CH4 emissions and their associated uncertainties can provide crucial prior information for model-based top-down CH4 emission estimates. Here we construct a global wetland CH4 emission model ensemble for use in atmospheric chemical transport models (WetCHARTs version 1.0). Our 0.5° × 0.5° resolution model ensemble is based on satellite-derived surface water extent and precipitation reanalyses, nine heterotrophic respiration simulations (eight carbon cycle models and a data-constrained terrestrial carbon cycle analysis) and three temperature dependence parameterizations for the period 2009-2010; an extended ensemble subset based solely on precipitation and the data-constrained terrestrial carbon cycle analysis is derived for the period 2001-2015. We incorporate the mean of the full and extended model ensembles into GEOS-Chem and compare the model against surface measurements of atmospheric CH4; the model performance (site-level and zonal mean anomaly residuals) compares favourably against published wetland CH4 emissions scenarios. We find that uncertainties in carbon decomposition rates and the wetland extent together account for more than 80 % of the dominant uncertainty in the timing, magnitude and seasonal variability in wetland CH4 emissions, although uncertainty in the temperature CH4 : C dependence is a significant contributor to seasonal variations in mid-latitude wetland CH4 emissions. The combination of satellite, carbon cycle models and temperature dependence parameterizations provides a physically informed structural a priori uncertainty that is critical for top-down estimates of wetland CH4 fluxes. Specifically, our ensemble can provide enhanced information on the prior CH4 emission uncertainty and the error covariance structure, as well as a means for using posterior flux estimates and their uncertainties to quantitatively constrain the biogeochemical process controls of global wetland CH4 emissions.
Quantitative validation of an air-coupled ultrasonic probe model by Interferometric laser tomography
NASA Astrophysics Data System (ADS)
Revel, G. M.; Pandarese, G.; Cavuto, A.
2012-06-01
The present paper describes the quantitative validation of a finite element (FE) model of the ultrasound beam generated by an air coupled non-contact ultrasound transducer. The model boundary conditions are given by vibration velocities measured by laser vibrometry on the probe membrane. The proposed validation method is based on the comparison between the simulated 3D pressure field and the pressure data measured with interferometric laser tomography technique. The model details and the experimental techniques are described in paper. The analysis of results shows the effectiveness of the proposed approach and the possibility to quantitatively assess and predict the generated acoustic pressure field, with maximum discrepancies in the order of 20% due to uncertainty effects. This step is important for determining in complex problems the real applicability of air-coupled probes and for the simulation of the whole inspection procedure, also when the component is designed, so as to virtually verify its inspectability.
Quantitative Rainbow Schlieren Deflectometry as a Temperature Diagnostic for Spherical Flames
NASA Technical Reports Server (NTRS)
Feikema, Douglas A.
2004-01-01
Numerical analysis and experimental results are presented to define a method for quantitatively measuring the temperature distribution of a spherical diffusion flame using Rainbow Schlieren Deflectometry in microgravity. First, a numerical analysis is completed to show the method can suitably determine temperature in the presence of spatially varying species composition. Also, a numerical forward-backward inversion calculation is presented to illustrate the types of calculations and deflections to be encountered. Lastly, a normal gravity demonstration of temperature measurement in an axisymmetric laminar, diffusion flame using Rainbow Schlieren deflectometry is presented. The method employed in this paper illustrates the necessary steps for the preliminary design of a Schlieren system. The largest deflections for the normal gravity flame considered in this paper are 7.4 x 10(-4) radians which can be accurately measured with 2 meter focal length collimating and decollimating optics. The experimental uncertainty of deflection is less than 5 x 10(-5) radians.
Liu, Shu-Yu; Hu, Chang-Qin
2007-10-17
This study introduces the general method of quantitative nuclear magnetic resonance (qNMR) for the calibration of reference standards of macrolide antibiotics. Several qNMR experimental conditions were optimized including delay, which is an important parameter of quantification. Three kinds of macrolide antibiotics were used to validate the accuracy of the qNMR method by comparison with the results obtained by the high performance liquid chromatography (HPLC) method. The purities of five common reference standards of macrolide antibiotics were measured by the 1H qNMR method and the mass balance method, respectively. The analysis results of the two methods were compared. The qNMR is quick and simple to use. In a new medicine research and development process, qNMR provides a new and reliable method for purity analysis of the reference standard.
Williams, Richard M.; Aalseth, C. E.; Brandenberger, J. M.; ...
2017-02-17
Here, this paper describes the generation of 39Ar, via reactor irradiation of potassium carbonate, followed by quantitative analysis (length-compensated proportional counting) to yield two calibration standards that are respectively 50 and 3 times atmospheric background levels. Measurements were performed in Pacific Northwest National Laboratory's shallow underground counting laboratory studying the effect of gas density on beta-transport; these results are compared with simulation. The total expanded uncertainty of the specific activity for the ~50 × 39Ar in P10 standard is 3.6% (k=2).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Richard M.; Aalseth, C. E.; Brandenberger, J. M.
Here, this paper describes the generation of 39Ar, via reactor irradiation of potassium carbonate, followed by quantitative analysis (length-compensated proportional counting) to yield two calibration standards that are respectively 50 and 3 times atmospheric background levels. Measurements were performed in Pacific Northwest National Laboratory's shallow underground counting laboratory studying the effect of gas density on beta-transport; these results are compared with simulation. The total expanded uncertainty of the specific activity for the ~50 × 39Ar in P10 standard is 3.6% (k=2).
NASA Technical Reports Server (NTRS)
Cucinotta, Francis; Badhwar, Gautam; Saganti, Premkumar; Schimmerling, Walter; Wilson, John; Peterson, Leif; Dicello, John
2002-01-01
In this paper we discuss expected lifetime excess cancer risks for astronauts returning from exploration class missions. For the first time we make a quantitative assessment of uncertainties in cancer risk projections for space radiation exposures. Late effects from the high charge and energy (HZE) ions present in the galactic cosmic rays including cancer and the poorly understood risks to the central nervous system constitute the major risks. Methods used to project risk in low Earth orbit are seen as highly uncertain for projecting risks on exploration missions because of the limited radiobiology data available for estimating HZE ion risks. Cancer risk projections are described as a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Monte-Carlo sampling from subjective error distributions represents the lack of knowledge in each factor to quantify risk projection overall uncertainty. Cancer risk analysis is applied to several exploration mission scenarios. At solar minimum, the number of days in space where career risk of less than the limiting 3% excess cancer mortality can be assured at a 95% confidence level is found to be only of the order of 100 days.
NASA Astrophysics Data System (ADS)
Zamri, Nurnadiah; Abdullah, Lazim
2014-06-01
Flood control project is a complex issue which takes economic, social, environment and technical attributes into account. Selection of the best flood control project requires the consideration of conflicting quantitative and qualitative evaluation criteria. When decision-makers' judgment are under uncertainty, it is relatively difficult for them to provide exact numerical values. The interval type-2 fuzzy set (IT2FS) is a strong tool which can deal with the uncertainty case of subjective, incomplete, and vague information. Besides, it helps to solve for some situations where the information about criteria weights for alternatives is completely unknown. Therefore, this paper is adopted the information interval type-2 entropy concept into the weighting process of interval type-2 fuzzy TOPSIS. This entropy weight is believed can effectively balance the influence of uncertainty factors in evaluating attribute. Then, a modified ranking value is proposed in line with the interval type-2 entropy weight. Quantitative and qualitative factors that normally linked with flood control project are considered for ranking. Data in form of interval type-2 linguistic variables were collected from three authorised personnel of three Malaysian Government agencies. Study is considered for the whole of Malaysia. From the analysis, it shows that diversion scheme yielded the highest closeness coefficient at 0.4807. A ranking can be drawn using the magnitude of closeness coefficient. It was indicated that the diversion scheme recorded the first rank among five causes.
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Behroozi, Peter S.; Wechsler, Risa H.; Conroy, Charlie
2010-07-01
We conduct a comprehensive analysis of the relationship between central galaxies and their host dark matter halos, as characterized by the stellar mass-halo mass (SM-HM) relation, with rigorous consideration of uncertainties. Our analysis focuses on results from the abundance matching technique, which assumes that every dark matter halo or subhalo above a specific mass threshold hosts one galaxy. We provide a robust estimate of the SM-HM relation for 0 < z < 1 and discuss the quantitative effects of uncertainties in observed galaxy stellar mass functions (including stellar mass estimates and counting uncertainties), halo mass functions (including cosmology and uncertaintiesmore » from substructure), and the abundance matching technique used to link galaxies to halos (including scatter in this connection). Our analysis results in a robust estimate of the SM-HM relation and its evolution from z = 0 to z = 4. The shape and the evolution are well constrained for z < 1. The largest uncertainties at these redshifts are due to stellar mass estimates (0.25 dex uncertainty in normalization); however, failure to account for scatter in stellar masses at fixed halo mass can lead to errors of similar magnitude in the SM-HM relation for central galaxies in massive halos. We also investigate the SM-HM relation to z = 4, although the shape of the relation at higher redshifts remains fairly unconstrained when uncertainties are taken into account. We find that the integrated star formation at a given halo mass peaks at 10%-20% of available baryons for all redshifts from 0 to 4. This peak occurs at a halo mass of 7 x 10{sup 11} M{sub sun} at z = 0 and this mass increases by a factor of 5 to z = 4. At lower and higher masses, star formation is substantially less efficient, with stellar mass scaling as M{sub *} {approx} M {sup 2.3}{sub h} at low masses and M{sub *} {approx} M {sup 0.29}{sub h} at high masses. The typical stellar mass for halos with mass less than 10{sup 12} M{sub sun} has increased by 0.3-0.45 dex for halos since z {approx} 1. These results will provide a powerful tool to inform galaxy evolution models.« less
Toward a probabilistic acoustic emission source location algorithm: A Bayesian approach
NASA Astrophysics Data System (ADS)
Schumacher, Thomas; Straub, Daniel; Higgins, Christopher
2012-09-01
Acoustic emissions (AE) are stress waves initiated by sudden strain releases within a solid body. These can be caused by internal mechanisms such as crack opening or propagation, crushing, or rubbing of crack surfaces. One application for the AE technique in the field of Structural Engineering is Structural Health Monitoring (SHM). With piezo-electric sensors mounted to the surface of the structure, stress waves can be detected, recorded, and stored for later analysis. An important step in quantitative AE analysis is the estimation of the stress wave source locations. Commonly, source location results are presented in a rather deterministic manner as spatial and temporal points, excluding information about uncertainties and errors. Due to variability in the material properties and uncertainty in the mathematical model, measures of uncertainty are needed beyond best-fit point solutions for source locations. This paper introduces a novel holistic framework for the development of a probabilistic source location algorithm. Bayesian analysis methods with Markov Chain Monte Carlo (MCMC) simulation are employed where all source location parameters are described with posterior probability density functions (PDFs). The proposed methodology is applied to an example employing data collected from a realistic section of a reinforced concrete bridge column. The selected approach is general and has the advantage that it can be extended and refined efficiently. Results are discussed and future steps to improve the algorithm are suggested.
NASA Astrophysics Data System (ADS)
Fernández-Ruiz, Ramón; Friedrich K., E. Josue; Redrejo, M. J.
2018-02-01
The main goal of this work was to investigate, in a systematic way, the influence of the controlled modulation of the particle size distribution of a representative solid sample with respect to the more relevant analytical parameters of the Direct Solid Analysis (DSA) by Total-reflection X-Ray Fluorescence (TXRF) quantitative method. In particular, accuracy, uncertainty, linearity and detection limits were correlated with the main parameters of their size distributions for the following elements; Al, Si, P, S, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, As, Se, Rb, Sr, Ba and Pb. In all cases strong correlations were finded. The main conclusion of this work can be resumed as follows; the modulation of particles shape to lower average sizes next to a minimization of the width of particle size distributions, produce a strong increment of accuracy, minimization of uncertainties and limit of detections for DSA-TXRF methodology. These achievements allow the future use of the DSA-TXRF analytical methodology for development of ISO norms and standardized protocols for the direct analysis of solids by mean of TXRF.
Quantifying Uncertainty in Near Surface Electromagnetic Imaging Using Bayesian Methods
NASA Astrophysics Data System (ADS)
Blatter, D. B.; Ray, A.; Key, K.
2017-12-01
Geoscientists commonly use electromagnetic methods to image the Earth's near surface. Field measurements of EM fields are made (often with the aid an artificial EM source) and then used to infer near surface electrical conductivity via a process known as inversion. In geophysics, the standard inversion tool kit is robust and can provide an estimate of the Earth's near surface conductivity that is both geologically reasonable and compatible with the measured field data. However, standard inverse methods struggle to provide a sense of the uncertainty in the estimate they provide. This is because the task of finding an Earth model that explains the data to within measurement error is non-unique - that is, there are many, many such models; but the standard methods provide only one "answer." An alternative method, known as Bayesian inversion, seeks to explore the full range of Earth model parameters that can adequately explain the measured data, rather than attempting to find a single, "ideal" model. Bayesian inverse methods can therefore provide a quantitative assessment of the uncertainty inherent in trying to infer near surface conductivity from noisy, measured field data. This study applies a Bayesian inverse method (called trans-dimensional Markov chain Monte Carlo) to transient airborne EM data previously collected over Taylor Valley - one of the McMurdo Dry Valleys in Antarctica. Our results confirm the reasonableness of previous estimates (made using standard methods) of near surface conductivity beneath Taylor Valley. In addition, we demonstrate quantitatively the uncertainty associated with those estimates. We demonstrate that Bayesian inverse methods can provide quantitative uncertainty to estimates of near surface conductivity.
Case studies in Bayesian microbial risk assessments.
Kennedy, Marc C; Clough, Helen E; Turner, Joanne
2009-12-21
The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0,11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0,11). In the second case study the effective number of inputs was reduced from 30 to 7 in the screening stage, and just 2 inputs were found to explain 82.8% of the output variance. A combined total of 500 runs of the computer code were used. These case studies illustrate the use of Bayesian statistics to perform detailed uncertainty and sensitivity analyses, integrating multiple information sources in a way that is both rigorous and efficient.
Effects of low-dose prenatal irradiation on the central nervous system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-04-01
Scientists are in general agreement about the effects of prenatal irradiation, including those affecting the central nervous system (CNS). Differing concepts and research approaches have resulted in some uncertainties about some quantitative relationships, underlying interpretations, and conclusions. Examples of uncertainties include the existence of a threshold, the quantitative relationships between prenatal radiation doses and resulting physical and functional lesions, and processes by which lesions originate and develop. A workshop was convened in which scientists with varying backgrounds and viewpoints discussed these relationships and explored ways in which various disciplines could coordinate concepts and methodologies to suggest research directions for resolvingmore » uncertainties. This Workshop Report summarizes, in an extended fashion, salient features of the presentations on the current status of our knowledge about the radiobiology and neuroscience of prenatal irradiation and the relationships between them.« less
Rotorcraft control system design for uncertain vehicle dynamics using quantitative feedback theory
NASA Technical Reports Server (NTRS)
Hess, R. A.
1994-01-01
Quantitative Feedback Theory describes a frequency-domain technique for the design of multi-input, multi-output control systems which must meet time or frequency domain performance criteria when specified uncertainty exists in the linear description of the vehicle dynamics. This theory is applied to the design of the longitudinal flight control system for a linear model of the BO-105C rotorcraft. Uncertainty in the vehicle model is due to the variation in the vehicle dynamics over a range of airspeeds from 0-100 kts. For purposes of exposition, the vehicle description contains no rotor or actuator dynamics. The design example indicates the manner in which significant uncertainty exists in the vehicle model. The advantage of using a sequential loop closure technique to reduce the cost of feedback is demonstrated by example.
Effects of low-dose prenatal irradiation on the central nervous system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Scientists are in general agreement about the effects of prenatal irradiation, including those affecting the central nervous system (CNS). Differing concepts and research approaches have resulted in some uncertainties about some quantitative relationships, underlying interpretations, and conclusions. Examples of uncertainties include the existence of a threshold, the quantitative relationships between prenatal radiation doses and resulting physical and functional lesions, and processes by which lesions originate and develop. A workshop was convened in which scientists with varying backgrounds and viewpoints discussed these relationships and explored ways in which various disciplines could coordinate concepts and methodologies to suggest research directions for resolvingmore » uncertainties. This Workshop Report summarizes, in an extended fashion, salient features of the presentations on the current status of our knowledge about the radiobiology and neuroscience of prenatal irradiation and the relationships between them.« less
NASA Astrophysics Data System (ADS)
Puechberty, Rachel; Bechon, Pierre-Marie; Le Coz, Jérôme; Renard, Benjamin
2015-04-01
The French national hydrological services (NHS) manage the production of streamflow time series throughout the national territory. The hydrological data are made available to end-users through different web applications and the national hydrological archive (Banque Hydro). Providing end-users with qualitative and quantitative information on the uncertainty of the hydrological data is key to allow them drawing relevant conclusions and making appropriate decisions. Due to technical and organisational issues that are specific to the field of hydrometry, quantifying the uncertainty of hydrological measurements is still challenging and not yet standardized. The French NHS have made progress on building a consistent strategy to assess the uncertainty of their streamflow data. The strategy consists of addressing the uncertainties produced and propagated at each step of the data production with uncertainty analysis tools that are compatible with each other and compliant with international uncertainty guidance and standards. Beyond the necessary research and methodological developments, operational software tools and procedures are absolutely necessary to the data management and uncertainty analysis by field hydrologists. A first challenge is to assess, and if possible reduce, the uncertainty of streamgauging data, i.e. direct stage-discharge measurements. Interlaboratory experiments proved to be a very efficient way to empirically measure the uncertainty of a given streamgauging technique in given measurement conditions. The Q+ method (Le Coz et al., 2012) was developed to improve the uncertainty propagation method proposed in the ISO748 standard for velocity-area gaugings. Both empirical or computed (with Q+) uncertainty values can now be assigned in BAREME, which is the software used by the French NHS for managing streamgauging measurements. A second pivotal step is to quantify the uncertainty related to stage-discharge rating curves and their application to water level records to produce continuous discharge time series. The management of rating curves is also done using BAREME. The BaRatin method (Le Coz et al., 2014) was developed as a Bayesian approach of rating curve development and uncertainty analysis. Since BaRatin accounts for the individual uncertainties of gauging data used to build the rating curve, it was coupled with BAREME. The BaRatin method is still undergoing development and research, in particular to address non univocal or time-varying stage-discharge relations, due to hysteresis, variable backwater, rating shifts, etc. A new interface including new options is under development. The next steps are now to propagate the uncertainties of water level records, through uncertain rating curves, up to discharge time series and derived variables (e.g. annual mean flow) and statistics (e.g. flood quantiles). Bayesian tools are already available for both tasks but further validation and development is necessary for their integration in the operational data workflow of the French NHS. References Le Coz, J., Camenen, B., Peyrard, X., Dramais, G., 2012. Uncertainty in open-channel discharges measured with the velocity-area method. Flow Measurement and Instrumentation 26, 18-29. Le Coz, J., Renard, B., Bonnifait, L., Branger, F., Le Boursicaud, R., 2014. Combining hydraulic knowledge and uncertain gaugings in the estimation of hydrometric rating curves: a Bayesian approach, Journal of Hydrology, 509, 573-587.
Resolving dust emission responses to land cover change using an ecological land classification
USDA-ARS?s Scientific Manuscript database
Despite efforts to quantify the impacts of land cover change on wind erosion, assessment uncertainty remains large. We address this uncertainty by evaluating the application of ecological site concepts and state-and-transition models (STMs) for detecting and quantitatively describing the impacts of ...
Propagating uncertainty from hydrology into human health risk assessment
NASA Astrophysics Data System (ADS)
Siirila, E. R.; Maxwell, R. M.
2013-12-01
Hydro-geologic modeling and uncertainty assessment of flow and transport parameters can be incorporated into human health risk (both cancer and non-cancer) assessment to better understand the associated uncertainties. This interdisciplinary approach is needed now more than ever as societal problems concerning water quality are increasingly interdisciplinary as well. For example, uncertainty can originate from environmental conditions such as a lack of information or measurement error, or can manifest as variability, such as differences in physiological and exposure parameters between individuals. To complicate the matter, traditional risk assessment methodologies are independent of time, virtually neglecting any temporal dependence. Here we present not only how uncertainty and variability can be incorporated into a risk assessment, but also how time dependent risk assessment (TDRA) allows for the calculation of risk as a function of time. The development of TDRA and the inclusion of quantitative risk analysis in this research provide a means to inform decision makers faced with water quality issues and challenges. The stochastic nature of this work also provides a means to address the question of uncertainty in management decisions, a component that is frequently difficult to quantify. To illustrate this new formulation and to investigate hydraulic mechanisms for sensitivity, an example of varying environmental concentration signals resulting from rate dependencies in geochemical reactions is used. Cancer risk is computed and compared using environmental concentration ensembles modeled with sorption as 1) a linear equilibrium assumption and 2) first order kinetics. Results show that the up scaling of these small-scale processes controls the distribution, magnitude, and associated uncertainty of cancer risk.
Aiding alternatives assessment with an uncertainty-tolerant hazard scoring method.
Faludi, Jeremy; Hoang, Tina; Gorman, Patrick; Mulvihill, Martin
2016-11-01
This research developed a single-score system to simplify and clarify decision-making in chemical alternatives assessment, accounting for uncertainty. Today, assessing alternatives to hazardous constituent chemicals is a difficult task-rather than comparing alternatives by a single definitive score, many independent toxicological variables must be considered at once, and data gaps are rampant. Thus, most hazard assessments are only comprehensible to toxicologists, but business leaders and politicians need simple scores to make decisions. In addition, they must balance hazard against other considerations, such as product functionality, and they must be aware of the high degrees of uncertainty in chemical hazard data. This research proposes a transparent, reproducible method to translate eighteen hazard endpoints into a simple numeric score with quantified uncertainty, alongside a similar product functionality score, to aid decisions between alternative products. The scoring method uses Clean Production Action's GreenScreen as a guide, but with a different method of score aggregation. It provides finer differentiation between scores than GreenScreen's four-point scale, and it displays uncertainty quantitatively in the final score. Displaying uncertainty also illustrates which alternatives are early in product development versus well-defined commercial products. This paper tested the proposed assessment method through a case study in the building industry, assessing alternatives to spray polyurethane foam insulation containing methylene diphenyl diisocyanate (MDI). The new hazard scoring method successfully identified trade-offs between different alternatives, showing finer resolution than GreenScreen Benchmarking. Sensitivity analysis showed that different weighting schemes in hazard scores had almost no effect on alternatives ranking, compared to uncertainty from data gaps. Copyright © 2016 Elsevier Ltd. All rights reserved.
Pouillot, Régis; Delignette-Muller, Marie Laure
2010-09-01
Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zolnai, Z.; Toporkov, M.; Volk, J.; Demchenko, D. O.; Okur, S.; Szabó, Z.; Özgür, Ü.; Morkoç, H.; Avrutin, V.; Kótai, E.
2015-02-01
The atomic composition with less than 1-2 atom% uncertainty was measured in ternary BeZnO and quaternary BeMgZnO alloys using a combination of nondestructive Rutherford backscattering spectrometry with 1 MeV He+ analyzing ion beam and non-Rutherford elastic backscattering experiments with 2.53 MeV energy protons. An enhancement factor of 60 in the cross-section of Be for protons has been achieved to monitor Be atomic concentrations. Usually the quantitative analysis of BeZnO and BeMgZnO systems is challenging due to difficulties with appropriate experimental tools for the detection of the light Be element with satisfactory accuracy. As it is shown, our applied ion beam technique, supported with the detailed simulation of ion stopping, backscattering, and detection processes allows of quantitative depth profiling and compositional analysis of wurtzite BeZnO/ZnO/sapphire and BeMgZnO/ZnO/sapphire layer structures with low uncertainty for both Be and Mg. In addition, the excitonic bandgaps of the layers were deduced from optical transmittance measurements. To augment the measured compositions and bandgaps of BeO and MgO co-alloyed ZnO layers, hybrid density functional bandgap calculations were performed with varying the Be and Mg contents. The theoretical vs. experimental bandgaps show linear correlation in the entire bandgap range studied from 3.26 eV to 4.62 eV. The analytical method employed should help facilitate bandgap engineering for potential applications, such as solar blind UV photodetectors and heterostructures for UV emitters and intersubband devices.
Are quantitative sensitivity analysis methods always reliable?
NASA Astrophysics Data System (ADS)
Huang, X.
2016-12-01
Physical parameterizations developed to represent subgrid-scale physical processes include various uncertain parameters, leading to large uncertainties in today's Earth System Models (ESMs). Sensitivity Analysis (SA) is an efficient approach to quantitatively determine how the uncertainty of the evaluation metric can be apportioned to each parameter. Also, SA can identify the most influential parameters, as a result to reduce the high dimensional parametric space. In previous studies, some SA-based approaches, such as Sobol' and Fourier amplitude sensitivity testing (FAST), divide the parameters into sensitive and insensitive groups respectively. The first one is reserved but the other is eliminated for certain scientific study. However, these approaches ignore the disappearance of the interactive effects between the reserved parameters and the eliminated ones, which are also part of the total sensitive indices. Therefore, the wrong sensitive parameters might be identified by these traditional SA approaches and tools. In this study, we propose a dynamic global sensitivity analysis method (DGSAM), which iteratively removes the least important parameter until there are only two parameters left. We use the CLM-CASA, a global terrestrial model, as an example to verify our findings with different sample sizes ranging from 7000 to 280000. The result shows DGSAM has abilities to identify more influential parameters, which is confirmed by parameter calibration experiments using four popular optimization methods. For example, optimization using Top3 parameters filtered by DGSAM could achieve substantial improvement against Sobol' by 10%. Furthermore, the current computational cost for calibration has been reduced to 1/6 of the original one. In future, it is necessary to explore alternative SA methods emphasizing parameter interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koeylue, U.O.
1997-05-01
An in situ particulate diagnostic/analysis technique is outlined based on the Rayleigh-Debye-Gans polydisperse fractal aggregate (RDG/PFA) scattering interpretation of absolute angular light scattering and extinction measurements. Using proper particle refractive index, the proposed data analysis method can quantitatively yield all aggregate parameters (particle volume fraction, f{sub v}, fractal dimension, D{sub f}, primary particle diameter, d{sub p}, particle number density, n{sub p}, and aggregate size distribution, pdf(N)) without any prior knowledge about the particle-laden environment. The present optical diagnostic/interpretation technique was applied to two different soot-containing laminar and turbulent ethylene/air nonpremixed flames in order to assess its reliability. The aggregate interpretationmore » of optical measurements yielded D{sub f}, d{sub p}, and pdf(N) that are in excellent agreement with ex situ thermophoretic sampling/transmission electron microscope (TS/TEM) observations within experimental uncertainties. However, volume-equivalent single particle models (Rayleigh/Mie) overestimated d{sub p} by about a factor of 3, causing an order of magnitude underestimation in n{sub p}. Consequently, soot surface areas and growth rates were in error by a factor of 3, emphasizing that aggregation effects need to be taken into account when using optical diagnostics for a reliable understanding of soot formation/evolution mechanism in flames. The results also indicated that total soot emissivities were generally underestimated using Rayleigh analysis (up to 50%), mainly due to the uncertainties in soot refractive indices at infrared wavelengths. This suggests that aggregate considerations may not be essential for reasonable radiation heat transfer predictions from luminous flames because of fortuitous error cancellation, resulting in typically a 10 to 30% net effect.« less
Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy
2016-01-07
Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.
A Mathematical Model and Algorithm for Routing Air Traffic Under Weather Uncertainty
NASA Technical Reports Server (NTRS)
Sadovsky, Alexander V.
2016-01-01
A central challenge in managing today's commercial en route air traffic is the task of routing the aircraft in the presence of adverse weather. Such weather can make regions of the airspace unusable, so all affected flights must be re-routed. Today this task is carried out by conference and negotiation between human air traffic controllers (ATC) responsible for the involved sectors of the airspace. One can argue that, in so doing, ATC try to solve an optimization problem without giving it a precise quantitative formulation. Such a formulation gives the mathematical machinery for constructing and verifying algorithms that are aimed at solving the problem. This paper contributes one such formulation and a corresponding algorithm. The algorithm addresses weather uncertainty and has closed form, which allows transparent analysis of correctness, realism, and computational costs.
ERIC Educational Resources Information Center
Kim, Ho Sung
2013-01-01
A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final…
Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...
Biedermann, Maurus; Grob, Koni
2012-09-14
Mineral oil hydrocarbons are complex as well as varying mixtures and produce correspondingly complex chromatograms (on-line HPLC-GC-FID as described in Part 1): mostly humps of unresolved components are obtained, sometimes with sharp peaks on top. Chromatograms may also contain peaks of hydrocarbons from other sources which need to be subtracted from the mineral oil components. The review focuses on the interpretation and integration of chromatograms related to food contamination by mineral oil from paperboard boxes (off-set printing inks and recycled fibers), if possible distinguishing between various sources of mineral oil. Typical chromatograms are shown for relevant components and interferences as well as food samples encountered on the market. Details are pointed out which may provide relevant information. Integration is shown for examples of paperboard packaging materials as well as various foods. Finally the uncertainty of the analysis and limit of quantitation are discussed for specific examples. They primarily result from the interpretation of the chromatogram, manually placing the baseline and cuts for taking off extraneous components. Without previous enrichment, the limit of quantitation is between around 0.1 mg/kg for foods with a low fat content and 2.5 mg/kg for fats and oils. The measurement uncertainty can be kept clearly below 20% for most samples. Copyright © 2012 Elsevier B.V. All rights reserved.
Schmidt, Philip J; Pintar, Katarina D M; Fazil, Aamir M; Topp, Edward
2013-09-01
Dose-response models are the essential link between exposure assessment and computed risk values in quantitative microbial risk assessment, yet the uncertainty that is inherent to computed risks because the dose-response model parameters are estimated using limited epidemiological data is rarely quantified. Second-order risk characterization approaches incorporating uncertainty in dose-response model parameters can provide more complete information to decisionmakers by separating variability and uncertainty to quantify the uncertainty in computed risks. Therefore, the objective of this work is to develop procedures to sample from posterior distributions describing uncertainty in the parameters of exponential and beta-Poisson dose-response models using Bayes's theorem and Markov Chain Monte Carlo (in OpenBUGS). The theoretical origins of the beta-Poisson dose-response model are used to identify a decomposed version of the model that enables Bayesian analysis without the need to evaluate Kummer confluent hypergeometric functions. Herein, it is also established that the beta distribution in the beta-Poisson dose-response model cannot address variation among individual pathogens, criteria to validate use of the conventional approximation to the beta-Poisson model are proposed, and simple algorithms to evaluate actual beta-Poisson probabilities of infection are investigated. The developed MCMC procedures are applied to analysis of a case study data set, and it is demonstrated that an important region of the posterior distribution of the beta-Poisson dose-response model parameters is attributable to the absence of low-dose data. This region includes beta-Poisson models for which the conventional approximation is especially invalid and in which many beta distributions have an extreme shape with questionable plausibility. © Her Majesty the Queen in Right of Canada 2013. Reproduced with the permission of the Minister of the Public Health Agency of Canada.
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.
1988-01-01
The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, and fully correlated. The magnitude of these uncertainties were represented in terms of mean and variance. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, was evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of magnitudes of the response and probability of occurrence of a given response. Most importantly, these distributions provide the information needed to estimate quantitatively the risk in a structural design.
A Monte-Carlo game theoretic approach for Multi-Criteria Decision Making under uncertainty
NASA Astrophysics Data System (ADS)
Madani, Kaveh; Lund, Jay R.
2011-05-01
Game theory provides a useful framework for studying Multi-Criteria Decision Making problems. This paper suggests modeling Multi-Criteria Decision Making problems as strategic games and solving them using non-cooperative game theory concepts. The suggested method can be used to prescribe non-dominated solutions and also can be used as a method to predict the outcome of a decision making problem. Non-cooperative stability definitions for solving the games allow consideration of non-cooperative behaviors, often neglected by other methods which assume perfect cooperation among decision makers. To deal with the uncertainty in input variables a Monte-Carlo Game Theory (MCGT) approach is suggested which maps the stochastic problem into many deterministic strategic games. The games are solved using non-cooperative stability definitions and the results include possible effects of uncertainty in input variables on outcomes. The method can handle multi-criteria multi-decision-maker problems with uncertainty. The suggested method does not require criteria weighting, developing a compound decision objective, and accurate quantitative (cardinal) information as it simplifies the decision analysis by solving problems based on qualitative (ordinal) information, reducing the computational burden substantially. The MCGT method is applied to analyze California's Sacramento-San Joaquin Delta problem. The suggested method provides insights, identifies non-dominated alternatives, and predicts likely decision outcomes.
Regional crop yield forecasting: a probabilistic approach
NASA Astrophysics Data System (ADS)
de Wit, A.; van Diepen, K.; Boogaard, H.
2009-04-01
Information on the outlook on yield and production of crops over large regions is essential for government services dealing with import and export of food crops, for agencies with a role in food relief, for international organizations with a mandate in monitoring the world food production and trade, and for commodity traders. Process-based mechanistic crop models are an important tool for providing such information, because they can integrate the effect of crop management, weather and soil on crop growth. When properly integrated in a yield forecasting system, the aggregated model output can be used to predict crop yield and production at regional, national and continental scales. Nevertheless, given the scales at which these models operate, the results are subject to large uncertainties due to poorly known weather conditions and crop management. Current yield forecasting systems are generally deterministic in nature and provide no information about the uncertainty bounds on their output. To improve on this situation we present an ensemble-based approach where uncertainty bounds can be derived from the dispersion of results in the ensemble. The probabilistic information provided by this ensemble-based system can be used to quantify uncertainties (risk) on regional crop yield forecasts and can therefore be an important support to quantitative risk analysis in a decision making process.
Fitts Cochrane, Jean; Lonsdorf, Eric; Allison, Taber D; Sanders-Reed, Carol A
2015-09-01
Challenges arise when renewable energy development triggers "no net loss" policies for protected species, such as where wind energy facilities affect Golden Eagles in the western United States. When established mitigation approaches are insufficient to fully avoid or offset losses, conservation goals may still be achievable through experimental implementation of unproven mitigation methods provided they are analyzed within a framework that deals transparently and rigorously with uncertainty. We developed an approach to quantify and analyze compensatory mitigation that (1) relies on expert opinion elicited in a thoughtful and structured process to design the analysis (models) and supplement available data, (2) builds computational models as hypotheses about cause-effect relationships, (3) represents scientific uncertainty in stochastic model simulations, (4) provides probabilistic predictions of "relative" mortality with and without mitigation, (5) presents results in clear formats useful to applying risk management preferences (regulatory standards) and selecting strategies and levels of mitigation for immediate action, and (6) defines predictive parameters in units that could be monitored effectively, to support experimental adaptive management and reduction in uncertainty. We illustrate the approach with a case study characterized by high uncertainty about underlying biological processes and high conservation interest: estimating the quantitative effects of voluntary strategies to abate lead poisoning in Golden Eagles in Wyoming due to ingestion of spent game hunting ammunition.
NASA Astrophysics Data System (ADS)
D'Emilia, G.; Di Gasbarro, D.; Gaspari, A.; Natale, E.
2015-11-01
A methodology is proposed assuming high-level Energy Performance Indicators (EnPIs) uncertainty as quantitative indicator of the evolution of an Energy Management System (EMS). Motivations leading to the selection of the EnPIs, uncertainty evaluation techniques and criteria supporting decision-making are discussed, in order to plan and pursue reliable measures for energy performance improvement. In this paper, problems, priorities, operative possibilities and reachable improvement limits are examined, starting from the measurement uncertainty assessment. Two different industrial cases are analysed with reference to the following aspects: absence/presence of energy management policy and action plans; responsibility level for the energy issues; employees’ training and motivation in respect of the energy problems; absence/presence of adequate infrastructures for monitoring and sharing of energy information; level of standardization and integration of methods and procedures linked to the energy activities; economic and financial resources for the improvement of energy efficiency. A critic and comparative analysis of the obtained results is realized. The methodology, experimentally validated, allows developing useful considerations for effective, realistic and economically feasible improvement plans, depending on the specific situation. Recursive application of the methodology allows getting reliable and resolved assessment of the EMS status, also in dynamic industrial contexts.
Duewer, D L; Lalonde, S A; Aubin, R A; Fourney, R M; Reeder, D J
1998-05-01
Knowledge of the expected uncertainty in restriction fragment length polymorphism (RFLP) measurements is required for confident exchange of such data among different laboratories. The total measurement uncertainty among all Technical Working Group for DNA Analysis Methods laboratories has previously been characterized and found to be acceptably small. Casework cell line control measurements provided by six Royal Canadian Mounted Police (RCMP) and 30 U.S. commercial, local, state, and Federal forensic laboratories enable quantitative determination of the within-laboratory precision and among-laboratory concordance components of measurement uncertainty typical of both sets of laboratories. Measurement precision is the same in the two countries for DNA fragments of size 1000 base pairs (bp) to 10,000 bp. However, the measurement concordance among the RCMP laboratories is clearly superior to that within the U.S. forensic community. This result is attributable to the use of a single analytical protocol in all RCMP laboratories. Concordance among U.S. laboratories cannot be improved through simple mathematical adjustments. Community-wide efforts focused on improved concordance may be the most efficient mechanism for further reduction of among-laboratory RFLP measurement uncertainty, should the resources required to fully evaluate potential cross-jurisdictional matches become burdensome as the number of RFLP profiles on record increases.
Understanding Climate Uncertainty with an Ocean Focus
NASA Astrophysics Data System (ADS)
Tokmakian, R. T.
2009-12-01
Uncertainty in climate simulations arises from various aspects of the end-to-end process of modeling the Earth’s climate. First, there is uncertainty from the structure of the climate model components (e.g. ocean/ice/atmosphere). Even the most complex models are deficient, not only in the complexity of the processes they represent, but in which processes are included in a particular model. Next, uncertainties arise from the inherent error in the initial and boundary conditions of a simulation. Initial conditions are the state of the weather or climate at the beginning of the simulation and other such things, and typically come from observations. Finally, there is the uncertainty associated with the values of parameters in the model. These parameters may represent physical constants or effects, such as ocean mixing, or non-physical aspects of modeling and computation. The uncertainty in these input parameters propagates through the non-linear model to give uncertainty in the outputs. The models in 2020 will no doubt be better than today’s models, but they will still be imperfect, and development of uncertainty analysis technology is a critical aspect of understanding model realism and prediction capability. Smith [2002] and Cox and Stephenson [2007] discuss the need for methods to quantify the uncertainties within complicated systems so that limitations or weaknesses of the climate model can be understood. In making climate predictions, we need to have available both the most reliable model or simulation and a methods to quantify the reliability of a simulation. If quantitative uncertainty questions of the internal model dynamics are to be answered with complex simulations such as AOGCMs, then the only known path forward is based on model ensembles that characterize behavior with alternative parameter settings [e.g. Rougier, 2007]. The relevance and feasibility of using "Statistical Analysis of Computer Code Output" (SACCO) methods for examining uncertainty in ocean circulation due to parameter specification will be described and early results using the ocean/ice components of the CCSM climate model in a designed experiment framework will be shown. Cox, P. and D. Stephenson, Climate Change: A Changing Climate for Prediction, 2007, Science 317 (5835), 207, DOI: 10.1126/science.1145956. Rougier, J. C., 2007: Probabilistic Inference for Future Climate Using an Ensemble of Climate Model Evaluations, Climatic Change, 81, 247-264. Smith L., 2002, What might we learn from climate forecasts? Proc. Nat’l Academy of Sciences, Vol. 99, suppl. 1, 2487-2492 doi:10.1073/pnas.012580599.
Winkler, Peter; Zurl, Brigitte; Guss, Helmuth; Kindl, Peter; Stuecklschweiger, Georg
2005-02-21
A system for dosimetric verification of intensity-modulated radiotherapy (IMRT) treatment plans using absolute calibrated radiographic films is presented. At our institution this verification procedure is performed for all IMRT treatment plans prior to patient irradiation. Therefore clinical treatment plans are transferred to a phantom and recalculated. Composite treatment plans are irradiated to a single film. Film density to absolute dose conversion is performed automatically based on a single calibration film. A software application encompassing film calibration, 2D registration of measurement and calculated distributions, image fusion, and a number of visual and quantitative evaluation utilities was developed. The main topic of this paper is a performance analysis for this quality assurance procedure, with regard to the specification of tolerance levels for quantitative evaluations. Spatial and dosimetric precision and accuracy were determined for the entire procedure, comprising all possible sources of error. The overall dosimetric and spatial measurement uncertainties obtained thereby were 1.9% and 0.8 mm respectively. Based on these results, we specified 5% dose difference and 3 mm distance-to-agreement as our tolerance levels for patient-specific quality assurance for IMRT treatments.
Considerations for interpreting probabilistic estimates of uncertainty of forest carbon
James E. Smith; Linda S. Heath
2000-01-01
Quantitative estimated of carbon inventories are needed as part of nationwide attempts to reduce net release of greenhouse gases and the associated climate forcing. Naturally, an appreciable amount of uncertainty is inherent in such large-scale assessments, especially since both science and policy issues are still evolving. Decision makers need an idea of the...
Peter B. Woodbury; James E. Smith; David A. Weinstein; John A. Laurence
1998-01-01
Most models of the potential effects of climate change on forest growth have produced deterministic predictions. However, there are large uncertainties in data on regional forest condition, estimates of future climate, and quantitative relationships between environmental conditions and forest growth rate. We constructed a new model to analyze these uncertainties...
Nakao, Takashi; Ohira, Hideki; Northoff, Georg
2012-01-01
Most experimental studies of decision-making have specifically examined situations in which a single less-predictable correct answer exists (externally guided decision-making under uncertainty). Along with such externally guided decision-making, there are instances of decision-making in which no correct answer based on external circumstances is available for the subject (internally guided decision-making). Such decisions are usually made in the context of moral decision-making as well as in preference judgment, where the answer depends on the subject’s own, i.e., internal, preferences rather than on external, i.e., circumstantial, criteria. The neuronal and psychological mechanisms that allow guidance of decisions based on more internally oriented criteria in the absence of external ones remain unclear. This study was undertaken to compare decision-making of these two kinds empirically and theoretically. First, we reviewed studies of decision-making to clarify experimental–operational differences between externally guided and internally guided decision-making. Second, using multi-level kernel density analysis, a whole-brain-based quantitative meta-analysis of neuroimaging studies was performed. Our meta-analysis revealed that the neural network used predominantly for internally guided decision-making differs from that for externally guided decision-making under uncertainty. This result suggests that studying only externally guided decision-making under uncertainty is insufficient to account for decision-making processes in the brain. Finally, based on the review and results of the meta-analysis, we discuss the differences and relations between decision-making of these two types in terms of their operational, neuronal, and theoretical characteristics. PMID:22403525
Automated Quantitative Rare Earth Elements Mineralogy by Scanning Electron Microscopy
NASA Astrophysics Data System (ADS)
Sindern, Sven; Meyer, F. Michael
2016-09-01
Increasing industrial demand of rare earth elements (REEs) stems from the central role they play for advanced technologies and the accelerating move away from carbon-based fuels. However, REE production is often hampered by the chemical, mineralogical as well as textural complexity of the ores with a need for better understanding of their salient properties. This is not only essential for in-depth genetic interpretations but also for a robust assessment of ore quality and economic viability. The design of energy and cost-efficient processing of REE ores depends heavily on information about REE element deportment that can be made available employing automated quantitative process mineralogy. Quantitative mineralogy assigns numeric values to compositional and textural properties of mineral matter. Scanning electron microscopy (SEM) combined with a suitable software package for acquisition of backscatter electron and X-ray signals, phase assignment and image analysis is one of the most efficient tools for quantitative mineralogy. The four different SEM-based automated quantitative mineralogy systems, i.e. FEI QEMSCAN and MLA, Tescan TIMA and Zeiss Mineralogic Mining, which are commercially available, are briefly characterized. Using examples of quantitative REE mineralogy, this chapter illustrates capabilities and limitations of automated SEM-based systems. Chemical variability of REE minerals and analytical uncertainty can reduce performance of phase assignment. This is shown for the REE phases parisite and synchysite. In another example from a monazite REE deposit, the quantitative mineralogical parameters surface roughness and mineral association derived from image analysis are applied for automated discrimination of apatite formed in a breakdown reaction of monazite and apatite formed by metamorphism prior to monazite breakdown. SEM-based automated mineralogy fulfils all requirements for characterization of complex unconventional REE ores that will become increasingly important for supply of REEs in the future.
A Monte Carlo Sensitivity Analysis of CF2 and CF Radical Densities in a c-C4F8 Plasma
NASA Technical Reports Server (NTRS)
Bose, Deepak; Rauf, Shahid; Hash, D. B.; Govindan, T. R.; Meyyappan, M.
2004-01-01
A Monte Carlo sensitivity analysis is used to build a plasma chemistry model for octacyclofluorobutane (c-C4F8) which is commonly used in dielectric etch. Experimental data are used both quantitatively and quantitatively to analyze the gas phase and gas surface reactions for neutral radical chemistry. The sensitivity data of the resulting model identifies a few critical gas phase and surface aided reactions that account for most of the uncertainty in the CF2 and CF radical densities. Electron impact dissociation of small radicals (CF2 and CF) and their surface recombination reactions are found to be the rate-limiting steps in the neutral radical chemistry. The relative rates for these electron impact dissociation and surface recombination reactions are also suggested. The resulting mechanism is able to explain the measurements of CF2 and CF densities available in the literature and also their hollow spatial density profiles.
Neutron activation analysis: A primary method of measurement
NASA Astrophysics Data System (ADS)
Greenberg, Robert R.; Bode, Peter; De Nadai Fernandes, Elisabete A.
2011-03-01
Neutron activation analysis (NAA), based on the comparator method, has the potential to fulfill the requirements of a primary ratio method as defined in 1998 by the Comité Consultatif pour la Quantité de Matière — Métrologie en Chimie (CCQM, Consultative Committee on Amount of Substance — Metrology in Chemistry). This thesis is evidenced in this paper in three chapters by: demonstration that the method is fully physically and chemically understood; that a measurement equation can be written down in which the values of all parameters have dimensions in SI units and thus having the potential for metrological traceability to these units; that all contributions to uncertainty of measurement can be quantitatively evaluated, underpinning the metrological traceability; and that the performance of NAA in CCQM key-comparisons of trace elements in complex matrices between 2000 and 2007 is similar to the performance of Isotope Dilution Mass Spectrometry (IDMS), which had been formerly designated by the CCQM as a primary ratio method.
Expert agreements and disagreements on induced seismicity by Enhanced Geothermal Systems
NASA Astrophysics Data System (ADS)
Trutnevyte, E.; Azevedo, I. L.
2016-12-01
Enhanced or Engineered Geothermal Systems (EGS) are at an early stage of development and only a handful of projects exist worldwide. In face of limited empirical evidence on EGS induced seismicity, expert elicitation provides a complementary view to quantitative assessments and basic science. We present the results of an international expert elicitation exercise with 14 experts from 6 countries. The elicitation aimed at evaluating induced seismicity hazard and risk for EGS and characterizing associated uncertainty. The state-of-the-art expert elicitation method was used: it combines technical analysis with behavioral science-informed elicitation of expert judgement in order to minimize subjectivity. The experts assessed a harmonized scenario of an EGS plant, its operational characteristics, geological context, and surrounding buildings and infrastructures. The experts provided quantitative estimates of exceedance probabilities of induced M>=3 and M>=5, maximum magnitudes that could be observed, and made judgements on economic loss, injuries, and fatalities in the case of M=3 and M=5. The experts also rated the importance of factors that influence induced seismicity hazard and risk (e.g. reservoir depth, injected volumes, exposed building stock etc.) and the potential uncertainty reductions through future research. We present the findings of this elicitation and highlight the points of expert agreements and disagreements.
NASA Astrophysics Data System (ADS)
Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin
2016-03-01
Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models due to more uncertainties of inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model outputs responses and the sensitivity of 25 parameters related to soil properties, nutrient transport and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, the combined likelihood (CL) function integrated information concerning water, nitrogen, and crop production was introduced in GLUE analysis for the predictions of the following four model output responses: the total amount of water content (T-SWC) and the nitrate nitrogen (T-NIT) within the 1-m soil profile, the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). In the process of evaluating RZWQM2, measurements and meteorological data were obtained from a field experiment that involved a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that RZWQM2 model can be used to simulate the crop growth and water-nitrogen migration and transformation in wheat-maize crop rotation planting system. The results of uncertainty analysis using of GLUE method showed T-NIT was sensitive to parameters relative to nitrification coefficient, maize growth characteristics on seedling period, wheat vernalization period, and wheat photoperiod. Parameters on soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in crop yield component. The prediction errors for RZWQM2 outputs with CL function were relatively lower and uniform compared with other likelihood functions composed of individual calibration criterion. This new and successful application of the GLUE method for determining the uncertainty and sensitivity of the RZWQM2 could provide a reference for the optimization of model parameters with different emphases according to research interests.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Babendreier, Justin E.; Castleton, Karl J.
2005-08-01
Elucidating uncertainty and sensitivity structures in environmental models can be a difficult task, even for low-order, single-medium constructs driven by a unique set of site-specific data. Quantitative assessment of integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a comparative approach using several techniques, coupled with sufficient computational power. The Framework for Risk Analysis in Multimedia Environmental Systems - Multimedia, Multipathway, and Multireceptor Risk Assessment (FRAMES-3MRA) is an important software model being developed by the United States Environmental Protection Agency for use in risk assessment of hazardous waste management facilities. The 3MRAmore » modeling system includes a set of 17 science modules that collectively simulate release, fate and transport, exposure, and risk associated with hazardous contaminants disposed of in land-based waste management units (WMU) .« less
Mercury study report to Congress. Volume 5. Health effects of mercury and mercury compounds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hassett-Sipple, B.; Swartout, J.; Schoeny, R.
1997-12-01
This volume summarizes the available information on human health effects and animal data for hazard identification and dose-response assessment for three forms of mercury: elemental mercury, mercury chloride (inorganic mercury), and methylmercury (organic mercury). Effects are summarized by endpoint. The risk assessment evaluates carcinogenicity, mutagenicity, developmental toxicity and general systemic toxicity of these chemical species of mercury. Toxicokinetics (absorption, distribution, metabolism and excretion) are described for each of the three mercury species. Reference doses are calculated for inorganic and methylmercury; a reference concentrations for inhaled elemental mercury is provided. A quantitative analysis of factors contributing to variability and uncertainty inmore » the methylmercury RfD is provided in an appendix. Interactions and sensitive populations are described. the draft volume assesses ongoing research and research needs to reduce uncertainty surrounding adverse human health consequences of methylmercury exposure.« less
A fault tree model to assess probability of contaminant discharge from shipwrecks.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I
2014-11-15
Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
Computational Support for Technology- Investment Decisions
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey
2007-01-01
Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision- support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
An index-based robust decision making framework for watershed management in a changing climate.
Kim, Yeonjoo; Chung, Eun-Sung
2014-03-01
This study developed an index-based robust decision making framework for watershed management dealing with water quantity and quality issues in a changing climate. It consists of two parts of management alternative development and analysis. The first part for alternative development consists of six steps: 1) to understand the watershed components and process using HSPF model, 2) to identify the spatial vulnerability ranking using two indices: potential streamflow depletion (PSD) and potential water quality deterioration (PWQD), 3) to quantify the residents' preferences on water management demands and calculate the watershed evaluation index which is the weighted combinations of PSD and PWQD, 4) to set the quantitative targets for water quantity and quality, 5) to develop a list of feasible alternatives and 6) to eliminate the unacceptable alternatives. The second part for alternative analysis has three steps: 7) to analyze all selected alternatives with a hydrologic simulation model considering various climate change scenarios, 8) to quantify the alternative evaluation index including social and hydrologic criteria with utilizing multi-criteria decision analysis methods and 9) to prioritize all options based on a minimax regret strategy for robust decision. This framework considers the uncertainty inherent in climate models and climate change scenarios with utilizing the minimax regret strategy, a decision making strategy under deep uncertainty and thus this procedure derives the robust prioritization based on the multiple utilities of alternatives from various scenarios. In this study, the proposed procedure was applied to the Korean urban watershed, which has suffered from streamflow depletion and water quality deterioration. Our application shows that the framework provides a useful watershed management tool for incorporating quantitative and qualitative information into the evaluation of various policies with regard to water resource planning and management. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.
2016-12-01
We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. The approach described here provides an effective means for formal quantification of uncertainties in estimated emissions from various source sectors and for continuous documentation, assessment, and reduction of emission uncertainties.
Framework for the quantitative weight-of-evidence analysis of 'omics data for regulatory purposes.
Bridges, Jim; Sauer, Ursula G; Buesen, Roland; Deferme, Lize; Tollefsen, Knut E; Tralau, Tewes; van Ravenzwaay, Ben; Poole, Alan; Pemberton, Mark
2017-12-01
A framework for the quantitative weight-of-evidence (QWoE) analysis of 'omics data for regulatory purposes is presented. The QWoE framework encompasses seven steps to evaluate 'omics data (also together with non-'omics data): (1) Hypothesis formulation, identification and weighting of lines of evidence (LoEs). LoEs conjoin different (types of) studies that are used to critically test the hypothesis. As an essential component of the QWoE framework, step 1 includes the development of templates for scoring sheets that predefine scoring criteria with scores of 0-4 to enable a quantitative determination of study quality and data relevance; (2) literature searches and categorisation of studies into the pre-defined LoEs; (3) and (4) quantitative assessment of study quality and data relevance using the respective pre-defined scoring sheets for each study; (5) evaluation of LoE-specific strength of evidence based upon the study quality and study relevance scores of the studies conjoined in the respective LoE; (6) integration of the strength of evidence from the individual LoEs to determine the overall strength of evidence; (7) characterisation of uncertainties and conclusion on the QWoE. To put the QWoE framework in practice, case studies are recommended to confirm the relevance of its different steps, or to adapt them as necessary. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Quantitative local analysis of nonlinear systems
NASA Astrophysics Data System (ADS)
Topcu, Ufuk
This thesis investigates quantitative methods for local robustness and performance analysis of nonlinear dynamical systems with polynomial vector fields. We propose measures to quantify systems' robustness against uncertainties in initial conditions (regions-of-attraction) and external disturbances (local reachability/gain analysis). S-procedure and sum-of-squares relaxations are used to translate Lyapunov-type characterizations to sum-of-squares optimization problems. These problems are typically bilinear/nonconvex (due to local analysis rather than global) and their size grows rapidly with state/uncertainty space dimension. Our approach is based on exploiting system theoretic interpretations of these optimization problems to reduce their complexity. We propose a methodology incorporating simulation data in formal proof construction enabling more reliable and efficient search for robustness and performance certificates compared to the direct use of general purpose solvers. This technique is adapted both to region-of-attraction and reachability analysis. We extend the analysis to uncertain systems by taking an intentionally simplistic and potentially conservative route, namely employing parameter-independent rather than parameter-dependent certificates. The conservatism is simply reduced by a branch-and-hound type refinement procedure. The main thrust of these methods is their suitability for parallel computing achieved by decomposing otherwise challenging problems into relatively tractable smaller ones. We demonstrate proposed methods on several small/medium size examples in each chapter and apply each method to a benchmark example with an uncertain short period pitch axis model of an aircraft. Additional practical issues leading to a more rigorous basis for the proposed methodology as well as promising further research topics are also addressed. We show that stability of linearized dynamics is not only necessary but also sufficient for the feasibility of the formulations in region-of-attraction analysis. Furthermore, we generalize an upper bound refinement procedure in local reachability/gain analysis which effectively generates non-polynomial certificates from polynomial ones. Finally, broader applicability of optimization-based tools stringently depends on the availability of scalable/hierarchial algorithms. As an initial step toward this direction, we propose a local small-gain theorem and apply to stability region analysis in the presence of unmodeled dynamics.
How to find what you don't know: Visualising variability in 3D geological models
NASA Astrophysics Data System (ADS)
Lindsay, Mark; Wellmann, Florian; Jessell, Mark; Ailleres, Laurent
2014-05-01
Uncertainties in input data can have compounding effects on the predictive reliability of three-dimensional (3D) geological models. Resource exploration, tectonic studies and environmental modelling can be compromised by using 3D models that misrepresent the target geology, and drilling campaigns that attempt to intersect particular geological units guided by 3D models are at risk of failure if the exploration geologist is unaware of inherent uncertainties. In addition, the visual inspection of 3D models is often the first contact decision makers have with the geology, thus visually communicating the presence and magnitude of uncertainties contained within geological 3D models is critical. Unless uncertainties are presented early in the relationship between decision maker and model, the model will be considered more truthful than the uncertainties allow with each subsequent viewing. We present a selection of visualisation techniques that provide the viewer with an insight to the location and amount of uncertainty contained within a model, and the geological characteristics which are most affected. A model of the Gippsland Basin, southeastern Australia is used as a case study to demonstrate the concepts of information entropy, stratigraphic variability and geodiversity. Central to the techniques shown here is the creation of a model suite, performed by creating similar (but not the same) version of the original model through perturbation of the input data. Specifically, structural data in the form of strike and dip measurements is perturbed in the creation of the model suite. The visualisation techniques presented are: (i) information entropy; (ii) stratigraphic variability and (iii) geodiversity. Information entropy is used to analyse uncertainty in a spatial context, combining the empirical probability distributions of multiple outcomes with a single quantitative measure. Stratigraphic variability displays the number of possible lithologies that may exist at a given point within the model volume. Geodiversity analyses various model characteristics (or 'geodiveristy metrics'), including the depth, volume of unit, the curvature of an interface, the geological complexity of a contact and the contact relationships units have with each other. Principal component analysis, a multivariate statistical technique, is used to simultaneously examine each of the geodiveristy metrics to determine the boundaries of model space, and identify which metrics contribute most to model uncertainty. The combination of information entropy, stratigraphic variability and geodiversity analysis provides a descriptive and thorough representation of uncertainty with effective visualisation techniques that clearly communicate the geological uncertainty contained within the geological model.
Humphries Choptiany, John Michael; Pelot, Ronald
2014-09-01
Multicriteria decision analysis (MCDA) has been applied to various energy problems to incorporate a variety of qualitative and quantitative criteria, usually spanning environmental, social, engineering, and economic fields. MCDA and associated methods such as life-cycle assessments and cost-benefit analysis can also include risk analysis to address uncertainties in criteria estimates. One technology now being assessed to help mitigate climate change is carbon capture and storage (CCS). CCS is a new process that captures CO2 emissions from fossil-fueled power plants and injects them into geological reservoirs for storage. It presents a unique challenge to decisionmakers (DMs) due to its technical complexity, range of environmental, social, and economic impacts, variety of stakeholders, and long time spans. The authors have developed a risk assessment model using a MCDA approach for CCS decisions such as selecting between CO2 storage locations and choosing among different mitigation actions for reducing risks. The model includes uncertainty measures for several factors, utility curve representations of all variables, Monte Carlo simulation, and sensitivity analysis. This article uses a CCS scenario example to demonstrate the development and application of the model based on data derived from published articles and publicly available sources. The model allows high-level DMs to better understand project risks and the tradeoffs inherent in modern, complex energy decisions. © 2014 Society for Risk Analysis.
Contribution of physical modelling to climate-driven landslide hazard mapping: an alpine test site
NASA Astrophysics Data System (ADS)
Vandromme, R.; Desramaut, N.; Baills, A.; Hohmann, A.; Grandjean, G.; Sedan, O.; Mallet, J. P.
2012-04-01
The aim of this work is to develop a methodology for integrating climate change scenarios into quantitative hazard assessment and especially their precipitation component. The effects of climate change will be different depending on both the location of the site and the type of landslide considered. Indeed, mass movements can be triggered by different factors. This paper describes a methodology to address this issue and shows an application on an alpine test site. Mechanical approaches represent a solution for quantitative landslide susceptibility and hazard modeling. However, as the quantity and the quality of data are generally very heterogeneous at a regional scale, it is necessary to take into account the uncertainty in the analysis. In this perspective, a new hazard modeling method is developed and integrated in a program named ALICE. This program integrates mechanical stability analysis through a GIS software taking into account data uncertainty. This method proposes a quantitative classification of landslide hazard and offers a useful tool to gain time and efficiency in hazard mapping. However, an expertise approach is still necessary to finalize the maps. Indeed it is the only way to take into account some influent factors in slope stability such as heterogeneity of the geological formations or effects of anthropic interventions. To go further, the alpine test site (Barcelonnette area, France) is being used to integrate climate change scenarios into ALICE program, and especially their precipitation component with the help of a hydrological model (GARDENIA) and the regional climate model REMO (Jacob, 2001). From a DEM, land-cover map, geology, geotechnical data and so forth the program classifies hazard zones depending on geotechnics and different hydrological contexts varying in time. This communication, realized within the framework of Safeland project, is supported by the European Commission under the 7th Framework Programme for Research and Technological Development, Area "Environment", Activity 1.3.3.1 "Prediction of triggering and risk assessment for landslides".
Lash, Timothy L
2007-11-26
The associations of pesticide exposure with disease outcomes are estimated without the benefit of a randomized design. For this reason and others, these studies are susceptible to systematic errors. I analyzed studies of the associations between alachlor and glyphosate exposure and cancer incidence, both derived from the Agricultural Health Study cohort, to quantify the bias and uncertainty potentially attributable to systematic error. For each study, I identified the prominent result and important sources of systematic error that might affect it. I assigned probability distributions to the bias parameters that allow quantification of the bias, drew a value at random from each assigned distribution, and calculated the estimate of effect adjusted for the biases. By repeating the draw and adjustment process over multiple iterations, I generated a frequency distribution of adjusted results, from which I obtained a point estimate and simulation interval. These methods were applied without access to the primary record-level dataset. The conventional estimates of effect associating alachlor and glyphosate exposure with cancer incidence were likely biased away from the null and understated the uncertainty by quantifying only random error. For example, the conventional p-value for a test of trend in the alachlor study equaled 0.02, whereas fewer than 20% of the bias analysis iterations yielded a p-value of 0.02 or lower. Similarly, the conventional fully-adjusted result associating glyphosate exposure with multiple myleoma equaled 2.6 with 95% confidence interval of 0.7 to 9.4. The frequency distribution generated by the bias analysis yielded a median hazard ratio equal to 1.5 with 95% simulation interval of 0.4 to 8.9, which was 66% wider than the conventional interval. Bias analysis provides a more complete picture of true uncertainty than conventional frequentist statistical analysis accompanied by a qualitative description of study limitations. The latter approach is likely to lead to overconfidence regarding the potential for causal associations, whereas the former safeguards against such overinterpretations. Furthermore, such analyses, once programmed, allow rapid implementation of alternative assignments of probability distributions to the bias parameters, so elevate the plane of discussion regarding study bias from characterizing studies as "valid" or "invalid" to a critical and quantitative discussion of sources of uncertainty.
NASA Astrophysics Data System (ADS)
Ali, E. S. M.; Spencer, B.; McEwen, M. R.; Rogers, D. W. O.
2015-02-01
In this study, a quantitative estimate is derived for the uncertainty in the XCOM photon mass attenuation coefficients in the energy range of interest to external beam radiation therapy—i.e. 100 keV (orthovoltage) to 25 MeV—using direct comparisons of experimental data against Monte Carlo models and theoretical XCOM data. Two independent datasets are used. The first dataset is from our recent transmission measurements and the corresponding EGSnrc calculations (Ali et al 2012 Med. Phys. 39 5990-6003) for 10-30 MV photon beams from the research linac at the National Research Council Canada. The attenuators are graphite and lead, with a total of 140 data points and an experimental uncertainty of ˜0.5% (k = 1). An optimum energy-independent cross section scaling factor that minimizes the discrepancies between measurements and calculations is used to deduce cross section uncertainty. The second dataset is from the aggregate of cross section measurements in the literature for graphite and lead (49 experiments, 288 data points). The dataset is compared to the sum of the XCOM data plus the IAEA photonuclear data. Again, an optimum energy-independent cross section scaling factor is used to deduce the cross section uncertainty. Using the average result from the two datasets, the energy-independent cross section uncertainty estimate is 0.5% (68% confidence) and 0.7% (95% confidence). The potential for energy-dependent errors is discussed. Photon cross section uncertainty is shown to be smaller than the current qualitative ‘envelope of uncertainty’ of the order of 1-2%, as given by Hubbell (1999 Phys. Med. Biol 44 R1-22).
NASA Astrophysics Data System (ADS)
Hazenberg, P.; Torfs, P. J. J. F.; Leijnse, H.; Delrieu, G.; Uijlenhoet, R.
2013-09-01
This paper presents a novel approach to estimate the vertical profile of reflectivity (VPR) from volumetric weather radar data using both a traditional Eulerian as well as a newly proposed Lagrangian implementation. For this latter implementation, the recently developed Rotational Carpenter Square Cluster Algorithm (RoCaSCA) is used to delineate precipitation regions at different reflectivity levels. A piecewise linear VPR is estimated for either stratiform or neither stratiform/convective precipitation. As a second aspect of this paper, a novel approach is presented which is able to account for the impact of VPR uncertainty on the estimated radar rainfall variability. Results show that implementation of the VPR identification and correction procedure has a positive impact on quantitative precipitation estimates from radar. Unfortunately, visibility problems severely limit the impact of the Lagrangian implementation beyond distances of 100 km. However, by combining this procedure with the global Eulerian VPR estimation procedure for a given rainfall type (stratiform and neither stratiform/convective), the quality of the quantitative precipitation estimates increases up to a distance of 150 km. Analyses of the impact of VPR uncertainty shows that this aspect accounts for a large fraction of the differences between weather radar rainfall estimates and rain gauge measurements.
Uemoto, Michihisa; Makino, Masanori; Ota, Yuji; Sakaguchi, Hiromi; Shimizu, Yukari; Sato, Kazuhiro
2018-01-01
Minor and trace metals in aluminum and aluminum alloys have been determined by inductively coupled plasma atomic emission spectrometry (ICP-AES) as an interlaboratory testing toward standardization. The trueness of the measured data was successfully investigated to improve the analytical protocols, using certified reference materials of aluminum. Their precision could also be evaluated, feasible to estimate the uncertainties separately. The accuracy (trueness and precision) of the data were finally in good agreement with the certified values and assigned uncertainties. Repeated measurements of aluminum solutions with different concentrations of the analytes revealed the relative standard deviations of the measurements with concentrations, thus enabling their limits of quantitation. They differed separately and also showed slightly higher values with an aluminum matrix than those without one. In addition, the upper limit of the detectable concentration of silicon with simple acid digestion was estimated to be 0.03 % in the mass fraction.
Reliability Impacts in Life Support Architecture and Technology Selection
NASA Technical Reports Server (NTRS)
Lange Kevin E.; Anderson, Molly S.
2012-01-01
Quantitative assessments of system reliability and equivalent system mass (ESM) were made for different life support architectures based primarily on International Space Station technologies. The analysis was applied to a one-year deep-space mission. System reliability was increased by adding redundancy and spares, which added to the ESM. Results were thus obtained allowing a comparison of the ESM for each architecture at equivalent levels of reliability. Although the analysis contains numerous simplifications and uncertainties, the results suggest that achieving necessary reliabilities for deep-space missions will add substantially to the life support ESM and could influence the optimal degree of life support closure. Approaches for reducing reliability impacts were investigated and are discussed.
Measurement Uncertainty Budget of the PMV Thermal Comfort Equation
NASA Astrophysics Data System (ADS)
Ekici, Can
2016-05-01
Fanger's predicted mean vote (PMV) equation is the result of the combined quantitative effects of the air temperature, mean radiant temperature, air velocity, humidity activity level and clothing thermal resistance. PMV is a mathematical model of thermal comfort which was developed by Fanger. The uncertainty budget of the PMV equation was developed according to GUM in this study. An example is given for the uncertainty model of PMV in the exemplification section of the study. Sensitivity coefficients were derived from the PMV equation. Uncertainty budgets can be seen in the tables. A mathematical model of the sensitivity coefficients of Ta, hc, T_{mrt}, T_{cl}, and Pa is given in this study. And the uncertainty budgets for hc, T_{cl}, and Pa are given in this study.
Methodology for qualitative uncertainty assessment of climate impact indicators
NASA Astrophysics Data System (ADS)
Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela
2016-04-01
The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of the development of a methodology on how to assess the uncertainties related to climate impact indicators. Existing climate data portals mainly treat the uncertainties in two ways: Either they provide generic guidance and/or express with statistical measures the quantifiable fraction of the uncertainty. However, none of the climate data portals give the users a qualitative guidance how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that provides the users with climate impact indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainties associated with a specific climate impact indicator and how these sources affect the overall 'degree of confidence' of this respective indicator. To meet users requirements in the effective communication of uncertainties, their feedback has been involved during the development process of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections Analyzed with Climate Signal Maps: Seasonal and Extreme Precipitation for Germany, Atmosphere (Basel)., 6(5), 677-698, doi:10.3390/atmos6050677, 2015.
Effects of radiobiological uncertainty on shield design for a 60-day lunar mission
NASA Technical Reports Server (NTRS)
Wilson, John W.; Nealy, John E.; Schimmerling, Walter
1993-01-01
Some consequences of uncertainties in radiobiological risk due to galactic cosmic ray exposure are analyzed to determine their effect on engineering designs for a first lunar outpost - a 60-day mission. Quantitative estimates of shield mass requirements as a function of a radiobiological uncertainty factor are given for a simplified vehicle structure. The additional shield mass required for compensation is calculated as a function of the uncertainty in galactic cosmic ray exposure, and this mass is found to be as large as a factor of 3 for a lunar transfer vehicle. The additional cost resulting from this mass is also calculated. These cost estimates are then used to exemplify the cost-effectiveness of research.
NASA Astrophysics Data System (ADS)
Emori, Seita; Takahashi, Kiyoshi; Yamagata, Yoshiki; Oki, Taikan; Mori, Shunsuke; Fujigaki, Yuko
2013-04-01
With the aim of proposing strategies of global climate risk management, we have launched a five-year research project called ICA-RUS (Integrated Climate Assessment - Risks, Uncertainties and Society). In this project with the phrase "risk management" in its title, we aspire for a comprehensive assessment of climate change risks, explicit consideration of uncertainties, utilization of best available information, and consideration of every possible conditions and options. We also regard the problem as one of decision-making at the human level, which involves social value judgments and adapts to future changes in circumstances. The ICA-RUS project consists of the following five themes: 1) Synthesis of global climate risk management strategies, 2) Optimization of land, water and ecosystem uses for climate risk management, 3) Identification and analysis of critical climate risks, 4) Evaluation of climate risk management options under technological, social and economic uncertainties and 5) Interactions between scientific and social rationalities in climate risk management (see also: http://www.nies.go.jp/ica-rus/en/). For the integration of quantitative knowledge of climate change risks and responses, we apply a tool named AIM/Impact [Policy], which consists of an energy-economic model, a simplified climate model and impact projection modules. At the same time, in order to make use of qualitative knowledge as well, we hold monthly project meetings for the discussion of risk management strategies and publish annual reports based on the quantitative and qualitative information. To enhance the comprehensiveness of the analyses, we maintain an inventory of risks and risk management options. The inventory is revised iteratively through interactive meetings with stakeholders such as policymakers, government officials and industrial representatives.
Bayesian adaptive survey protocols for resource management
Halstead, Brian J.; Wylie, Glenn D.; Coates, Peter S.; Casazza, Michael L.
2011-01-01
Transparency in resource management decisions requires a proper accounting of uncertainty at multiple stages of the decision-making process. As information becomes available, periodic review and updating of resource management protocols reduces uncertainty and improves management decisions. One of the most basic steps to mitigating anthropogenic effects on populations is determining if a population of a species occurs in an area that will be affected by human activity. Species are rarely detected with certainty, however, and falsely declaring a species absent can cause improper conservation decisions or even extirpation of populations. We propose a method to design survey protocols for imperfectly detected species that accounts for multiple sources of uncertainty in the detection process, is capable of quantitatively incorporating expert opinion into the decision-making process, allows periodic updates to the protocol, and permits resource managers to weigh the severity of consequences if the species is falsely declared absent. We developed our method using the giant gartersnake (Thamnophis gigas), a threatened species precinctive to the Central Valley of California, as a case study. Survey date was negatively related to the probability of detecting the giant gartersnake, and water temperature was positively related to the probability of detecting the giant gartersnake at a sampled location. Reporting sampling effort, timing and duration of surveys, and water temperatures would allow resource managers to evaluate the probability that the giant gartersnake occurs at sampled sites where it is not detected. This information would also allow periodic updates and quantitative evaluation of changes to the giant gartersnake survey protocol. Because it naturally allows multiple sources of information and is predicated upon the idea of updating information, Bayesian analysis is well-suited to solving the problem of developing efficient sampling protocols for species of conservation concern.
NASA Astrophysics Data System (ADS)
Yanai, R. D.; Bae, K.; Levine, C. R.; Lilly, P.; Vadeboncoeur, M. A.; Fatemi, F. R.; Blum, J. D.; Arthur, M.; Hamburg, S.
2013-12-01
Ecosystem nutrient budgets are difficult to construct and even more difficult to replicate. As a result, uncertainty in the estimates of pools and fluxes are rarely reported, and opportunities to assess confidence through replicated measurements are rare. In this study, we report nutrient concentrations and contents of soil and biomass pools in northern hardwood stands in replicate plots within replicate stands in 3 age classes (14-19 yr, 26-29 yr, and > 100 yr) at the Bartlett Experimental Forest, USA. Soils were described by quantitative soil pits in three plots per stand, excavated by depth increment to the C horizon and analyzed by a sequential extraction procedure. Variation in soil mass among pits within stands averaged 28% (coefficient of variation); variation among stands within an age class ranged from 9-25%. Variation in nutrient concentrations were higher still (averaging 38%, within element, depth increment, and extraction type), perhaps because the depth increments contained varying proportions of genetic horizons. To estimate nutrient contents of aboveground biomass, we propagated model uncertainty through allometric equations, and found errors ranging from 3-7%, depending on the stand. The variation in biomass among plots within stands (6-19%) was always larger than the allometric uncertainties. Variability in measured nutrient concentrations of tree tissues were more variable than the uncertainty in biomass. Foliage had the lowest variability (averaging 16% for Ca, Mg, K, N and P within age class and species), and wood had the highest (averaging 30%), when reported in proportion to the mean, because concentrations in wood are low. For Ca content of aboveground biomass, sampling variation was the greatest source of uncertainty. Coefficients of variation among plots within a stand averaged 16%; stands within an age class ranged from 5-25% CV, including uncertainties in tree allometry and tissue chemistry. Uncertainty analysis can help direct research effort to areas most in need of improvement. In systems such as the one we studied, more intensive sampling would be the best approach to reducing uncertainty, as natural spatial variation was higher than model or measurement uncertainties.
Johnston, Iain G; Rickett, Benjamin C; Jones, Nick S
2014-12-02
Back-of-the-envelope or rule-of-thumb calculations involving rough estimates of quantities play a central scientific role in developing intuition about the structure and behavior of physical systems, for example in so-called Fermi problems in the physical sciences. Such calculations can be used to powerfully and quantitatively reason about biological systems, particularly at the interface between physics and biology. However, substantial uncertainties are often associated with values in cell biology, and performing calculations without taking this uncertainty into account may limit the extent to which results can be interpreted for a given problem. We present a means to facilitate such calculations where uncertainties are explicitly tracked through the line of reasoning, and introduce a probabilistic calculator called CALADIS, a free web tool, designed to perform this tracking. This approach allows users to perform more statistically robust calculations in cell biology despite having uncertain values, and to identify which quantities need to be measured more precisely to make confident statements, facilitating efficient experimental design. We illustrate the use of our tool for tracking uncertainty in several example biological calculations, showing that the results yield powerful and interpretable statistics on the quantities of interest. We also demonstrate that the outcomes of calculations may differ from point estimates when uncertainty is accurately tracked. An integral link between CALADIS and the BioNumbers repository of biological quantities further facilitates the straightforward location, selection, and use of a wealth of experimental data in cell biological calculations. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Robustness of Reconstructed Ancestral Protein Functions to Statistical Uncertainty.
Eick, Geeta N; Bridgham, Jamie T; Anderson, Douglas P; Harms, Michael J; Thornton, Joseph W
2017-02-01
Hypotheses about the functions of ancient proteins and the effects of historical mutations on them are often tested using ancestral protein reconstruction (APR)-phylogenetic inference of ancestral sequences followed by synthesis and experimental characterization. Usually, some sequence sites are ambiguously reconstructed, with two or more statistically plausible states. The extent to which the inferred functions and mutational effects are robust to uncertainty about the ancestral sequence has not been studied systematically. To address this issue, we reconstructed ancestral proteins in three domain families that have different functions, architectures, and degrees of uncertainty; we then experimentally characterized the functional robustness of these proteins when uncertainty was incorporated using several approaches, including sampling amino acid states from the posterior distribution at each site and incorporating the alternative amino acid state at every ambiguous site in the sequence into a single "worst plausible case" protein. In every case, qualitative conclusions about the ancestral proteins' functions and the effects of key historical mutations were robust to sequence uncertainty, with similar functions observed even when scores of alternate amino acids were incorporated. There was some variation in quantitative descriptors of function among plausible sequences, suggesting that experimentally characterizing robustness is particularly important when quantitative estimates of ancient biochemical parameters are desired. The worst plausible case method appears to provide an efficient strategy for characterizing the functional robustness of ancestral proteins to large amounts of sequence uncertainty. Sampling from the posterior distribution sometimes produced artifactually nonfunctional proteins for sequences reconstructed with substantial ambiguity. © The Author 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tobias, Benjamin John; Palaniyappan, Sasikumar; Gautier, Donald Cort
Images of the R2DTO resolution target were obtained during laser-driven-radiography experiments performed at the TRIDENT laser facility, and analysis of these images using the Bayesian Inference Engine (BIE) determines a most probable full-width half maximum (FWHM) spot size of 78 μm. However, significant uncertainty prevails due to variation in the measured detector blur. Propagating this uncertainty in detector blur through the forward model results in an interval of probabilistic ambiguity spanning approximately 35-195 μm when the laser energy impinges on a thick (1 mm) tantalum target. In other phases of the experiment, laser energy is deposited on a thin (~100more » nm) aluminum target placed 250 μm ahead of the tantalum converter. When the energetic electron beam is generated in this manner, upstream from the bremsstrahlung converter, the inferred spot size shifts to a range of much larger values, approximately 270-600 μm FWHM. This report discusses methods applied to obtain these intervals as well as concepts necessary for interpreting the result within a context of probabilistic quantitative inference.« less
NASA Technical Reports Server (NTRS)
Wilson, John W.; Nealy, John E.; Schimmerling, Walter; Cucinotta, Francis A.; Wood, James S.
1993-01-01
Some consequences of uncertainties in radiobiological risk due to galactic cosmic ray (GCR) exposure are analyzed for their effect on engineering designs for the first lunar outpost and a mission to explore Mars. This report presents the plausible effect of biological uncertainties, the design changes necessary to reduce the uncertainties to acceptable levels for a safe mission, and an evaluation of the mission redesign cost. Estimates of the amount of shield mass required to compensate for radiobiological uncertainty are given for a simplified vehicle and habitat. The additional amount of shield mass required to provide a safety factor for uncertainty compensation is calculated from the expected response to GCR exposure. The amount of shield mass greatly increases in the estimated range of biological uncertainty, thus, escalating the estimated cost of the mission. The estimates are used as a quantitative example for the cost-effectiveness of research in radiation biophysics and radiation physics.
IWGT report on quantitative approaches to genotoxicity risk ...
This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the need for quantitative dose–response analysis of genetic toxicology data, the existence and appropriate evaluation of threshold responses, and methods to analyze exposure-response relationships and derive points of departure (PoDs) from which acceptable exposure levels could be determined. This report summarizes the QWG discussions and recommendations regarding appropriate approaches to evaluate exposure-related risks of genotoxic damage, including extrapolation below identified PoDs and across test systems and species. Recommendations include the selection of appropriate genetic endpoints and target tissues, uncertainty factors and extrapolation methods to be considered, the importance and use of information on mode of action, toxicokinetics, metabolism, and exposure biomarkers when using quantitative exposure-response data to determine acceptable exposure levels in human populations or to assess the risk associated with known or anticipated exposures. The empirical relationship between genetic damage (mutation and chromosomal aberration) and cancer in animal models was also examined. It was concluded that there is a general correlation between cancer induction and mutagenic and/or clast
Defining Tsunami Magnitude as Measure of Potential Impact
NASA Astrophysics Data System (ADS)
Titov, V. V.; Tang, L.
2016-12-01
The goal of tsunami forecast, as a system for predicting potential impact of a tsunami at coastlines, requires quick estimate of a tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, difficulties of estimating tsunami energy based on available tsunami measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before tsunami impacts coastlines. The slow process of tsunami magnitude estimates, including collection of vast amount of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scales in tsunami warning operations. Uncertainties of estimates made tsunami magnitudes difficult to use as universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy of tsunami impact estimates, since real-time seismic data is available of real-time processing and ample amount of seismic data is available for an elaborate post event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake and generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing robust tsunami magnitude that will be useful for tsunami warning as a quick estimate for tsunami impact and for post-event analysis as a universal scale for tsunamis inter-comparison. We present a method for estimating the tsunami magnitude based on tsunami energy and present application of the magnitude analysis for several historical events for inter-comparison with existing methods.
A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.
Yu, Hongyang; Khan, Faisal; Veitch, Brian
2017-09-01
Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree (used to model rare event), suffer from a number of weaknesses. These include the static structure of the event causation, lack of event occurrence data, and need for reliable prior information. In this study, a new hierarchical Bayesian modeling based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in marine and offshore industry. © 2017 Society for Risk Analysis.
Validation of Bayesian analysis of compartmental kinetic models in medical imaging.
Sitek, Arkadiusz; Li, Quanzheng; El Fakhri, Georges; Alpert, Nathaniel M
2016-10-01
Kinetic compartmental analysis is frequently used to compute physiologically relevant quantitative values from time series of images. In this paper, a new approach based on Bayesian analysis to obtain information about these parameters is presented and validated. The closed-form of the posterior distribution of kinetic parameters is derived with a hierarchical prior to model the standard deviation of normally distributed noise. Markov chain Monte Carlo methods are used for numerical estimation of the posterior distribution. Computer simulations of the kinetics of F18-fluorodeoxyglucose (FDG) are used to demonstrate drawing statistical inferences about kinetic parameters and to validate the theory and implementation. Additionally, point estimates of kinetic parameters and covariance of those estimates are determined using the classical non-linear least squares approach. Posteriors obtained using methods proposed in this work are accurate as no significant deviation from the expected shape of the posterior was found (one-sided P>0.08). It is demonstrated that the results obtained by the standard non-linear least-square methods fail to provide accurate estimation of uncertainty for the same data set (P<0.0001). The results of this work validate new methods for a computer simulations of FDG kinetics. Results show that in situations where the classical approach fails in accurate estimation of uncertainty, Bayesian estimation provides an accurate information about the uncertainties in the parameters. Although a particular example of FDG kinetics was used in the paper, the methods can be extended for different pharmaceuticals and imaging modalities. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
A TIERED APPROACH TO PERFORMING UNCERTAINTY ANALYSIS IN CONDUCTING EXPOSURE ANALYSIS FOR CHEMICALS
The WHO/IPCS draft Guidance Document on Characterizing and Communicating Uncertainty in Exposure Assessment provides guidance on recommended strategies for conducting uncertainty analysis as part of human exposure analysis. Specifically, a tiered approach to uncertainty analysis ...
Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier
2016-03-01
Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiplestrain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical specificity.
NASA Astrophysics Data System (ADS)
Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi
2016-08-01
A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is typical of the NDACC temperature lidars transmitting at 355 nm. The combined temperature uncertainty ranges between 0.1 and 1 K below 60 km, with detection noise, saturation correction, and molecular extinction correction being the three dominant sources of uncertainty. Above 60 km and up to 10 km below the top of the profile, the total uncertainty increases exponentially from 1 to 10 K due to the combined effect of random noise and temperature tie-on. In the top 10 km of the profile, the accuracy of the profile mainly depends on that of the tie-on temperature. All other uncertainty components remain below 0.1 K throughout the entire profile (15-90 km), except the background noise correction uncertainty, which peaks around 0.3-0.5 K. It should be kept in mind that these quantitative estimates may be very different for other lidar instruments, depending on their altitude range and the wavelengths used.
Comprehensive Analysis of the Gas- and Particle-Phase Products of VOC Oxidation
NASA Astrophysics Data System (ADS)
Bakker-Arkema, J.; Ziemann, P. J.
2017-12-01
Controlled environmental chamber studies are important for determining atmospheric reaction mechanisms and gas and aerosol products formed in the oxidation of volatile organic compounds (VOCs). Such information is necessary for developing detailed chemical models for use in predicting the atmospheric fate of VOCs and also secondary organic aerosol (SOA) formation. However, complete characterization of atmospheric oxidation reactions, including gas- and particle-phase product yields, and reaction branching ratios, are difficult to achieve. In this work, we investigated the reactions of terminal and internal alkenes with OH radicals in the presence of NOx in an attempt to fully characterize the chemistry of these systems while minimizing and accounting for the inherent uncertainties associated with environmental chamber experiments. Gas-phase products (aldehydes formed by alkoxy radical decomposition) and particle-phase products (alkyl nitrates, β-hydroxynitrates, dihydroxynitrates, 1,4-hydroxynitrates, 1,4-hydroxycarbonyls, and dihydroxycarbonyls) formed through pathways involving addition of OH to the C=C double bond as well as H-atom abstraction were identified and quantified using a suite of analytical techniques. Particle-phase products were analyzed in real time with a thermal desorption particle beam mass spectrometer; and off-line by collection onto filters, extraction, and subsequent analysis of functional groups by derivatization-spectrophotometric methods developed in our lab. Derivatized products were also separated by liquid chromatography for molecular quantitation by UV absorbance and identification using chemical ionization-ion trap mass spectrometry. Gas phase aldehydes were analyzed off-line by collection onto Tenax and a 5-channel denuder with subsequent analysis by gas chromatography, or by collection onto DNPH-coated cartridges and subsequent analysis by liquid chromatography. The full product identification and quantitation, with careful minimization of uncertainties for the various components of the experiment and analyses, demonstrates our capability to comprehensively and accurately analyze the complex chemical composition of products formed in the oxidation of organic compounds in laboratory chamber studies.
NASA Astrophysics Data System (ADS)
Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.
2016-11-01
Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remain a major challenge because of lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and flood in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
NASA Astrophysics Data System (ADS)
Zhang, Junlong; Li, Yongping; Huang, Guohe; Chen, Xi; Bao, Anming
2016-07-01
Without a realistic assessment of parameter uncertainty, decision makers may encounter difficulties in accurately describing hydrologic processes and assessing relationships between model parameters and watershed characteristics. In this study, a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis (MCMC-MFA) method is developed, which can not only generate samples of parameters from a well constructed Markov chain and assess parameter uncertainties with straightforward Bayesian inference, but also investigate the individual and interactive effects of multiple parameters on model output through measuring the specific variations of hydrological responses. A case study is conducted for addressing parameter uncertainties in the Kaidu watershed of northwest China. Effects of multiple parameters and their interactions are quantitatively investigated using the MCMC-MFA with a three-level factorial experiment (totally 81 runs). A variance-based sensitivity analysis method is used to validate the results of parameters' effects. Results disclose that (i) soil conservation service runoff curve number for moisture condition II (CN2) and fraction of snow volume corresponding to 50% snow cover (SNO50COV) are the most significant factors to hydrological responses, implying that infiltration-excess overland flow and snow water equivalent represent important water input to the hydrological system of the Kaidu watershed; (ii) saturate hydraulic conductivity (SOL_K) and soil evaporation compensation factor (ESCO) have obvious effects on hydrological responses; this implies that the processes of percolation and evaporation would impact hydrological process in this watershed; (iii) the interactions of ESCO and SNO50COV as well as CN2 and SNO50COV have an obvious effect, implying that snow cover can impact the generation of runoff on land surface and the extraction of soil evaporative demand in lower soil layers. These findings can help enhance the hydrological model's capability for simulating/predicting water resources.
Ligmann-Zielinska, Arika; Kramer, Daniel B; Spence Cheruvelil, Kendra; Soranno, Patricia A
2014-01-01
Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system.
Ligmann-Zielinska, Arika; Kramer, Daniel B.; Spence Cheruvelil, Kendra; Soranno, Patricia A.
2014-01-01
Agent-based models (ABMs) have been widely used to study socioecological systems. They are useful for studying such systems because of their ability to incorporate micro-level behaviors among interacting agents, and to understand emergent phenomena due to these interactions. However, ABMs are inherently stochastic and require proper handling of uncertainty. We propose a simulation framework based on quantitative uncertainty and sensitivity analyses to build parsimonious ABMs that serve two purposes: exploration of the outcome space to simulate low-probability but high-consequence events that may have significant policy implications, and explanation of model behavior to describe the system with higher accuracy. The proposed framework is applied to the problem of modeling farmland conservation resulting in land use change. We employ output variance decomposition based on quasi-random sampling of the input space and perform three computational experiments. First, we perform uncertainty analysis to improve model legitimacy, where the distribution of results informs us about the expected value that can be validated against independent data, and provides information on the variance around this mean as well as the extreme results. In our last two computational experiments, we employ sensitivity analysis to produce two simpler versions of the ABM. First, input space is reduced only to inputs that produced the variance of the initial ABM, resulting in a model with output distribution similar to the initial model. Second, we refine the value of the most influential input, producing a model that maintains the mean of the output of initial ABM but with less spread. These simplifications can be used to 1) efficiently explore model outcomes, including outliers that may be important considerations in the design of robust policies, and 2) conduct explanatory analysis that exposes the smallest number of inputs influencing the steady state of the modeled system. PMID:25340764
NASA Astrophysics Data System (ADS)
Charonko, John J.; Vlachos, Pavlos P.
2013-06-01
Numerous studies have established firmly that particle image velocimetry (PIV) is a robust method for non-invasive, quantitative measurements of fluid velocity, and that when carefully conducted, typical measurements can accurately detect displacements in digital images with a resolution well below a single pixel (in some cases well below a hundredth of a pixel). However, to date, these estimates have only been able to provide guidance on the expected error for an average measurement under specific image quality and flow conditions. This paper demonstrates a new method for estimating the uncertainty bounds to within a given confidence interval for a specific, individual measurement. Here, cross-correlation peak ratio, the ratio of primary to secondary peak height, is shown to correlate strongly with the range of observed error values for a given measurement, regardless of flow condition or image quality. This relationship is significantly stronger for phase-only generalized cross-correlation PIV processing, while the standard correlation approach showed weaker performance. Using an analytical model of the relationship derived from synthetic data sets, the uncertainty bounds at a 95% confidence interval are then computed for several artificial and experimental flow fields, and the resulting errors are shown to match closely to the predicted uncertainties. While this method stops short of being able to predict the true error for a given measurement, knowledge of the uncertainty level for a PIV experiment should provide great benefits when applying the results of PIV analysis to engineering design studies and computational fluid dynamics validation efforts. Moreover, this approach is exceptionally simple to implement and requires negligible additional computational cost.
NASA Astrophysics Data System (ADS)
Liang, Zhongmin; Li, Yujie; Hu, Yiming; Li, Binquan; Wang, Jun
2017-06-01
Accurate and reliable long-term forecasting plays an important role in water resources management and utilization. In this paper, a hybrid model called SVR-HUP is presented to predict long-term runoff and quantify the prediction uncertainty. The model is created based on three steps. First, appropriate predictors are selected according to the correlations between meteorological factors and runoff. Second, a support vector regression (SVR) model is structured and optimized based on the LibSVM toolbox and a genetic algorithm. Finally, using forecasted and observed runoff, a hydrologic uncertainty processor (HUP) based on a Bayesian framework is used to estimate the posterior probability distribution of the simulated values, and the associated uncertainty of prediction was quantitatively analyzed. Six precision evaluation indexes, including the correlation coefficient (CC), relative root mean square error (RRMSE), relative error (RE), mean absolute percentage error (MAPE), Nash-Sutcliffe efficiency (NSE), and qualification rate (QR), are used to measure the prediction accuracy. As a case study, the proposed approach is applied in the Han River basin, South Central China. Three types of SVR models are established to forecast the monthly, flood season and annual runoff volumes. The results indicate that SVR yields satisfactory accuracy and reliability at all three scales. In addition, the results suggest that the HUP cannot only quantify the uncertainty of prediction based on a confidence interval but also provide a more accurate single value prediction than the initial SVR forecasting result. Thus, the SVR-HUP model provides an alternative method for long-term runoff forecasting.
Direct Observation of Ultralow Vertical Emittance using a Vertical Undulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wootton, Kent
2015-09-17
In recent work, the first quantitative measurements of electron beam vertical emittance using a vertical undulator were presented, with particular emphasis given to ultralow vertical emittances [K. P. Wootton, et al., Phys. Rev. ST Accel. Beams, 17, 112802 (2014)]. Using this apparatus, a geometric vertical emittance of 0.9 ± 0.3 pm rad has been observed. A critical analysis is given of measurement approaches that were attempted, with particular emphasis on systematic and statistical uncertainties. The method used is explained, compared to other techniques and the applicability of these results to other scenarios discussed.
Extra dimension searches at hadron colliders to next-to-leading order-QCD
NASA Astrophysics Data System (ADS)
Kumar, M. C.; Mathews, Prakash; Ravindran, V.
2007-11-01
The quantitative impact of NLO-QCD corrections for searches of large and warped extra dimensions at hadron colliders are investigated for the Drell-Yan process. The K-factor for various observables at hadron colliders are presented. Factorisation, renormalisation scale dependence and uncertainties due to various parton distribution functions are studied. Uncertainties arising from the error on experimental data are estimated using the MRST parton distribution functions.
Biomass Deconstruction and Pretreatment Publications | Bioenergy | NREL
a quantitative indication of data uncertainty. Enlarge image Transgenic Ferritin Overproduction Pretreatment, Industrial & Engineering Chemistry Research Image of a chart showing solubilization of
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Multiple Myeloma.
Raju, G K; Gurumurthi, Karthik; Domike, Reuben; Kazandjian, Dickran; Landgren, Ola; Blumenthal, Gideon M; Farrell, Ann; Pazdur, Richard; Woodcock, Janet
2018-01-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analysis. In this work, a quantitative benefit-risk analysis approach captures regulatory decision-making about new drugs to treat multiple myeloma (MM). MM assessments have been based on endpoints such as time to progression (TTP), progression-free survival (PFS), and objective response rate (ORR) which are different than benefit-risk analysis based on overall survival (OS). Twenty-three FDA decisions on MM drugs submitted to FDA between 2003 and 2016 were identified and analyzed. The benefits and risks were quantified relative to comparators (typically the control arm of the clinical trial) to estimate whether the median benefit-risk was positive or negative. A sensitivity analysis was demonstrated using ixazomib to explore the magnitude of uncertainty. FDA approval decision outcomes were consistent and logical using this benefit-risk framework. © 2017 American Society for Clinical Pharmacology and Therapeutics.
THE EVOLUTION OF SOLAR FLUX FROM 0.1 nm TO 160 {mu}m: QUANTITATIVE ESTIMATES FOR PLANETARY STUDIES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Claire, Mark W.; Sheets, John; Meadows, Victoria S.
2012-09-20
Understanding changes in the solar flux over geologic time is vital for understanding the evolution of planetary atmospheres because it affects atmospheric escape and chemistry, as well as climate. We describe a numerical parameterization for wavelength-dependent changes to the non-attenuated solar flux appropriate for most times and places in the solar system. We combine data from the Sun and solar analogs to estimate enhanced UV and X-ray fluxes for the young Sun and use standard solar models to estimate changing visible and infrared fluxes. The parameterization, a series of multipliers relative to the modern top of the atmosphere flux atmore » Earth, is valid from 0.1 nm through the infrared, and from 0.6 Gyr through 6.7 Gyr, and is extended from the solar zero-age main sequence to 8.0 Gyr subject to additional uncertainties. The parameterization is applied to a representative modern day flux, providing quantitative estimates of the wavelength dependence of solar flux for paleodates relevant to the evolution of atmospheres in the solar system (or around other G-type stars). We validate the code by Monte Carlo analysis of uncertainties in stellar age and flux, and with comparisons to the solar proxies {kappa}{sup 1} Cet and EK Dra. The model is applied to the computation of photolysis rates on the Archean Earth.« less
Projected changes to growth and mortality of Hawaiian corals over the next 100 years.
Hoeke, Ron K; Jokiel, Paul L; Buddemeier, Robert W; Brainard, Russell E
2011-03-29
Recent reviews suggest that the warming and acidification of ocean surface waters predicated by most accepted climate projections will lead to mass mortality and declining calcification rates of reef-building corals. This study investigates the use of modeling techniques to quantitatively examine rates of coral cover change due to these effects. Broad-scale probabilities of change in shallow-water scleractinian coral cover in the Hawaiian Archipelago for years 2000-2099 A.D. were calculated assuming a single middle-of-the-road greenhouse gas emissions scenario. These projections were based on ensemble calculations of a growth and mortality model that used sea surface temperature (SST), atmospheric carbon dioxide (CO(2)), observed coral growth (calcification) rates, and observed mortality linked to mass coral bleaching episodes as inputs. SST and CO(2) predictions were derived from the World Climate Research Programme (WCRP) multi-model dataset, statistically downscaled with historical data. The model calculations illustrate a practical approach to systematic evaluation of climate change effects on corals, and also show the effect of uncertainties in current climate predictions and in coral adaptation capabilities on estimated changes in coral cover. Despite these large uncertainties, this analysis quantitatively illustrates that a large decline in coral cover is highly likely in the 21(st) Century, but that there are significant spatial and temporal variances in outcomes, even under a single climate change scenario.
Projected changes to growth and mortality of Hawaiian corals over the next 100 years
Hoeke, R.K.; Jokiel, P.L.; Buddemeier, R.W.; Brainard, R.E.
2011-01-01
Background: Recent reviews suggest that the warming and acidification of ocean surface waters predicated by most accepted climate projections will lead to mass mortality and declining calcification rates of reef-building corals. This study investigates the use of modeling techniques to quantitatively examine rates of coral cover change due to these effects. Methodology/Principal Findings: Broad-scale probabilities of change in shallow-water scleractinian coral cover in the Hawaiian Archipelago for years 2000-2099 A.D. were calculated assuming a single middle-of-the-road greenhouse gas emissions scenario. These projections were based on ensemble calculations of a growth and mortality model that used sea surface temperature (SST), atmospheric carbon dioxide (CO2), observed coral growth (calcification) rates, and observed mortality linked to mass coral bleaching episodes as inputs. SST and CO2 predictions were derived from the World Climate Research Programme (WCRP) multi-model dataset, statistically downscaled with historical data. Conclusions/Significance: The model calculations illustrate a practical approach to systematic evaluation of climate change effects on corals, and also show the effect of uncertainties in current climate predictions and in coral adaptation capabilities on estimated changes in coral cover. Despite these large uncertainties, this analysis quantitatively illustrates that a large decline in coral cover is highly likely in the 21st Century, but that there are significant spatial and temporal variances in outcomes, even under a single climate change scenario.
NASA Astrophysics Data System (ADS)
Ren, Luchuan
2015-04-01
A Global Sensitivity Analysis Method on Maximum Tsunami Wave Heights to Potential Seismic Source Parameters Luchuan Ren, Jianwei Tian, Mingli Hong Institute of Disaster Prevention, Sanhe, Heibei Province, 065201, P.R. China It is obvious that the uncertainties of the maximum tsunami wave heights in offshore area are partly from uncertainties of the potential seismic tsunami source parameters. A global sensitivity analysis method on the maximum tsunami wave heights to the potential seismic source parameters is put forward in this paper. The tsunami wave heights are calculated by COMCOT ( the Cornell Multi-grid Coupled Tsunami Model), on the assumption that an earthquake with magnitude MW8.0 occurred at the northern fault segment along the Manila Trench and triggered a tsunami in the South China Sea. We select the simulated results of maximum tsunami wave heights at specific sites in offshore area to verify the validity of the method proposed in this paper. For ranking importance order of the uncertainties of potential seismic source parameters (the earthquake's magnitude, the focal depth, the strike angle, dip angle and slip angle etc..) in generating uncertainties of the maximum tsunami wave heights, we chose Morris method to analyze the sensitivity of the maximum tsunami wave heights to the aforementioned parameters, and give several qualitative descriptions of nonlinear or linear effects of them on the maximum tsunami wave heights. We quantitatively analyze the sensitivity of the maximum tsunami wave heights to these parameters and the interaction effects among these parameters on the maximum tsunami wave heights by means of the extended FAST method afterward. The results shows that the maximum tsunami wave heights are very sensitive to the earthquake magnitude, followed successively by the epicenter location, the strike angle and dip angle, the interactions effect between the sensitive parameters are very obvious at specific site in offshore area, and there exist differences in importance order in generating uncertainties of the maximum tsunami wave heights for same group parameters at different specific sites in offshore area. These results are helpful to deeply understand the relationship between the tsunami wave heights and the seismic tsunami source parameters. Keywords: Global sensitivity analysis; Tsunami wave height; Potential seismic tsunami source parameter; Morris method; Extended FAST method
Revised planetary protection policy for solar system exploration.
DeVincenzi, D L; Stabekis, P D
1984-01-01
In order to control contamination of planets by terrestrial microorganisms and organic constituents, U.S. planetary missions have been governed by a planetary protection (or planetary quarantine) policy which has changed little since 1972. This policy has recently been reviewed in light of new information obtained from planetary exploration during the past decade and because of changes to, or uncertainties in, some parameters used in the existing quantitative approach. On the basis of this analysis, a revised planetary protection policy with the following key features is proposed: deemphasizing the use of mathematical models and quantitative analyses; establishing requirements for target planet/mission type (i.e., orbiter, lander, etc.) combinations; considering sample return missions a separate category; simplifying documentation; and imposing implementing procedures (i.e., trajectory biasing, cleanroom assembly, spacecraft sterilization, etc.) by exception, i.e., only if the planet/mission combination warrants such controls.
A Team Mental Model Perspective of Pre-Quantitative Risk
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
This study was conducted to better understand how teams conceptualize risk before it can be quantified, and the processes by which a team forms a shared mental model of this pre-quantitative risk. Using an extreme case, this study analyzes seven months of team meeting transcripts, covering the entire lifetime of the team. Through an analysis of team discussions, a rich and varied structural model of risk emerges that goes significantly beyond classical representations of risk as the product of a negative consequence and a probability. In addition to those two fundamental components, the team conceptualization includes the ability to influence outcomes and probabilities, networks of goals, interaction effects, and qualitative judgments about the acceptability of risk, all affected by associated uncertainties. In moving from individual to team mental models, team members employ a number of strategies to gain group recognition of risks and to resolve or accept differences.
Quantitative prediction of solvation free energy in octanol of organic compounds.
Delgado, Eduardo J; Jaña, Gonzalo A
2009-03-01
The free energy of solvation, DeltaGS0, in octanol of organic compounds is quantitatively predicted from the molecular structure. The model, involving only three molecular descriptors, is obtained by multiple linear regression analysis from a data set of 147 compounds containing diverse organic functions, namely, halogenated and non-halogenated alkanes, alkenes, alkynes, aromatics, alcohols, aldehydes, ketones, amines, ethers and esters; covering a DeltaGS0 range from about -50 to 0 kJ.mol(-1). The model predicts the free energy of solvation with a squared correlation coefficient of 0.93 and a standard deviation, 2.4 kJ.mol(-1), just marginally larger than the generally accepted value of experimental uncertainty. The involved molecular descriptors have definite physical meaning corresponding to the different intermolecular interactions occurring in the bulk liquid phase. The model is validated with an external set of 36 compounds not included in the training set.
Quantitative Prediction of Solvation Free Energy in Octanol of Organic Compounds
Delgado, Eduardo J.; Jaña, Gonzalo A.
2009-01-01
The free energy of solvation, ΔGS0, in octanol of organic compunds is quantitatively predicted from the molecular structure. The model, involving only three molecular descriptors, is obtained by multiple linear regression analysis from a data set of 147 compounds containing diverse organic functions, namely, halogenated and non-halogenated alkanes, alkenes, alkynes, aromatics, alcohols, aldehydes, ketones, amines, ethers and esters; covering a ΔGS0 range from about −50 to 0 kJ·mol−1. The model predicts the free energy of solvation with a squared correlation coefficient of 0.93 and a standard deviation, 2.4 kJ·mol−1, just marginally larger than the generally accepted value of experimental uncertainty. The involved molecular descriptors have definite physical meaning corresponding to the different intermolecular interactions occurring in the bulk liquid phase. The model is validated with an external set of 36 compounds not included in the training set. PMID:19399236
Uncertainty Budget Analysis for Dimensional Inspection Processes (U)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdez, Lucas M.
2012-07-26
This paper is intended to provide guidance and describe how to prepare an uncertainty analysis of a dimensional inspection process through the utilization of an uncertainty budget analysis. The uncertainty analysis is stated in the same methodology as that of the ISO GUM standard for calibration and testing. There is a specific distinction between how Type A and Type B uncertainty analysis is used in a general and specific process. All theory and applications are utilized to represent both a generalized approach to estimating measurement uncertainty and how to report and present these estimations for dimensional measurements in a dimensionalmore » inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.« less
The BEACH Act of 2000 directed the U.S. EPA to establish more expeditious methods for the detection of pathogen indicators in coastal waters, as well as new water quality criteria based on these methods. Progress has been made in developing a quantitative PCR (qPCR) method for en...
Natural flow regimes of the Ozark-Ouachita Interior Highlands region
Leasure, D. R.; Magoulick, Daniel D.; Longing, S. D.
2016-01-01
Natural flow regimes represent the hydrologic conditions to which native aquatic organisms are best adapted. We completed a regional river classification and quantitative descriptions of each natural flow regime for the Ozark–Ouachita Interior Highlands region of Arkansas, Missouri and Oklahoma. On the basis of daily flow records from 64 reference streams, seven natural flow regimes were identified with mixture model cluster analysis: Groundwater Stable, Groundwater, Groundwater Flashy, Perennial Runoff, Runoff Flashy, Intermittent Runoff and Intermittent Flashy. Sets of flow metrics were selected that best quantified nine ecologically important components of these natural flow regimes. An uncertainty analysis was performed to avoid selecting metrics strongly affected by measurement uncertainty that can result from short periods of record. Measurement uncertainties (bias, precision and accuracy) were assessed for 170 commonly used flow metrics. The ranges of variability expected for select flow metrics under natural conditions were quantified for each flow regime to provide a reference for future assessments of hydrologic alteration. A random forest model was used to predict the natural flow regimes of all stream segments in the study area based on climate and catchment characteristics, and a map was produced. The geographic distribution of flow regimes suggested distinct ecohydrological regions that may be useful for conservation planning. This project provides a hydrologic foundation for future examination of flow–ecology relationships in the region. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.
Poças, Maria F; Oliveira, Jorge C; Brandsch, Rainer; Hogg, Timothy
2010-07-01
The use of probabilistic approaches in exposure assessments of contaminants migrating from food packages is of increasing interest but the lack of concentration or migration data is often referred as a limitation. Data accounting for the variability and uncertainty that can be expected in migration, for example, due to heterogeneity in the packaging system, variation of the temperature along the distribution chain, and different time of consumption of each individual package, are required for probabilistic analysis. The objective of this work was to characterize quantitatively the uncertainty and variability in estimates of migration. A Monte Carlo simulation was applied to a typical solution of the Fick's law with given variability in the input parameters. The analysis was performed based on experimental data of a model system (migration of Irgafos 168 from polyethylene into isooctane) and illustrates how important sources of variability and uncertainty can be identified in order to refine analyses. For long migration times and controlled conditions of temperature the affinity of the migrant to the food can be the major factor determining the variability in the migration values (more than 70% of variance). In situations where both the time of consumption and temperature can vary, these factors can be responsible, respectively, for more than 60% and 20% of the variance in the migration estimates. The approach presented can be used with databases from consumption surveys to yield a true probabilistic estimate of exposure.
Uncertainty assessment method for the Cs-137 fallout inventory and penetration depth.
Papadakos, G N; Karangelos, D J; Petropoulos, N P; Anagnostakis, M J; Hinis, E P; Simopoulos, S E
2017-05-01
Within the presented study, soil samples were collected in year 2007 at 20 different locations of the Greek terrain, both from the surface and also from depths down to 26 cm. Sampling locations were selected primarily from areas where high levels of 137 Cs deposition after the Chernobyl accident had already been identified by the Nuclear Engineering Laboratory of the National Technical University of Athens during and after the year of 1986. At one location of relatively higher deposition, soil core samples were collected following a 60 m by 60 m Cartesian grid with a 20 m node-to-node distance. Single or pair core samples were also collected from the remaining 19 locations. Sample measurements and analysis were used to estimate 137 Cs inventory and the corresponding depth migration, twenty years after the deposition on Greek terrain. Based on these data, the uncertainty components of the whole sampling-to-results procedure were investigated. A cause-and-effect assessment process was used to apply the law of error propagation and demonstrate that the dominating significant component of the combined uncertainty is that due to the spatial variability of the contemporary (2007) 137 Cs inventory. A secondary, yet also significant component was identified to be the activity measurement process itself. Other less-significant uncertainty parameters were sampling methods, the variation in the soil field density with depth and the preparation of samples for measurement. The sampling grid experiment allowed for the quantitative evaluation of the uncertainty due to spatial variability, also by the assistance of the semivariance analysis. Denser, optimized grid could return more accurate values for this component but with a significantly elevated laboratory cost, in terms of both, human and material resources. Using the hereby collected data and for the case of a single core soil sampling using a well-defined sampling methodology quality assurance, the uncertainty component due to spatial variability was evaluated to about 19% for the 137 Cs inventory and up to 34% for the 137 Cs penetration depth. Based on the presented results and also on related literature, it is argued that such high uncertainties should be anticipated for single core samplings conducted using similar methodology and employed as 137 Cs inventory and penetration depth estimators. Copyright © 2017 Elsevier Ltd. All rights reserved.
Hydrologic modeling strategy for the Islamic Republic of Mauritania, Africa
Friedel, Michael J.
2008-01-01
The government of Mauritania is interested in how to maintain hydrologic balance to ensure a long-term stable water supply for minerals-related, domestic, and other purposes. Because of the many complicating and competing natural and anthropogenic factors, hydrologists will perform quantitative analysis with specific objectives and relevant computer models in mind. Whereas various computer models are available for studying water-resource priorities, the success of these models to provide reliable predictions largely depends on adequacy of the model-calibration process. Predictive analysis helps us evaluate the accuracy and uncertainty associated with simulated dependent variables of our calibrated model. In this report, the hydrologic modeling process is reviewed and a strategy summarized for future Mauritanian hydrologic modeling studies.
M.S.L.A.P. Modular Spectral Line Analysis Program documentation
NASA Technical Reports Server (NTRS)
Joseph, Charles L.; Jenkins, Edward B.
1991-01-01
MSLAP is a software for analyzing spectra, providing the basic structure to identify spectral features, to make quantitative measurements of this features, and to store the measurements for convenient access. MSLAP can be used to measure not only the zeroth moment (equivalent width) of a profile, but also the first and second moments. Optical depths and the corresponding column densities across the profile can be measured as well for sufficiently high resolution data. The software was developed for an interactive, graphical analysis where the computer carries most of the computational and data organizational burden and the investigator is responsible only for all judgement decisions. It employs sophisticated statistical techniques for determining the best polynomial fit to the continuum and for calculating the uncertainties.
Mazurek, Artur; Jamroz, Jerzy
2015-04-15
In food analysis, a method for determination of vitamin C should enable measuring of total content of ascorbic acid (AA) and dehydroascorbic acid (DHAA) because both chemical forms exhibit biological activity. The aim of the work was to confirm applicability of HPLC-DAD method for analysis of total content of vitamin C (TC) and ascorbic acid in various types of food by determination of validation parameters such as: selectivity, precision, accuracy, linearity and limits of detection and quantitation. The results showed that the method applied for determination of TC and AA was selective, linear and precise. Precision of DHAA determination by the subtraction method was also evaluated. It was revealed that the results of DHAA determination obtained by the subtraction method were not precise which resulted directly from the assumption of this method and the principles of uncertainty propagation. The proposed chromatographic method should be recommended for routine determinations of total vitamin C in various food. Copyright © 2014 Elsevier Ltd. All rights reserved.
Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye
2015-11-01
Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidences collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10(-6) (95% CI=7.25×10(-8) 1.92×10(-5)). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI=0.042-0.97). The expected number of CBPP infected animals exported any given year is 1.28 (95% CI=0.021-5.42). According to the risk estimate, an average of 2.73×10(6) animals (90% CI=10,674-5.9×10(6)) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI=0.24-23.18) for the first infected animal to be included in the consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk. Copyright © 2015 Elsevier B.V. All rights reserved.
CCQM-K126: low polarity organic in water: carbamazepine in surface water
NASA Astrophysics Data System (ADS)
Wai-mei Sin, Della; Wong, Yiu-chung; Lehmann, Andreas; Schneider, Rudolf J.; Kakoulides, Elias; Tang Lin, Teo; Qinde, Liu; Cabillic, Julie; Lardy-fontan, Sophie; Nammoonnoy, Jintana; Prevoo-Franzsen, Désirée; López, Eduardo Emilio; Alberti, Cecilia; Su, Fuhai
2017-01-01
The key comparison CCQM-K126 low polarity organic in water: carbamazepine in surface water was coordinated by Government Laboratory Hong Kong under the auspices of the Organic Analysis Working Group (OAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM). Eight National Metrology institutes or Designated Institutes participated and participants were requested to report the mass fraction of carbamazepine in surface water study material. The surface water sample was collected in Hong Kong and was gravimetrically spiked with standard solution. This study provided the means for assessing measurement capabilities for determination of low molecular weight analytes (mass range 100-500) and low polarity (pKOW<= -2) in aqueous matrix. Nine NMIs/DIs registered for the KC and one withdrew before results were submitted. Nine results were submitted by the eight participants. Eight results applied the LC-MS/MS method and Isotope Dilution Mass Spectrometry approach for quantification. BAM additionally submitted a result from ELISA that was not included in the key comparison reference values (KCRV) calculation as is provided in the report for information. One further result was not included as the participant withdrew their result from the calculation after further analysis. The assigned KCRV was the median of the seven results and was assigned a KCRV of 250.2 ng/kg with a combined standard uncertainty of 3.6 ng/kg, The k-factor for the estimation of the expanded uncertainty of the KCRVs was chosen as k = 2. The degree of equivalence (with the KCRV) and its uncertainty was calculated for each result. Seven of the participants were able to demonstrate the ability to quantitatively determine low-polarity analyte in aqueous matrix by applying LC-MS/MS technique at a very low level. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
Representation of analysis results involving aleatory and epistemic uncertainty.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Jay Dean; Helton, Jon Craig; Oberkampf, William Louis
2008-08-01
Procedures are described for the representation of results in analyses that involve both aleatory uncertainty and epistemic uncertainty, with aleatory uncertainty deriving from an inherent randomness in the behavior of the system under study and epistemic uncertainty deriving from a lack of knowledge about the appropriate values to use for quantities that are assumed to have fixed but poorly known values in the context of a specific study. Aleatory uncertainty is usually represented with probability and leads to cumulative distribution functions (CDFs) or complementary cumulative distribution functions (CCDFs) for analysis results of interest. Several mathematical structures are available for themore » representation of epistemic uncertainty, including interval analysis, possibility theory, evidence theory and probability theory. In the presence of epistemic uncertainty, there is not a single CDF or CCDF for a given analysis result. Rather, there is a family of CDFs and a corresponding family of CCDFs that derive from epistemic uncertainty and have an uncertainty structure that derives from the particular uncertainty structure (i.e., interval analysis, possibility theory, evidence theory, probability theory) used to represent epistemic uncertainty. Graphical formats for the representation of epistemic uncertainty in families of CDFs and CCDFs are investigated and presented for the indicated characterizations of epistemic uncertainty.« less
Automated retinal image quality assessment on the UK Biobank dataset for epidemiological studies.
Welikala, R A; Fraz, M M; Foster, P J; Whincup, P H; Rudnicka, A R; Owen, C G; Strachan, D P; Barman, S A
2016-04-01
Morphological changes in the retinal vascular network are associated with future risk of many systemic and vascular diseases. However, uncertainty over the presence and nature of some of these associations exists. Analysis of data from large population based studies will help to resolve these uncertainties. The QUARTZ (QUantitative Analysis of Retinal vessel Topology and siZe) retinal image analysis system allows automated processing of large numbers of retinal images. However, an image quality assessment module is needed to achieve full automation. In this paper, we propose such an algorithm, which uses the segmented vessel map to determine the suitability of retinal images for use in the creation of vessel morphometric data suitable for epidemiological studies. This includes an effective 3-dimensional feature set and support vector machine classification. A random subset of 800 retinal images from UK Biobank (a large prospective study of 500,000 middle aged adults; where 68,151 underwent retinal imaging) was used to examine the performance of the image quality algorithm. The algorithm achieved a sensitivity of 95.33% and a specificity of 91.13% for the detection of inadequate images. The strong performance of this image quality algorithm will make rapid automated analysis of vascular morphometry feasible on the entire UK Biobank dataset (and other large retinal datasets), with minimal operator involvement, and at low cost. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fuzzy parametric uncertainty analysis of linear dynamical systems: A surrogate modeling approach
NASA Astrophysics Data System (ADS)
Chowdhury, R.; Adhikari, S.
2012-10-01
Uncertainty propagation engineering systems possess significant computational challenges. This paper explores the possibility of using correlated function expansion based metamodelling approach when uncertain system parameters are modeled using Fuzzy variables. In particular, the application of High-Dimensional Model Representation (HDMR) is proposed for fuzzy finite element analysis of dynamical systems. The HDMR expansion is a set of quantitative model assessment and analysis tools for capturing high-dimensional input-output system behavior based on a hierarchy of functions of increasing dimensions. The input variables may be either finite-dimensional (i.e., a vector of parameters chosen from the Euclidean space RM) or may be infinite-dimensional as in the function space CM[0,1]. The computational effort to determine the expansion functions using the alpha cut method scales polynomially with the number of variables rather than exponentially. This logic is based on the fundamental assumption underlying the HDMR representation that only low-order correlations among the input variables are likely to have significant impacts upon the outputs for most high-dimensional complex systems. The proposed method is integrated with a commercial Finite Element software. Modal analysis of a simplified aircraft wing with Fuzzy parameters has been used to illustrate the generality of the proposed approach. In the numerical examples, triangular membership functions have been used and the results have been validated against direct Monte Carlo simulations.
Uncertainties in ecosystem service maps: a comparison on the European scale.
Schulp, Catharina J E; Burkhard, Benjamin; Maes, Joachim; Van Vliet, Jasper; Verburg, Peter H
2014-01-01
Safeguarding the benefits that ecosystems provide to society is increasingly included as a target in international policies. To support such policies, ecosystem service maps are made. However, there is little attention for the accuracy of these maps. We made a systematic review and quantitative comparison of ecosystem service maps on the European scale to generate insights in the uncertainty of ecosystem service maps and discuss the possibilities for quantitative validation. Maps of climate regulation and recreation were reasonably similar while large uncertainties among maps of erosion protection and flood regulation were observed. Pollination maps had a moderate similarity. Differences among the maps were caused by differences in indicator definition, level of process understanding, mapping aim, data sources and methodology. Absence of suitable observed data on ecosystem services provisioning hampers independent validation of the maps. Consequently, there are, so far, no accurate measures for ecosystem service map quality. Policy makers and other users need to be cautious when applying ecosystem service maps for decision-making. The results illustrate the need for better process understanding and data acquisition to advance ecosystem service mapping, modelling and validation.
Measurement uncertainty analysis techniques applied to PV performance measurements
NASA Astrophysics Data System (ADS)
Wells, C.
1992-10-01
The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates quality assurance and quality control measures have been accomplished; define Valid Data as data having known and documented paths of: Origin, including theory; measurements; traceability to measurement standards; computations; uncertainty analysis of results.
Numerical Simulation and Quantitative Uncertainty Assessment of Microchannel Flow
NASA Astrophysics Data System (ADS)
Debusschere, Bert; Najm, Habib; Knio, Omar; Matta, Alain; Ghanem, Roger; Le Maitre, Olivier
2002-11-01
This study investigates the effect of uncertainty in physical model parameters on computed electrokinetic flow of proteins in a microchannel with a potassium phosphate buffer. The coupled momentum, species transport, and electrostatic field equations give a detailed representation of electroosmotic and pressure-driven flow, including sample dispersion mechanisms. The chemistry model accounts for pH-dependent protein labeling reactions as well as detailed buffer electrochemistry in a mixed finite-rate/equilibrium formulation. To quantify uncertainty, the governing equations are reformulated using a pseudo-spectral stochastic methodology, which uses polynomial chaos expansions to describe uncertain/stochastic model parameters, boundary conditions, and flow quantities. Integration of the resulting equations for the spectral mode strengths gives the evolution of all stochastic modes for all variables. Results show the spatiotemporal evolution of uncertainties in predicted quantities and highlight the dominant parameters contributing to these uncertainties during various flow phases. This work is supported by DARPA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wall, Anna
With recent trends toward intermittent renewable energy sources in the U.S., the geothermal industry in its current form faces a crossroad: adapt, disrupt, or be left behind. Strategic planning with scenario analysis offers a framework to characterize plausible views of the future given current trends - as well as disruptions to the status quo. To inform strategic planning in the Department of Energy's (DOE) Geothermal Technology Office (GTO), the Geothermal Vision Study is tasked with offering data-driven pathways for future geothermal development. Scenario analysis is a commonly used tool in private industry corporate strategic planning as a way to prioritizemore » and manage large investments in light of uncertainty and risk. Since much of the uncertainty and risk in a geothermal project is believed to occur within early stage exploration and drilling, this paper focuses on the levers (technical and financial) within the exploration process that can be pulled to affect change. Given these potential changes, this work first qualitatively explores potential shifts to the geothermal industry. Future work within the Geothermal Vision Study will incorporate these qualitative scenarios quantitatively, in competition with other renewable and conventional energy industries.« less
An Analysis of Categorical and Quantitative Methods for Planning Under Uncertainty
Langlotz, Curtis P.; Shortliffe, Edward H.
1988-01-01
Decision theory and logical reasoning are both methods for representing and solving medical decision problems. We analyze the usefulness of these two approaches to medical therapy planning by establishing a simple correspondence between decision theory and non-monotonic logic, a formalization of categorical logical reasoning. The analysis indicates that categorical approaches to planning can be viewed as comprising two decision-theoretic concepts: probabilities (degrees of belief in planning hypotheses) and utilities (degrees of desirability of planning outcomes). We present and discuss examples of the following lessons from this decision-theoretic view of categorical (nonmonotonic) reasoning: (1) Decision theory and artificial intelligence techniques are intended to solve different components of the planning problem. (2) When considered in the context of planning under uncertainty, nonmonotonic logics do not retain the domain-independent characteristics of classical logical reasoning for planning under certainty. (3) Because certain nonmonotonic programming paradigms (e.g., frame-based inheritance, rule-based planning, protocol-based reminders) are inherently problem-specific, they may be inappropriate to employ in the solution of certain types of planning problems. We discuss how these conclusions affect several current medical informatics research issues, including the construction of “very large” medical knowledge bases.
Tolkoff, Max R; Alfaro, Michael E; Baele, Guy; Lemey, Philippe; Suchard, Marc A
2018-05-01
Phylogenetic comparative methods explore the relationships between quantitative traits adjusting for shared evolutionary history. This adjustment often occurs through a Brownian diffusion process along the branches of the phylogeny that generates model residuals or the traits themselves. For high-dimensional traits, inferring all pair-wise correlations within the multivariate diffusion is limiting. To circumvent this problem, we propose phylogenetic factor analysis (PFA) that assumes a small unknown number of independent evolutionary factors arise along the phylogeny and these factors generate clusters of dependent traits. Set in a Bayesian framework, PFA provides measures of uncertainty on the factor number and groupings, combines both continuous and discrete traits, integrates over missing measurements and incorporates phylogenetic uncertainty with the help of molecular sequences. We develop Gibbs samplers based on dynamic programming to estimate the PFA posterior distribution, over 3-fold faster than for multivariate diffusion and a further order-of-magnitude more efficiently in the presence of latent traits. We further propose a novel marginal likelihood estimator for previously impractical models with discrete data and find that PFA also provides a better fit than multivariate diffusion in evolutionary questions in columbine flower development, placental reproduction transitions and triggerfish fin morphometry.
Facing uncertainty in ecosystem services-based resource management.
Grêt-Regamey, Adrienne; Brunner, Sibyl H; Altwegg, Jürg; Bebi, Peter
2013-09-01
The concept of ecosystem services is increasingly used as a support for natural resource management decisions. While the science for assessing ecosystem services is improving, appropriate methods to address uncertainties in a quantitative manner are missing. Ignoring parameter uncertainties, modeling uncertainties and uncertainties related to human-environment interactions can modify decisions and lead to overlooking important management possibilities. In this contribution, we present a new approach for mapping the uncertainties in the assessment of multiple ecosystem services. The spatially explicit risk approach links Bayesian networks to a Geographic Information System for forecasting the value of a bundle of ecosystem services and quantifies the uncertainties related to the outcomes in a spatially explicit manner. We demonstrate that mapping uncertainties in ecosystem services assessments provides key information for decision-makers seeking critical areas in the delivery of ecosystem services in a case study in the Swiss Alps. The results suggest that not only the total value of the bundle of ecosystem services is highly dependent on uncertainties, but the spatial pattern of the ecosystem services values changes substantially when considering uncertainties. This is particularly important for the long-term management of mountain forest ecosystems, which have long rotation stands and are highly sensitive to pressing climate and socio-economic changes. Copyright © 2012 Elsevier Ltd. All rights reserved.
Shi, Lei; Shuai, Jian; Xu, Kui
2014-08-15
Fire and explosion accidents of steel oil storage tanks (FEASOST) occur occasionally during the petroleum and chemical industry production and storage processes and often have devastating impact on lives, the environment and property. To contribute towards the development of a quantitative approach for assessing the occurrence probability of FEASOST, a fault tree of FEASOST is constructed that identifies various potential causes. Traditional fault tree analysis (FTA) can achieve quantitative evaluation if the failure data of all of the basic events (BEs) are available, which is almost impossible due to the lack of detailed data, as well as other uncertainties. This paper makes an attempt to perform FTA of FEASOST by a hybrid application between an expert elicitation based improved analysis hierarchy process (AHP) and fuzzy set theory, and the occurrence possibility of FEASOST is estimated for an oil depot in China. A comparison between statistical data and calculated data using fuzzy fault tree analysis (FFTA) based on traditional and improved AHP is also made. Sensitivity and importance analysis has been performed to identify the most crucial BEs leading to FEASOST that will provide insights into how managers should focus effective mitigation. Copyright © 2014 Elsevier B.V. All rights reserved.
Methylcyclopentadienyl manganese tricarbonyl: health risk uncertainties and research directions.
Davis, J M
1998-01-01
With the way cleared for increased use of the fuel additive methylcyclopentadienyl manganese tricarbonyl (MMT) in the United States, the issue of possible public health impacts associated with this additive has gained greater attention. In assessing potential health risks of particulate Mn emitted from the combustion of MMT in gasoline, the U.S. Environmental Protection Agency not only considered the qualitative types of toxic effects associated with inhaled Mn, but conducted extensive exposure-response analyses using various statistical approaches and also estimated population exposure distributions of particulate Mn based on data from an exposure study conducted in California when MMT was used in leaded gasoline. Because of limitations in available data and the need to make several assumptions and extrapolations, the resulting risk characterization had inherent uncertainties that made it impossible to estimate health risks in a definitive or quantitative manner. To support an improved health risk characterization, further investigation is needed in the areas of health effects, emission characterization, and exposure analysis. PMID:9539013
A statistical approach to the brittle fracture of a multi-phase solid
NASA Technical Reports Server (NTRS)
Liu, W. K.; Lua, Y. I.; Belytschko, T.
1991-01-01
A stochastic damage model is proposed to quantify the inherent statistical distribution of the fracture toughness of a brittle, multi-phase solid. The model, based on the macrocrack-microcrack interaction, incorporates uncertainties in locations and orientations of microcracks. Due to the high concentration of microcracks near the macro-tip, a higher order analysis based on traction boundary integral equations is formulated first for an arbitrary array of cracks. The effects of uncertainties in locations and orientations of microcracks at a macro-tip are analyzed quantitatively by using the boundary integral equations method in conjunction with the computer simulation of the random microcrack array. The short range interactions resulting from surrounding microcracks closet to the main crack tip are investigated. The effects of microcrack density parameter are also explored in the present study. The validity of the present model is demonstrated by comparing its statistical output with the Neville distribution function, which gives correct fits to sets of experimental data from multi-phase solids.
The redoubtable ecological periodic table
Ecological periodic tables are repositories of reliable information on quantitative, predictably recurring (periodic) habitat–community patterns and their uncertainty, scaling and transferability. Their reliability derives from their grounding in sound ecological principle...
Uncertainties have a meaning: Information entropy as a quality measure for 3-D geological models
NASA Astrophysics Data System (ADS)
Wellmann, J. Florian; Regenauer-Lieb, Klaus
2012-03-01
Analyzing, visualizing and communicating uncertainties are important issues as geological models can never be fully determined. To date, there exists no general approach to quantify uncertainties in geological modeling. We propose here to use information entropy as an objective measure to compare and evaluate model and observational results. Information entropy was introduced in the 50s and defines a scalar value at every location in the model for predictability. We show that this method not only provides a quantitative insight into model uncertainties but, due to the underlying concept of information entropy, can be related to questions of data integration (i.e. how is the model quality interconnected with the used input data) and model evolution (i.e. does new data - or a changed geological hypothesis - optimize the model). In other words information entropy is a powerful measure to be used for data assimilation and inversion. As a first test of feasibility, we present the application of the new method to the visualization of uncertainties in geological models, here understood as structural representations of the subsurface. Applying the concept of information entropy on a suite of simulated models, we can clearly identify (a) uncertain regions within the model, even for complex geometries; (b) the overall uncertainty of a geological unit, which is, for example, of great relevance in any type of resource estimation; (c) a mean entropy for the whole model, important to track model changes with one overall measure. These results cannot easily be obtained with existing standard methods. The results suggest that information entropy is a powerful method to visualize uncertainties in geological models, and to classify the indefiniteness of single units and the mean entropy of a model quantitatively. Due to the relationship of this measure to the missing information, we expect the method to have a great potential in many types of geoscientific data assimilation problems — beyond pure visualization.
Uncertainties in cylindrical anode current inferences on pulsed power drivers
NASA Astrophysics Data System (ADS)
Porwitzky, Andrew; Brown, Justin
2018-06-01
For over a decade, velocimetry based techniques have been used to infer the electrical current delivered to dynamic materials properties experiments on pulsed power drivers such as the Z Machine. Though originally developed for planar load geometries, in recent years, inferring the current delivered to cylindrical coaxial loads has become a valuable diagnostic tool for numerous platforms. Presented is a summary of uncertainties that can propagate through the current inference technique when applied to expanding cylindrical anodes. An equation representing quantitative uncertainty is developed which shows the unfold method to be accurate to a few percent above 10 MA of load current.
Impact of petrophysical uncertainty on Bayesian hydrogeophysical inversion and model selection
NASA Astrophysics Data System (ADS)
Brunetti, Carlotta; Linde, Niklas
2018-01-01
Quantitative hydrogeophysical studies rely heavily on petrophysical relationships that link geophysical properties to hydrogeological properties and state variables. Coupled inversion studies are frequently based on the questionable assumption that these relationships are perfect (i.e., no scatter). Using synthetic examples and crosshole ground-penetrating radar (GPR) data from the South Oyster Bacterial Transport Site in Virginia, USA, we investigate the impact of spatially-correlated petrophysical uncertainty on inferred posterior porosity and hydraulic conductivity distributions and on Bayes factors used in Bayesian model selection. Our study shows that accounting for petrophysical uncertainty in the inversion (I) decreases bias of the inferred variance of hydrogeological subsurface properties, (II) provides more realistic uncertainty assessment and (III) reduces the overconfidence in the ability of geophysical data to falsify conceptual hydrogeological models.
Uncertainties of Mayak urine data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Guthrie; Vostrotin, Vadim; Vvdensky, Vladimir
2008-01-01
For internal dose calculations for the Mayak worker epidemiological study, quantitative estimates of uncertainty of the urine measurements are necessary. Some of the data consist of measurements of 24h urine excretion on successive days (e.g. 3 or 4 days). In a recent publication, dose calculations were done where the uncertainty of the urine measurements was estimated starting from the statistical standard deviation of these replicate mesurements. This approach is straightforward and accurate when the number of replicate measurements is large, however, a Monte Carlo study showed it to be problematic for the actual number of replicate measurements (median from 3more » to 4). Also, it is sometimes important to characterize the uncertainty of a single urine measurement. Therefore this alternate method has been developed. A method of parameterizing the uncertainty of Mayak urine bioassay measmements is described. The Poisson lognormal model is assumed and data from 63 cases (1099 urine measurements in all) are used to empirically determine the lognormal normalization uncertainty, given the measurement uncertainties obtained from count quantities. The natural logarithm of the geometric standard deviation of the normalization uncertainty is found to be in the range 0.31 to 0.35 including a measurement component estimated to be 0.2.« less
Uncertainty dimension and basin entropy in relativistic chaotic scattering
NASA Astrophysics Data System (ADS)
Bernal, Juan D.; Seoane, Jesús M.; Sanjuán, Miguel A. F.
2018-04-01
Chaotic scattering is an important topic in nonlinear dynamics and chaos with applications in several fields in physics and engineering. The study of this phenomenon in relativistic systems has received little attention as compared to the Newtonian case. Here we focus our work on the study of some relevant characteristics of the exit basin topology in the relativistic Hénon-Heiles system: the uncertainty dimension, the Wada property, and the basin entropy. Our main findings for the uncertainty dimension show two different behaviors insofar as we change the relativistic parameter β , in which a crossover behavior is uncovered. This crossover point is related with the disappearance of KAM islands in phase space, which happens for velocity values above the ultrarelativistic limit, v >0.1 c . This result is supported by numerical simulations and by qualitative analysis, which are in good agreement. On the other hand, the computation of the exit basins in the phase space suggests the existence of Wada basins for a range of β <0.625 . We also studied the evolution of the exit basins in a quantitative manner by computing the basin entropy, which shows a maximum value for β ≈0.2 . This last quantity is related to the uncertainty in the prediction of the final fate of the system. Finally, our work is relevant in galactic dynamics, and it also has important implications in other topics in physics such as as in the Störmer problem, among others.
Past and present cosmic structure in the SDSS DR7 main sample
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jasche, J.; Leclercq, F.; Wandelt, B.D., E-mail: jasche@iap.fr, E-mail: florent.leclercq@polytechnique.org, E-mail: wandelt@iap.fr
2015-01-01
We present a chrono-cosmography project, aiming at the inference of the four dimensional formation history of the observed large scale structure from its origin to the present epoch. To do so, we perform a full-scale Bayesian analysis of the northern galactic cap of the Sloan Digital Sky Survey (SDSS) Data Release 7 main galaxy sample, relying on a fully probabilistic, physical model of the non-linearly evolved density field. Besides inferring initial conditions from observations, our methodology naturally and accurately reconstructs non-linear features at the present epoch, such as walls and filaments, corresponding to high-order correlation functions generated by late-time structuremore » formation. Our inference framework self-consistently accounts for typical observational systematic and statistical uncertainties such as noise, survey geometry and selection effects. We further account for luminosity dependent galaxy biases and automatic noise calibration within a fully Bayesian approach. As a result, this analysis provides highly-detailed and accurate reconstructions of the present density field on scales larger than ∼ 3 Mpc/h, constrained by SDSS observations. This approach also leads to the first quantitative inference of plausible formation histories of the dynamic large scale structure underlying the observed galaxy distribution. The results described in this work constitute the first full Bayesian non-linear analysis of the cosmic large scale structure with the demonstrated capability of uncertainty quantification. Some of these results will be made publicly available along with this work. The level of detail of inferred results and the high degree of control on observational uncertainties pave the path towards high precision chrono-cosmography, the subject of simultaneously studying the dynamics and the morphology of the inhomogeneous Universe.« less
Rish, William R; Pfau, Edward J
2018-04-01
A bounding risk assessment is presented that evaluates possible human health risk from a hypothetical scenario involving a 10,000-gallon release of flowback water from horizontal fracturing of Marcellus Shale. The water is assumed to be spilled on the ground, infiltrates into groundwater that is a source of drinking water, and an adult and child located downgradient drink the groundwater. Key uncertainties in estimating risk are given explicit quantitative treatment using Monte Carlo analysis. Chemicals that contribute significantly to estimated health risks are identified, as are key uncertainties and variables to which risk estimates are sensitive. The results show that hypothetical exposure via drinking water impacted by chemicals in Marcellus Shale flowback water, assumed to be spilled onto the ground surface, results in predicted bounds between 10 -10 and 10 -6 (for both adult and child receptors) for excess lifetime cancer risk. Cumulative hazard indices (HI CUMULATIVE ) resulting from these hypothetical exposures have predicted bounds (5th to 95th percentile) between 0.02 and 35 for assumed adult receptors and 0.1 and 146 for assumed child receptors. Predicted health risks are dominated by noncancer endpoints related to ingestion of barium and lithium in impacted groundwater. Hazard indices above unity are largely related to exposure to lithium. Salinity taste thresholds are likely to be exceeded before drinking water exposures result in adverse health effects. The findings provide focus for policy discussions concerning flowback water risk management. They also indicate ways to improve the ability to estimate health risks from drinking water impacted by a flowback water spill (i.e., reducing uncertainty). © 2017 Society for Risk Analysis.
Templeton, A. R.; Sing, C. F.
1993-01-01
We previously developed an analytical strategy based on cladistic theory to identify subsets of haplotypes that are associated with significant phenotypic deviations. Our initial approach was limited to segments of DNA in which little recombination occurs. In such cases, a cladogram can be constructed from the restriction site data to estimate the evolutionary steps that interrelate the observed haplotypes to one another. The cladogram is then used to define a nested statistical design for identifying mutational steps associated with significant phenotypic deviations. The central assumption behind this strategy is that a mutation responsible for a particular phenotypic effect is embedded within the evolutionary history that is represented by the cladogram. The power of this approach depends on the accuracy of the cladogram in portraying the evolutionary history of the DNA region. This accuracy can be diminished both by recombination and by uncertainty in the estimated cladogram topology. In a previous paper, we presented an algorithm for estimating the set of likely cladograms and recombination events. In this paper we present an algorithm for defining a nested statistical design under cladogram uncertainty and recombination. Given the nested design, phenotypic associations can be examined using either a nested analysis of variance (for haploids or homozygous strains) or permutation testing (for outcrossed, diploid gene regions). In this paper we also extend this analytical strategy to include categorical phenotypes in addition to quantitative phenotypes. Some worked examples are presented using Drosophila data sets. These examples illustrate that having some recombination may actually enhance the biological inferences that may derived from a cladistic analysis. In particular, recombination can be used to assign a physical localization to a given subregion for mutations responsible for significant phenotypic effects. PMID:8100789
Valese, Andressa Camargo; Molognoni, Luciano; de Souza, Naielly Coelho; de Sá Ploêncio, Leandro Antunes; Costa, Ana Carolina Oliveira; Barreto, Fabiano; Daguer, Heitor
2017-05-15
A sensitive method for the simultaneous residues analysis of 62 veterinary drugs in feeds by liquid chromatography-tandem mass spectrometry has been developed and validated in accordance to Commission Decision 657/2002/EC. Additionally, limits of detection (LOD), limits of quantitation (LOQ), matrix effects and measurement uncertainty were also assessed. Extraction was performed for all analytes and respective internal standards in a single step and chromatographic separation was achieved in only 12min. LOQ were set to 0.63-5.00μgkg -1 (amphenicols), 0.63-30.00μgkg -1 (avermectins), 0.63μgkg -1 (benzimidazoles), 0.25-200.00μgkg -1 (coccidiostats), 0.63-200.00μgkg -1 (lincosamides and macrolides), 0.25-5.00μgkg -1 (nitrofurans), 0.63-20.00μgkg -1 (fluoroquinolones and quinolones), 15.00μgkg -1 (quinoxaline), 0.63-7.50μgkg -1 (sulfonamides), 0.63-20.00μgkg -1 (tetracyclines), 0.25μgkg -1 (β-agonists), and 30.00μgkg -1 (β-lactams). The top-down approach was adequate for the calculation of measurement uncertainty for all analytes, except the banned substances, which should be rather assessed by the bottom-up approach. Routine analysis of different types of feeds was then carried out. An interesting profile of residues of veterinary drugs among samples was revealed, enlightening the need for stricter control in producing animals. Among the total of 27 feed samples, 20 analytes could be detected/quantified, ranging from trace levels to very high concentrations. A high throughput screening/confirmatory method for the residue analysis of several veterinary drugs in feeds was proposed as a helpful control tool. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dai, H.; Chen, X.; Ye, M.; Song, X.; Zachara, J. M.
2016-12-01
Sensitivity analysis has been an important tool in groundwater modeling to identify the influential parameters. Among various sensitivity analysis methods, the variance-based global sensitivity analysis has gained popularity for its model independence characteristic and capability of providing accurate sensitivity measurements. However, the conventional variance-based method only considers uncertainty contribution of single model parameters. In this research, we extended the variance-based method to consider more uncertainty sources and developed a new framework to allow flexible combinations of different uncertainty components. We decompose the uncertainty sources into a hierarchical three-layer structure: scenario, model and parametric. Furthermore, each layer of uncertainty source is capable of containing multiple components. An uncertainty and sensitivity analysis framework was then constructed following this three-layer structure using Bayesian network. Different uncertainty components are represented as uncertain nodes in this network. Through the framework, variance-based sensitivity analysis can be implemented with great flexibility of using different grouping strategies for uncertainty components. The variance-based sensitivity analysis thus is improved to be able to investigate the importance of an extended range of uncertainty sources: scenario, model, and other different combinations of uncertainty components which can represent certain key model system processes (e.g., groundwater recharge process, flow reactive transport process). For test and demonstration purposes, the developed methodology was implemented into a test case of real-world groundwater reactive transport modeling with various uncertainty sources. The results demonstrate that the new sensitivity analysis method is able to estimate accurate importance measurements for any uncertainty sources which were formed by different combinations of uncertainty components. The new methodology can provide useful information for environmental management and decision-makers to formulate policies and strategies.
NASA Astrophysics Data System (ADS)
Lloveras, Diego G.; Vásquez, Alberto M.; Nuevo, Federico A.; Frazin, Richard A.
2017-10-01
Using differential emission measure tomography (DEMT) based on time series of EUV images, we carry out a quantitative comparative analysis of the three-dimensional (3D) structure of the electron density and temperature of the inner corona (r<1.25 R_{⊙}) between two specific rotations selected from the last two solar minima, namely Carrington Rotations (CR)1915 and CR-2081. The analysis places error bars on the results because of the systematic uncertainty of the sources. While the results for CR-2081 are characterized by a remarkable north-south symmetry, the southern hemisphere for CR-1915 exhibits higher densities and temperatures than the northern hemisphere. The core region of the streamer belt in both rotations is found to be populated by structures whose temperature decreases with height (called "down loops" in our previous articles). They are characterized by plasma β≳1, and may be the result of the efficient dissipation of Alfvén waves at low coronal heights. The comparative analysis reveals that the low latitudes of the equatorial streamer belt of CR-1915 exhibit higher densities than for CR-2081. This cannot be explained by the systematic uncertainties. In addition, the southern hemisphere of the streamer belt of CR-1915 is characterized by higher temperatures and density scale heights than for CR-2081. On the other hand, the coronal hole region of CR-1915 shows lower temperatures than for CR-2081. The reported differences are in the range ≈ 10 - 25%, depending on the specific physical quantity and region that is compared, as fully detailed in the analysis. For other regions and/or physical quantities, the uncertainties do not allow assessing the thermodynamical differences between the two rotations. Future investigation will involve a DEMT analysis of other Carrington rotations selected from both epochs, and also a comparison of their tomographic reconstructions with magnetohydrodynamical simulations of the inner corona.
ACCOUNTING FOR CALIBRATION UNCERTAINTIES IN X-RAY ANALYSIS: EFFECTIVE AREAS IN SPECTRAL FITTING
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Hyunsook; Kashyap, Vinay L.; Drake, Jeremy J.
2011-04-20
While considerable advance has been made to account for statistical uncertainties in astronomical analyses, systematic instrumental uncertainties have been generally ignored. This can be crucial to a proper interpretation of analysis results because instrumental calibration uncertainty is a form of systematic uncertainty. Ignoring it can underestimate error bars and introduce bias into the fitted values of model parameters. Accounting for such uncertainties currently requires extensive case-specific simulations if using existing analysis packages. Here, we present general statistical methods that incorporate calibration uncertainties into spectral analysis of high-energy data. We first present a method based on multiple imputation that can bemore » applied with any fitting method, but is necessarily approximate. We then describe a more exact Bayesian approach that works in conjunction with a Markov chain Monte Carlo based fitting. We explore methods for improving computational efficiency, and in particular detail a method of summarizing calibration uncertainties with a principal component analysis of samples of plausible calibration files. This method is implemented using recently codified Chandra effective area uncertainties for low-resolution spectral analysis and is verified using both simulated and actual Chandra data. Our procedure for incorporating effective area uncertainty is easily generalized to other types of calibration uncertainties.« less
Impact of uncertainty on modeling and testing
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Brown, Kendall K.
1995-01-01
A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) Engine will greatly aid decisions concerning hardware performance and future development efforts. This report will describe the determination of the uncertainties in the modeling and testing of the Space Shuttle Main Engine test program at the Technology Test Bed facility at Marshall Space Flight Center. Section 2 will present a summary of the uncertainty analysis methodology used and discuss the specific applications to the TTB SSME test program. Section 3 will discuss the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relative to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentations, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.
Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco
2018-01-01
One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for evaluating, controlling, and reporting of risks that may affects method capabilities and supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the tool hazard analysis and critical control points. This tool provides the possibility to find the steps in an analytical procedure with higher impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality control laboratory: It is laborious, time consuming, semi-quantitative, and requires a radioisotope. Along with dot-blot hybridization, two alternatives techniques were evaluated: threshold analysis and quantitative polymerase chain reaction. Quality risk management tools were applied to compare the techniques, taking into account the uncertainties, the possibility of circumstances or future events, and their effects upon method performance. By illustrating the application of these tools with DNA methods, we provide an example of how they can be used to support a scientific and practical approach to decision making and can assess and manage method performance risk using such tools. This paper discusses, considering the principles of quality risk management, an additional approach to the development and selection of analytical quality control methods using the risk analysis tool hazard analysis and critical control points. This tool provides the possibility to find the method procedural steps with higher impact on method reliability (called critical control points). Our model concluded that the radioactive dot-blot assay has the larger number of critical control points, followed by quantitative polymerase chain reaction and threshold analysis. Quantitative polymerase chain reaction is shown to be the better alternative analytical methodology in residual cellular DNA analysis. © PDA, Inc. 2018.
Spectral optimization and uncertainty quantification in combustion modeling
NASA Astrophysics Data System (ADS)
Sheen, David Allan
Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. Frequently, new data will become available, and it will be desirable to know the effect that inclusion of these data has on the optimized model. Two cases are considered here. In the first, a study of H2/CO mass burning rates has recently been published, wherein the experimentally-obtained results could not be reconciled with any extant H2/CO oxidation model. It is shown in that an optimized H2/CO model can be developed that will reproduce the results of the new experimental measurements. In addition, the high precision of the new experiments provide a strong constraint on the reaction rate parameters of the chemistry model, manifested in a significant improvement in the precision of simulations. In the second case, species time histories were measured during n-heptane oxidation behind reflected shock waves. The highly precise nature of these measurements is expected to impose critical constraints on chemical kinetic models of hydrocarbon combustion. The results show that while an as-compiled, prior reaction model of n-alkane combustion can be accurate in its prediction of the detailed species profiles, the kinetic parameter uncertainty in the model remains to be too large to obtain a precise prediction of the data. Constraining the prior model against the species time histories within the measurement uncertainties led to notable improvements in the precision of model predictions against the species data as well as the global combustion properties considered. Lastly, we show that while the capability of the multispecies measurement presents a step-change in our precise knowledge of the chemical processes in hydrocarbon combustion, accurate data of global combustion properties are still necessary to predict fuel combustion.
Determination of Uncertainties for the New SSME Model
NASA Technical Reports Server (NTRS)
Coleman, Hugh W.; Hawk, Clark W.
1996-01-01
This report discusses the uncertainty analysis performed in support of a new test analysis and performance prediction model for the Space Shuttle Main Engine. The new model utilizes uncertainty estimates for experimental data and for the analytical model to obtain the most plausible operating condition for the engine system. This report discusses the development of the data sets and uncertainty estimates to be used in the development of the new model. It also presents the application of uncertainty analysis to analytical models and the uncertainty analysis for the conservation of mass and energy balance relations is presented. A new methodology for the assessment of the uncertainty associated with linear regressions is presented.
NASA Astrophysics Data System (ADS)
Herring, T.; Cey, E. E.; Pidlisecky, A.
2017-12-01
Time-lapse electrical resistivity tomography (ERT) is used to image changes in subsurface electrical conductivity (EC), e.g. due to a saline contaminant plume. Temperature variation also produces an EC response, which interferes with the signal of interest. Temperature compensation requires the temperature distribution and the relationship between EC and temperature, but this relationship at subzero temperatures is not well defined. The goal of this study is to examine how uncertainty in the subzero EC/temperature relationship manifests in temperature corrected ERT images, especially with respect to relevant plume parameters (location, contaminant mass, etc.). First, a lab experiment was performed to determine the EC of fine-grained glass beads over a range of temperatures (-20° to 20° C) and saturations. The measured EC/temperature relationship was then used to add temperature effects to a hypothetical EC model of a conductive plume. Forward simulations yielded synthetic field data to which temperature corrections were applied. Varying the temperature/EC relationship used in the temperature correction and comparing the temperature corrected ERT results to the synthetic model enabled a quantitative analysis of the error of plume parameters associated with temperature variability. Modeling possible scenarios in this way helps to establish the feasibility of different time-lapse ERT applications by quantifying the uncertainty associated with parameter(s) of interest.
NASA Astrophysics Data System (ADS)
Reynders, Edwin; Maes, Kristof; Lombaert, Geert; De Roeck, Guido
2016-01-01
Identified modal characteristics are often used as a basis for the calibration and validation of dynamic structural models, for structural control, for structural health monitoring, etc. It is therefore important to know their accuracy. In this article, a method for estimating the (co)variance of modal characteristics that are identified with the stochastic subspace identification method is validated for two civil engineering structures. The first structure is a damaged prestressed concrete bridge for which acceleration and dynamic strain data were measured in 36 different setups. The second structure is a mid-rise building for which acceleration data were measured in 10 different setups. There is a good quantitative agreement between the predicted levels of uncertainty and the observed variability of the eigenfrequencies and damping ratios between the different setups. The method can therefore be used with confidence for quantifying the uncertainty of the identified modal characteristics, also when some or all of them are estimated from a single batch of vibration data. Furthermore, the method is seen to yield valuable insight in the variability of the estimation accuracy from mode to mode and from setup to setup: the more informative a setup is regarding an estimated modal characteristic, the smaller is the estimated variance.
NASA Astrophysics Data System (ADS)
Bode, Felix; Ferré, Ty; Zigelli, Niklas; Emmert, Martin; Nowak, Wolfgang
2018-03-01
Collaboration between academics and practitioners promotes knowledge transfer between research and industry, with both sides benefiting greatly. However, academic approaches are often not feasible given real-world limits on time, cost and data availability, especially for risk and uncertainty analyses. Although the need for uncertainty quantification and risk assessment are clear, there are few published studies examining how scientific methods can be used in practice. In this work, we introduce possible strategies for transferring and communicating academic approaches to real-world applications, countering the current disconnect between increasingly sophisticated academic methods and methods that work and are accepted in practice. We analyze a collaboration between academics and water suppliers in Germany who wanted to design optimal groundwater monitoring networks for drinking-water well catchments. Our key conclusions are: to prefer multiobjective over single-objective optimization; to replace Monte-Carlo analyses by scenario methods; and to replace data-hungry quantitative risk assessment by easy-to-communicate qualitative methods. For improved communication, it is critical to set up common glossaries of terms to avoid misunderstandings, use striking visualization to communicate key concepts, and jointly and continually revisit the project objectives. Ultimately, these approaches and recommendations are simple and utilitarian enough to be transferred directly to other practical water resource related problems.
Hallifax, D; Houston, J B
2009-03-01
Mechanistic prediction of unbound drug clearance from human hepatic microsomes and hepatocytes correlates with in vivo clearance but is both systematically low (10 - 20 % of in vivo clearance) and highly variable, based on detailed assessments of published studies. Metabolic capacity (Vmax) of commercially available human hepatic microsomes and cryopreserved hepatocytes is log-normally distributed within wide (30 - 150-fold) ranges; Km is also log-normally distributed and effectively independent of Vmax, implying considerable variability in intrinsic clearance. Despite wide overlap, average capacity is 2 - 20-fold (dependent on P450 enzyme) greater in microsomes than hepatocytes, when both are normalised (scaled to whole liver). The in vitro ranges contrast with relatively narrow ranges of clearance among clinical studies. The high in vitro variation probably reflects unresolved phenotypical variability among liver donors and practicalities in processing of human liver into in vitro systems. A significant contribution from the latter is supported by evidence of low reproducibility (several fold) of activity in cryopreserved hepatocytes and microsomes prepared from the same cells, between separate occasions of thawing of cells from the same liver. The large uncertainty which exists in human hepatic in vitro systems appears to dominate the overall uncertainty of in vitro-in vivo extrapolation, including uncertainties within scaling, modelling and drug dependent effects. As such, any notion of quantitative prediction of clearance appears severely challenged.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fayerweather, W.E.
2007-07-01
The study's objectives were to update Partanen's and Boffetta's 1994 meta-analysis of lung cancer among roofing and paving asphalt workers and explore the role of coal tar in explaining the statistical heterogeneity among these studies. Information retrieval strategies and eligibility criteria were defined for identifying the epidemiologic studies to be included in the analysis. The relative risk ratio (RR) for lung cancer was selected as the effect measure of interest. Coal tar bias factors were developed and used to externally adjust each eligible study's published RR for confounding by coal tar. The meta-Relative Risk (meta-RR) and its variance were estimatedmore » by general variance-based methods. Heterogeneity of the RRs was assessed by heterogeneity chi-square and I{sup 2} tests. The results from this update were similar to those in Partanen's and Boffetta's original meta-analysis. Although the meta-RRs for the roofers and the pavers were no longer statistically significantly different from one another, significant heterogeneity remained within each of the coal tar-adjusted sectors. Meta-analysis of non-experimental epidemiologic studies is subject to significant uncertainties as is externally correcting studies for confounding. Given these uncertainties, the specific quantitative estimates in this (or any similar) analysis must be viewed with caution. Nevertheless, this analysis provides support for the hypothesis proposed by several major reviewers that confounding by coal tar-related PAH exposures may explain most or all of the lung cancer risks found in the epidemiologic literature on asphalt roofing and paving workers.« less
Uncertainties in future-proof decision-making: the Dutch Delta Model
NASA Astrophysics Data System (ADS)
IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik
2013-04-01
In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died of the resulting floods, 1800 of them being Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works that strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.) , the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to be prepared for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence and recreation . To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results flow into effect models that give insight in the integrated effects on freshwater supply (including navigation, industry and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization and in the model itself. This showed that for different parts of the Netherlands, the total uncertainty is in the order of meters! Nevertheless, (1) the total uncertainty is dominated by uncertainties in boundary conditions. Internal model uncertainties are subordinate to that. Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes might be uncertain, but the outcomes of different model runs are reliable relative to each other. The Delta Model therefore is a reliable instrument for finding the optimal water management policy for the future. As the exact model outcomes show a high degree of uncertainty, the model analysis will be on a large numbers of model runs to gain insight in the sensitivity of the model for different setups and boundary conditions. The results allow fast investigation of (relative) effects of measures. Furthermore, it helps to identify bottlenecks in the system. To summarize, the Delta Model is a tool for policy makers to base their policy strategies on quantitative rather than qualitative information. It can be applied to the current and future situation, and feeds the political discussion. The uncertainty of the model has no determinative effect on the analysis that can be done by the Delta Model.
NASA Astrophysics Data System (ADS)
Akram, Muhammad Farooq Bin
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to- be designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to inherent uncertain nature of expert elicitation in technology valuation process, appropriate uncertainty quantification and propagation is very critical. The uncertainty in defining the impact of an input on performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign the appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously on a system, often their cumulative impact is non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that in case of lack of knowledge about the problem, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, when experts are forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In present research it is proposed that interval analysis and Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in technology valuation process. Proposed technique seeks to offset some of the problems faced by using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert elicitation based technology synergy matrices (TSM). Proposed TSMs increase the fidelity of current technology forecasting methods by including higher order technology interactions. A test case for quantification of epistemic uncertainty on a large scale problem of combined cycle power generation system was selected. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that evidence theory based technique provides more insight on the uncertainties arising from incomplete information or lack of knowledge as compared to deterministic or probability theory methods. Margin analysis was also carried out for both the techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher order interactions, which can be applied according to the expert opinion or historical data. The introduction of technology synergy matrix enabled capturing the higher order technology interactions, and improvement in predicted system performance.
Transfer of uncertainty of space-borne high resolution rainfall products at ungauged regions
NASA Astrophysics Data System (ADS)
Tang, Ling
Hydrologically relevant characteristics of high resolution (˜ 0.25 degree, 3 hourly) satellite rainfall uncertainty were derived as a function of season and location using a six year (2002-2007) archive of National Aeronautics and Space Administration (NASA)'s Tropical Rainfall Measuring Mission (TRMM) Multi-satellite Precipitation Analysis (TMPA) precipitation data. The Next Generation Radar (NEXRAD) Stage IV rainfall data over the continental United States was used as ground validation (GV) data. A geostatistical mapping scheme was developed and tested for transfer (i.e., spatial interpolation) of uncertainty information from GV regions to the vast non-GV regions by leveraging the error characterization work carried out in the earlier step. The open question explored here was, "If 'error' is defined on the basis of independent ground validation (GV) data, how are error metrics estimated for a satellite rainfall data product without the need for much extensive GV data?" After a quantitative analysis of the spatial and temporal structure of the satellite rainfall uncertainty, a proof-of-concept geostatistical mapping scheme (based on the kriging method) was evaluated. The idea was to understand how realistic the idea of 'transfer' is for the GPM era. It was found that it was indeed technically possible to transfer error metrics from a gauged to an ungauged location for certain error metrics and that a regionalized error metric scheme for GPM may be possible. The uncertainty transfer scheme based on a commonly used kriging method (ordinary kriging) was then assessed further at various timescales (climatologic, seasonal, monthly and weekly), and as a function of the density of GV coverage. The results indicated that if a transfer scheme for estimating uncertainty metrics was finer than seasonal scale (ranging from 3-6 hourly to weekly-monthly), the effectiveness for uncertainty transfer worsened significantly. Next, a comprehensive assessment of different kriging methods for spatial transfer (interpolation) of error metrics was performed. Three kriging methods for spatial interpolation are compared, which are: ordinary kriging (OK), indicator kriging (IK) and disjunctive kriging (DK). Additional comparison with the simple inverse distance weighting (IDW) method was also performed to quantify the added benefit (if any) of using geostatistical methods. The overall performance ranking of the kriging methods was found to be as follows: OK=DK > IDW > IK. Lastly, various metrics of satellite rainfall uncertainty were identified for two large continental landmasses that share many similar Koppen climate zones, United States and Australia. The dependence of uncertainty as a function of gauge density was then investigated. The investigation revealed that only the first and second ordered moments of error are most amenable to a Koppen-type climate type classification in different continental landmasses.
NASA Astrophysics Data System (ADS)
Cobden, L. J.
2017-12-01
Mineral physics provides the essential link between seismic observations of the Earth's interior, and laboratory (or computer-simulated) measurements of rock properties. In this presentation I will outline the procedure for quantitative conversion from thermochemical structure to seismic structure (and vice versa) using the latest datasets from seismology and mineralogy. I will show examples of how this method can allow us to infer major chemical and dynamic properties of the deep mantle. I will also indicate where uncertainties and limitations in the data require us to exercise caution, in order not to "over-interpret" seismic observations. Understanding and modelling these uncertainties serves as a useful guide for mineralogists to ascertain which mineral parameters are most useful in seismic interpretation, and enables seismologists to optimise their data assembly and inversions for quantitative interpretations.
From intuition to statistics in building subsurface structural models
Brandenburg, J.P.; Alpak, F.O.; Naruk, S.; Solum, J.
2011-01-01
Experts associated with the oil and gas exploration industry suggest that combining forward trishear models with stochastic global optimization algorithms allows a quantitative assessment of the uncertainty associated with a given structural model. The methodology is applied to incompletely imaged structures related to deepwater hydrocarbon reservoirs and results are compared to prior manual palinspastic restorations and borehole data. This methodology is also useful for extending structural interpretations into other areas of limited resolution, such as subsalt in addition to extrapolating existing data into seismic data gaps. This technique can be used for rapid reservoir appraisal and potentially have other applications for seismic processing, well planning, and borehole stability analysis.
NASA Astrophysics Data System (ADS)
Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.
2017-12-01
Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis only considers parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is tangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
NASA Astrophysics Data System (ADS)
Komorowski, Jean-Christophe; Hincks, Thea; Sparks, Steve; Aspinall, Willy; Legendre, Yoann; Boudon, Georges
2013-04-01
Since 1992, mild but persistent seismic and fumarolic unrest at La Soufrière de Guadeloupe volcano has prompted renewed concern about hazards and risks, crisis response planning, and has rejuvenated interest in geological studies. Scientists monitoring active volcanoes frequently have to provide science-based decision support to civil authorities during such periods of unrest. In these circumstances, the Bayesian Belief Network (BBN) offers a formalized evidence analysis tool for making inferences about the state of the volcano from different strands of data, allowing associated uncertainties to be treated in a rational and auditable manner, to the extent warranted by the strength of the evidence. To illustrate the principles of the BBN approach, a retrospective analysis is undertaken of the 1975-77 crisis, providing an inferential assessment of the evolving state of the magmatic system and the probability of subsequent eruption. Conditional dependencies and parameters in the BBN are characterized quantitatively by structured expert elicitation. Revisiting data available in 1976 suggests the probability of magmatic intrusion would have been evaluated high at the time, according with subsequent thinking about the volcanological nature of the episode. The corresponding probability of a magmatic eruption therefore would have been elevated in July and August 1976; however, collective uncertainty about the future course of the crisis was great at the time, even if some individual opinions were certain. From this BBN analysis, while the more likely appraised outcome - based on observational trends at 31 August 1976 - might have been 'no eruption' (mean probability 0.5; 5-95 percentile range 0.8), an imminent magmatic eruption (or blast) could have had a probability of about 0.4, almost as substantial. Thus, there was no real scientific basis to assert one scenario was more likely than the other. This retrospective evaluation adds objective probabilistic expression to the contemporary volcanological narrative, and demonstrates that a formal evidential case could have been made to support the authorities' concerns and decision to evacuate. Revisiting the circumstances of the 1976 crisis highlights many contemporary challenges of decision-making under conditions of volcanological uncertainty. We suggest the BBN concept is a suitable framework for marshalling multiple observations, model results and interpretations - and all associated uncertainties - in a methodical manner. Base-rate eruption probabilities for Guadeloupe can be updated now with a new chronology of activity suggesting that 10 major explosive phases and 9 dome-forming phases occurred in the last 9150 years, associated with ≥ 8 flank-collapses and ≥ 6-7 high-energy pyroclastic density currents (blasts). Eruptive recurrence, magnitude and intensity place quantitative constraints on La Soufrière's event tree to elaborate credible scenarios. The current unrest offers an opportunity to update the BBN model and explore the uncertainty on inferences about the system's internal state. This probabilistic formalism would provoke key questions relating to unrest evolution: 1) is the unrest hydrothermal or magmatic? 2) what controls dyke/intrusion arrest and hence failed-magmatic eruptions like 1976? 3) what conditions could lead to significant pressurization with potential for explosive activity and edifice instability, and what monitoring signs might be manifest?
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
A Framework to Determine New System Requirements Under Design Parameter and Demand Uncertainties
2015-04-30
relegates quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the...quantitative complexities of decision-making to the method and designates trade-space exploration to the practitioner. We demonstrate the approach...play a critical role in determining new system requirements. Scope and Method of Approach The early stages of the design process have substantial
MO-E-BRE-01: Determination, Minimization and Communication of Uncertainties in Radiation Therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Dyk, J; Palta, J; Bortfeld, T
2014-06-15
Medical Physicists have a general understanding of uncertainties in the radiation treatment process, both with respect to dosimetry and geometry. However, there is a desire to be more quantitative about uncertainty estimation. A recent International Atomic Energy Agency (IAEA) report (about to be published) recommends that we should be as “accurate as reasonably achievable, technical and biological factors being taken into account”. Thus, a single recommendation as a goal for accuracy in radiation therapy is an oversimplification. That report also suggests that individual clinics should determine their own level of uncertainties for their specific treatment protocols. The question is “howmore » do we implement this in clinical practice”? AAPM Monograph 35 (2011 AAPM Summer School) addressed many specific aspects of uncertainties in each of the steps of a course of radiation treatment. The intent of this symposium is: (1) to review uncertainty considerations in the entire radiation treatment process including uncertainty determination for each step and uncertainty propagation for the total process, (2) to consider aspects of robust optimization which optimizes treatment plans while protecting them against uncertainties, and (3) to describe various methods of displaying uncertainties and communicating uncertainties to the relevant professionals. While the theoretical and research aspects will also be described, the emphasis will be on the practical considerations for the medical physicist in clinical practice. Learning Objectives: To review uncertainty determination in the overall radiation treatment process. To consider uncertainty modeling and uncertainty propagation. To highlight the basic ideas and clinical potential of robust optimization procedures to generate optimal treatment plans that are not severely affected by uncertainties. To describe methods of uncertainty communication and display.« less
Van Straten, S K; Xu, M; Rayne, S R
2017-06-01
Breast cancer is a leading cause of morbidity and mortality in South African women. In resource-limited settings emphasis for disease management is often concentrated on biological control and survival. However, understanding the full biopsychosocial experience of breast cancer is essential in improving access and patient uptake of care. A quantitative cross-sectional study was carried out in patients prior to breast surgery. Each participant completed the survey including validated questionnaires of uncertainty, QoL index, social support scale and demographics. Of the 59 women approached, 53 (89.9%) participated. Uncertainty was found in 86.8% (28.3% severe uncertainty) with all newly-diagnosed patients experiencing uncertainty. Patients above 45 years made up 80% of all those who were severely uncertain. Good social support did not affect levels of uncertainty. Conversely QoL was improved in women with at least primary education, and in women above 45 years. Pre-surgical chemotherapy was not associated with either uncertainty or QoL. Greatest uncertainty was reported about the roles of the treating staff and the presence of unanswered questions. Older women and those with education more commonly experienced uncertainty, but reported better QoL. The areas of uncertainty can help direct clinicians in limited resources settings to better direct services to help support patients, instituting simple measures of education and orientation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Distler, T. M.; Wong, C. M.
Runoff-water samples for the first, third, and fourth quarters of 1975 were analyzed for pesticide residues at LLL and independently by the LFE Environmental Analysis Laboratories. For the compounds analyzed, upper limits to possible contamination were placed conservatively at the low parts-per-billion level. In addition, soil samples were also analyzed. Future work will continue to include quarterly sampling and will be broadened in scope to include quantitative analysis of a larger number of compounds. A study of recovery efficiency is planned. Because of the high backgrounds on soil samples together with the uncertainties introduced by the cleanup procedures, there ismore » little hope of evaluating the distribution of a complex mixture of pesticides among the aqueous and solid phases in a drainage sample. No further sampling of soil from the streambed is therefore contemplated.« less
Bayesian evidence computation for model selection in non-linear geoacoustic inference problems.
Dettmer, Jan; Dosso, Stan E; Osler, John C
2010-12-01
This paper applies a general Bayesian inference approach, based on Bayesian evidence computation, to geoacoustic inversion of interface-wave dispersion data. Quantitative model selection is carried out by computing the evidence (normalizing constants) for several model parameterizations using annealed importance sampling. The resulting posterior probability density estimate is compared to estimates obtained from Metropolis-Hastings sampling to ensure consistent results. The approach is applied to invert interface-wave dispersion data collected on the Scotian Shelf, off the east coast of Canada for the sediment shear-wave velocity profile. Results are consistent with previous work on these data but extend the analysis to a rigorous approach including model selection and uncertainty analysis. The results are also consistent with core samples and seismic reflection measurements carried out in the area.
Mortier, Séverine Thérèse F C; Van Bockstal, Pieter-Jan; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2016-06-01
Large molecules, such as biopharmaceuticals, are considered the key driver of growth for the pharmaceutical industry. Freeze-drying is the preferred way to stabilise these products when needed. However, it is an expensive, inefficient, time- and energy-consuming process. During freeze-drying, there are only two main process variables to be set, i.e. the shelf temperature and the chamber pressure, however preferably in a dynamic way. This manuscript focuses on the essential use of uncertainty analysis for the determination and experimental verification of the dynamic primary drying Design Space for pharmaceutical freeze-drying. Traditionally, the chamber pressure and shelf temperature are kept constant during primary drying, leading to less optimal process conditions. In this paper it is demonstrated how a mechanistic model of the primary drying step gives the opportunity to determine the optimal dynamic values for both process variables during processing, resulting in a dynamic Design Space with a well-known risk of failure. This allows running the primary drying process step as time efficient as possible, hereby guaranteeing that the temperature at the sublimation front does not exceed the collapse temperature. The Design Space is the multidimensional combination and interaction of input variables and process parameters leading to the expected product specifications with a controlled (i.e., high) probability. Therefore, inclusion of parameter uncertainty is an essential part in the definition of the Design Space, although it is often neglected. To quantitatively assess the inherent uncertainty on the parameters of the mechanistic model, an uncertainty analysis was performed to establish the borders of the dynamic Design Space, i.e. a time-varying shelf temperature and chamber pressure, associated with a specific risk of failure. A risk of failure acceptance level of 0.01%, i.e. a 'zero-failure' situation, results in an increased primary drying process time compared to the deterministic dynamic Design Space; however, the risk of failure is under control. Experimental verification revealed that only a risk of failure acceptance level of 0.01% yielded a guaranteed zero-defect quality end-product. The computed process settings with a risk of failure acceptance level of 0.01% resulted in a decrease of more than half of the primary drying time in comparison with a regular, conservative cycle with fixed settings. Copyright © 2016. Published by Elsevier B.V.
Forward and backward uncertainty propagation: an oxidation ditch modelling example.
Abusam, A; Keesman, K J; van Straten, G
2003-01-01
In the field of water technology, forward uncertainty propagation is frequently used, whereas backward uncertainty propagation is rarely used. In forward uncertainty analysis, one moves from a given (or assumed) parameter subspace towards the corresponding distribution of the output or objective function. However, in the backward uncertainty propagation, one moves in the reverse direction, from the distribution function towards the parameter subspace. Backward uncertainty propagation, which is a generalisation of parameter estimation error analysis, gives information essential for designing experimental or monitoring programmes, and for tighter bounding of parameter uncertainty intervals. The procedure of carrying out backward uncertainty propagation is illustrated in this technical note by working example for an oxidation ditch wastewater treatment plant. Results obtained have demonstrated that essential information can be achieved by carrying out backward uncertainty propagation analysis.
Poet, Torka S; Timchalk, Charles; Bartels, Michael J; Smith, Jordan N; McDougal, Robin; Juberg, Daland R; Price, Paul S
2017-06-01
A physiologically based pharmacokinetic and pharmacodynamic (PBPK/PD) model combined with Monte Carlo analysis of inter-individual variation was used to assess the effects of the insecticide, chlorpyrifos and its active metabolite, chlorpyrifos oxon in humans. The PBPK/PD model has previously been validated and used to describe physiological changes in typical individuals as they grow from birth to adulthood. This model was updated to include physiological and metabolic changes that occur with pregnancy. The model was then used to assess the impact of inter-individual variability in physiology and biochemistry on predictions of internal dose metrics and quantitatively assess the impact of major sources of parameter uncertainty and biological diversity on the pharmacodynamics of red blood cell acetylcholinesterase inhibition. These metrics were determined in potentially sensitive populations of infants, adult women, pregnant women, and a combined population of adult men and women. The parameters primarily responsible for inter-individual variation in RBC acetylcholinesterase inhibition were related to metabolic clearance of CPF and CPF-oxon. Data Derived Extrapolation Factors that address intra-species physiology and biochemistry to replace uncertainty factors with quantitative differences in metrics were developed in these same populations. The DDEFs were less than 4 for all populations. These data and modeling approach will be useful in ongoing and future human health risk assessments for CPF and could be used for other chemicals with potential human exposure. Copyright © 2017 Elsevier Inc. All rights reserved.
Persistence and uncertainty in the academic career
Petersen, Alexander M.; Riccaboni, Massimo; Stanley, H. Eugene; Pammolli, Fabio
2012-01-01
Understanding how institutional changes within academia may affect the overall potential of science requires a better quantitative representation of how careers evolve over time. Because knowledge spillovers, cumulative advantage, competition, and collaboration are distinctive features of the academic profession, both the employment relationship and the procedures for assigning recognition and allocating funding should be designed to account for these factors. We study the annual production ni(t) of a given scientist i by analyzing longitudinal career data for 200 leading scientists and 100 assistant professors from the physics community. Our empirical analysis of individual productivity dynamics shows that (i) there are increasing returns for the top individuals within the competitive cohort, and that (ii) the distribution of production growth is a leptokurtic “tent-shaped” distribution that is remarkably symmetric. Our methodology is general, and we speculate that similar features appear in other disciplines where academic publication is essential and collaboration is a key feature. We introduce a model of proportional growth which reproduces these two observations, and additionally accounts for the significantly right-skewed distributions of career longevity and achievement in science. Using this theoretical model, we show that short-term contracts can amplify the effects of competition and uncertainty making careers more vulnerable to early termination, not necessarily due to lack of individual talent and persistence, but because of random negative production shocks. We show that fluctuations in scientific production are quantitatively related to a scientist’s collaboration radius and team efficiency. PMID:22431620
Nelson, Michael A; Bedner, Mary; Lang, Brian E; Toman, Blaza; Lippa, Katrice A
2015-11-01
Given the critical role of pure, organic compound primary reference standards used to characterize and certify chemical Certified Reference Materials (CRMs), it is essential that associated mass purity assessments be fit-for-purpose, represented by an appropriate uncertainty interval, and metrologically sound. The mass fraction purities (% g/g) of 25-hydroxyvitamin D (25(OH)D) reference standards used to produce and certify values for clinical vitamin D metabolite CRMs were investigated by multiple orthogonal quantitative measurement techniques. Quantitative (1)H-nuclear magnetic resonance spectroscopy (qNMR) was performed to establish traceability of these materials to the International System of Units (SI) and to directly assess the principal analyte species. The 25(OH)D standards contained volatile and water impurities, as well as structurally-related impurities that are difficult to observe by chromatographic methods or to distinguish from the principal 25(OH)D species by one-dimensional NMR. These impurities have the potential to introduce significant biases to purity investigations in which a limited number of measurands are quantified. Combining complementary information from multiple analytical methods, using both direct and indirect measurement techniques, enabled mitigation of these biases. Purities of 25(OH)D reference standards and associated uncertainties were determined using frequentist and Bayesian statistical models to combine data acquired via qNMR, liquid chromatography with UV absorbance and atmospheric pressure-chemical ionization mass spectrometric detection (LC-UV, LC-ACPI-MS), thermogravimetric analysis (TGA), and Karl Fischer (KF) titration.
PROBABILISTIC RISK ANALYSIS OF RADIOACTIVE WASTE DISPOSALS - a case study
NASA Astrophysics Data System (ADS)
Trinchero, P.; Delos, A.; Tartakovsky, D. M.; Fernandez-Garcia, D.; Bolster, D.; Dentz, M.; Sanchez-Vila, X.; Molinero, J.
2009-12-01
The storage of contaminant material in superficial or sub-superficial repositories, such as tailing piles for mine waste or disposal sites for low and intermediate nuclear waste, poses a potential threat for the surrounding biosphere. The minimization of these risks can be achieved by supporting decision-makers with quantitative tools capable to incorporate all source of uncertainty within a rigorous probabilistic framework. A case study is presented where we assess the risks associated to the superficial storage of hazardous waste close to a populated area. The intrinsic complexity of the problem, involving many events with different spatial and time scales and many uncertainty parameters is overcome by using a formal PRA (probabilistic risk assessment) procedure that allows decomposing the system into a number of key events. Hence, the failure of the system is directly linked to the potential contamination of one of the three main receptors: the underlying karst aquifer, a superficial stream that flows near the storage piles and a protection area surrounding a number of wells used for water supply. The minimal cut sets leading to the failure of the system are obtained by defining a fault-tree that incorporates different events including the failure of the engineered system (e.g. cover of the piles) and the failure of the geological barrier (e.g. clay layer that separates the bottom of the pile from the karst formation). Finally the probability of failure is quantitatively assessed combining individual independent or conditional probabilities that are computed numerically or borrowed from reliability database.
Uncertainty in accounting for carbon accumulation following forest harvesting
NASA Astrophysics Data System (ADS)
Lilly, P.; Yanai, R. D.; Arthur, M. A.; Bae, K.; Hamburg, S.; Levine, C. R.; Vadeboncoeur, M. A.
2014-12-01
Tree biomass and forest soils are both difficult to quantify with confidence, for different reasons. Forest biomass is estimated non-destructively using allometric equations, often from other sites; these equations are difficult to validate. Forest soils are destructively sampled, resulting in little measurement error at a point, but with large sampling error in heterogeneous soil environments, such as in soils developed on glacial till. In this study, we report C contents of biomass and soil pools in northern hardwood stands in replicate plots within replicate stands in 3 age classes following clearcut harvesting (14-19 yr, 26-29 yr, and > 100 yr) at the Bartlett Experimental Forest, USA. The rate of C accumulation in aboveground biomass was ~3 Mg/ha/yr between the young and mid-aged stands and <1 Mg/ha/yr between the mid-aged and mature stands. We propagated model uncertainty through allometric equations, and found errors ranging from 3-7%, depending on the stand. The variation in biomass among plots within stands (6-19%) was always larger than the allometric uncertainties. Soils were described by quantitative soil pits in three plots per stand, excavated by depth increment to the C horizon. Variation in soil mass among pits within stands averaged 28% (coefficient of variation); variation among stands within an age class ranged from 9-25%. Variation in carbon concentrations averaged 27%, mainly because the depth increments contained varying proportions of genetic horizons, in the upper part of the soil profile. Differences across age classes in soil C were not significant, because of the high variability. Uncertainty analysis can help direct the design of monitoring schemes to achieve the greatest confidence in C stores per unit of sampling effort. In the system we studied, more extensive sampling would be the best approach to reducing uncertainty, as natural spatial variation was higher than model or measurement uncertainties.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matney, J; Lian, J; Chera, B
2015-06-15
Introduction: Geometric uncertainties in daily patient setup can lead to variations in the planned dose, especially when using highly conformal techniques such as helical Tomotherapy. To account for the potential effect of geometric uncertainty, our clinical practice is to expand critical structures by 3mm expansion into planning risk volumes (PRV). The PRV concept assumes the spatial dose cloud is insensitive to patient positioning. However, no tools currently exist to determine if a Tomotherapy plan is robust to the effects of daily setup variation. We objectively quantified the impact of geometric uncertainties on the 3D doses to critical normal tissues duringmore » helical Tomotherapy. Methods: Using a Matlab-based program created and validated by Accuray (Madison, WI), the planned Tomotherapy delivery sinogram recalculated dose on shifted CT datasets. Ten head and neck patients were selected for analysis. To simulate setup uncertainty, the patient anatomy was shifted ±3mm in the longitudinal, lateral and vertical axes. For each potential shift, the recalculated doses to various critical normal tissues were compared to the doses delivered to the PRV in the original plan Results: 18 shifted scenarios created from Tomotherapy plans for three patients with head and neck cancers were analyzed. For all simulated setup errors, the maximum doses to the brainstem, spinal cord, parotids and cochlea were no greater than 0.6Gy of the respective original PRV maximum. Despite 3mm setup shifts, the minimum dose delivered to 95% of the CTVs and PTVs were always within 0.4Gy of the original plan. Conclusions: For head and neck sites treated with Tomotherapy, the use of a 3mm PRV expansion provide a reasonable estimate of the dosimetric effects of 3mm setup uncertainties. Similarly, target coverage appears minimally effected by a 3mm setup uncertainty. Data from a larger number of patients will be presented. Future work will include other anatomical sites.« less
NASA Astrophysics Data System (ADS)
Möbius, E.; Bzowski, M.; Frisch, P. C.; Fuselier, S. A.; Heirtzler, D.; Kubiak, M. A.; Kucharek, H.; Lee, M. A.; Leonard, T.; McComas, D. J.; Schwadron, N. A.; Sokół, J. M.; Swaczyna, P.; Wurz, P.
2015-10-01
The Interstellar Boundary Explorer (IBEX) samples the interstellar neutral (ISN) gas flow of several species every year from December through late March when the Earth moves into the incoming flow. The first quantitative analyses of these data resulted in a narrow tube in four-dimensional interstellar parameter space, which couples speed, flow latitude, flow longitude, and temperature, and center values with approximately 3° larger longitude and 3 km s-1 lower speed, but with temperatures similar to those obtained from observations by the Ulysses spacecraft. IBEX has now recorded six years of ISN flow observations, providing a large database over increasing solar activity and using varying viewing strategies. In this paper, we evaluate systematic effects that are important for the ISN flow vector and temperature determination. We find that all models in use return ISN parameters well within the observational uncertainties and that the derived ISN flow direction is resilient against uncertainties in the ionization rate. We establish observationally an effective IBEX-Lo pointing uncertainty of ±0.°18 in spin angle and confirm an uncertainty of ±0.°1 in longitude. We also show that the IBEX viewing strategy with different spin-axis orientations minimizes the impact of several systematic uncertainties, and thus improves the robustness of the measurement. The Helium Warm Breeze has likely contributed substantially to the somewhat different center values of the ISN flow vector. By separating the flow vector and temperature determination, we can mitigate these effects on the analysis, which returns an ISN flow vector very close to the Ulysses results, but with a substantially higher temperature. Due to coupling with the ISN flow speed along the ISN parameter tube, we provide the temperature {T}{VISN∞ }=8710+440/-680 K for {V}{ISN∞ }=26 {km} {{{s}}}-1 for comparison, where most of the uncertainty is systematic and likely due to the presence of the Warm Breeze.
Pretest uncertainty analysis for chemical rocket engine tests
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.
1987-01-01
A parametric pretest uncertainty analysis has been performed for a chemical rocket engine test at a unique 1000:1 area ratio altitude test facility. Results from the parametric study provide the error limits required in order to maintain a maximum uncertainty of 1 percent on specific impulse. Equations used in the uncertainty analysis are presented.
Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.
Kobayashi, Kenji; Hsu, Ming
2017-07-19
Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors 0270-6474/17/376972-11$15.00/0.
Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty
2017-01-01
Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because of multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model where agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully, and use these observations to reduce uncertainty about consequences of actions. Importantly, uncertainty should not be reduced purely based on how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasioptimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. PMID:28626019
Uncertainties in estimating health risks associated with exposure to ionising radiation.
Preston, R Julian; Boice, John D; Brill, A Bertrand; Chakraborty, Ranajit; Conolly, Rory; Hoffman, F Owen; Hornung, Richard W; Kocher, David C; Land, Charles E; Shore, Roy E; Woloschak, Gayle E
2013-09-01
The information for the present discussion on the uncertainties associated with estimation of radiation risks and probability of disease causation was assembled for the recently published NCRP Report No. 171 on this topic. This memorandum provides a timely overview of the topic, given that quantitative uncertainty analysis is the state of the art in health risk assessment and given its potential importance to developments in radiation protection. Over the past decade the increasing volume of epidemiology data and the supporting radiobiology findings have aided in the reduction of uncertainty in the risk estimates derived. However, it is equally apparent that there remain significant uncertainties related to dose assessment, low dose and low dose-rate extrapolation approaches (e.g. the selection of an appropriate dose and dose-rate effectiveness factor), the biological effectiveness where considerations of the health effects of high-LET and lower-energy low-LET radiations are required and the transfer of risks from a population for which health effects data are available to one for which such data are not available. The impact of radiation on human health has focused in recent years on cancer, although there has been a decided increase in the data for noncancer effects together with more reliable estimates of the risk following radiation exposure, even at relatively low doses (notably for cataracts and cardiovascular disease). New approaches for the estimation of hereditary risk have been developed with the use of human data whenever feasible, although the current estimates of heritable radiation effects still are based on mouse data because of an absence of effects in human studies. Uncertainties associated with estimation of these different types of health effects are discussed in a qualitative and semi-quantitative manner as appropriate. The way forward would seem to require additional epidemiological studies, especially studies of low dose and low dose-rate occupational and perhaps environmental exposures and for exposures to x rays and high-LET radiations used in medicine. The development of models for more reliably combining the epidemiology data with experimental laboratory animal and cellular data can enhance the overall risk assessment approach by providing biologically refined data to strengthen the estimation of effects at low doses as opposed to the sole use of mathematical models of epidemiological data that are primarily driven by medium/high doses. NASA's approach to radiation protection for astronauts, although a unique occupational group, indicates the possible applicability of estimates of risk and their uncertainty in a broader context for developing recommendations on: (1) dose limits for occupational exposure and exposure of members of the public; (2) criteria to limit exposures of workers and members of the public to radon and its short-lived decay products; and (3) the dosimetric quantity (effective dose) used in radiation protection.
Conclusions on measurement uncertainty in microbiology.
Forster, Lynne I
2009-01-01
Since its first issue in 1999, testing laboratories wishing to comply with all the requirements of ISO/IEC 17025 have been collecting data for estimating uncertainty of measurement for quantitative determinations. In the microbiological field of testing, some debate has arisen as to whether uncertainty needs to be estimated for each method performed in the laboratory for each type of sample matrix tested. Queries also arise concerning the estimation of uncertainty when plate/membrane filter colony counts are below recommended method counting range limits. A selection of water samples (with low to high contamination) was tested in replicate with the associated uncertainty of measurement being estimated from the analytical results obtained. The analyses performed on the water samples included total coliforms, fecal coliforms, fecal streptococci by membrane filtration, and heterotrophic plate counts by the pour plate technique. For those samples where plate/membrane filter colony counts were > or =20, uncertainty estimates at a 95% confidence level were very similar for the methods, being estimated as 0.13, 0.14, 0.14, and 0.12, respectively. For those samples where plate/membrane filter colony counts were <20, estimated uncertainty values for each sample showed close agreement with published confidence limits established using a Poisson distribution approach.
Towards traceability in CO2 line strength measurements by TDLAS at 2.7 µm
NASA Astrophysics Data System (ADS)
Pogány, Andrea; Ott, Oliver; Werhahn, Olav; Ebert, Volker
2013-11-01
Direct tunable diode laser absorption spectroscopy (TDLAS) was combined in this study with metrological principles on the determination of uncertainties to measure the line strengths of the P36e and P34e line of 12C16O2 in the ν1+ν3 band at 2.7 μm. Special emphasis was put on traceability and a concise, well-documented uncertainty assessment. We have quantitatively analyzed the uncertainty contributions of different experimental parameters to the uncertainty of the line strength. Establishment of the wavenumber axis and the gas handling procedure proved to be the two major contributors to the final uncertainty. The obtained line strengths at 296 K are 1.593×10-20 cm/molecule for the P36e and 1.981×10-20 cm/molecule for the P34e line, with relative expanded uncertainties of 1.1% and 1.3%, respectively (k=2, corresponding to a 95% confidence level). The measured line strength values are in agreement with literature data (line strengths listed in the HITRAN and GEISA databases), but show an uncertainty, which is at least a factor of 2 lower.
NASA Astrophysics Data System (ADS)
Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.
2017-09-01
There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the cost of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering 4 sources of uncertainty and ambiguity: Scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end and the spot (horizon) discount rate drops below 2% by 2100 and drops below previous estimates by 2070.
Mapping (dis)agreement in hydrologic projections
NASA Astrophysics Data System (ADS)
Melsen, Lieke A.; Addor, Nans; Mizukami, Naoki; Newman, Andrew J.; Torfs, Paul J. J. F.; Clark, Martyn P.; Uijlenhoet, Remko; Teuling, Adriaan J.
2018-03-01
Hydrologic projections are of vital socio-economic importance. However, they are also prone to uncertainty. In order to establish a meaningful range of storylines to support water managers in decision making, we need to reveal the relevant sources of uncertainty. Here, we systematically and extensively investigate uncertainty in hydrologic projections for 605 basins throughout the contiguous US. We show that in the majority of the basins, the sign of change in average annual runoff and discharge timing for the period 2070-2100 compared to 1985-2008 differs among combinations of climate models, hydrologic models, and parameters. Mapping the results revealed that different sources of uncertainty dominate in different regions. Hydrologic model induced uncertainty in the sign of change in mean runoff was related to snow processes and aridity, whereas uncertainty in both mean runoff and discharge timing induced by the climate models was related to disagreement among the models regarding the change in precipitation. Overall, disagreement on the sign of change was more widespread for the mean runoff than for the discharge timing. The results demonstrate the need to define a wide range of quantitative hydrologic storylines, including parameter, hydrologic model, and climate model forcing uncertainty, to support water resource planning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bao, C.; Hanany, S.; Baccigalupi, C.
We extend a general maximum likelihood foreground estimation for cosmic microwave background (CMB) polarization data to include estimation of instrumental systematic effects. We focus on two particular effects: frequency band measurement uncertainty and instrumentally induced frequency dependent polarization rotation. We assess the bias induced on the estimation of the B-mode polarization signal by these two systematic effects in the presence of instrumental noise and uncertainties in the polarization and spectral index of Galactic dust. Degeneracies between uncertainties in the band and polarization angle calibration measurements and in the dust spectral index and polarization increase the uncertainty in the extracted CMBmore » B-mode power, and may give rise to a biased estimate. We provide a quantitative assessment of the potential bias and increased uncertainty in an example experimental configuration. For example, we find that with 10% polarized dust, a tensor to scalar ratio of r = 0.05, and the instrumental configuration of the E and B experiment balloon payload, the estimated CMB B-mode power spectrum is recovered without bias when the frequency band measurement has 5% uncertainty or less, and the polarization angle calibration has an uncertainty of up to 4°.« less
Uncertainties in stormwater runoff data collection from a small urban catchment, Southeast China.
Huang, Jinliang; Tu, Zhenshun; Du, Pengfei; Lin, Jie; Li, Qingsheng
2010-01-01
Monitoring data are often used to identify stormwater runoff characteristics and in stormwater runoff modelling without consideration of their inherent uncertainties. Integrated with discrete sample analysis and error propagation analysis, this study attempted to quantify the uncertainties of discrete chemical oxygen demand (COD), total suspended solids (TSS) concentration, stormwater flowrate, stormwater event volumes, COD event mean concentration (EMC), and COD event loads in terms of flow measurement, sample collection, storage and laboratory analysis. The results showed that the uncertainties due to sample collection, storage and laboratory analysis of COD from stormwater runoff are 13.99%, 19.48% and 12.28%. Meanwhile, flow measurement uncertainty was 12.82%, and the sample collection uncertainty of TSS from stormwater runoff was 31.63%. Based on the law of propagation of uncertainties, the uncertainties regarding event flow volume, COD EMC and COD event loads were quantified as 7.03%, 10.26% and 18.47%.
NASA Astrophysics Data System (ADS)
Andrade, João Rodrigo; Martins, Ramon Silva; Thompson, Roney Leon; Mompean, Gilmar; da Silveira Neto, Aristeu
2018-04-01
The present paper provides an analysis of the statistical uncertainties associated with direct numerical simulation (DNS) results and experimental data for turbulent channel and pipe flows, showing a new physically based quantification of these errors, to improve the determination of the statistical deviations between DNSs and experiments. The analysis is carried out using a recently proposed criterion by Thompson et al. ["A methodology to evaluate statistical errors in DNS data of plane channel flows," Comput. Fluids 130, 1-7 (2016)] for fully turbulent plane channel flows, where the mean velocity error is estimated by considering the Reynolds stress tensor, and using the balance of the mean force equation. It also presents how the residual error evolves in time for a DNS of a plane channel flow, and the influence of the Reynolds number on its convergence rate. The root mean square of the residual error is shown in order to capture a single quantitative value of the error associated with the dimensionless averaging time. The evolution in time of the error norm is compared with the final error provided by DNS data of similar Reynolds numbers available in the literature. A direct consequence of this approach is that it was possible to compare different numerical results and experimental data, providing an improved understanding of the convergence of the statistical quantities in turbulent wall-bounded flows.
Wu, Bing; Zhang, Yan; Zhang, Xu-Xiang; Cheng, Shu-Pei
2011-12-01
A carcinogenic risk assessment of polycyclic aromatic hydrocarbons (PAHs) in source water and drinking water of China was conducted using probabilistic techniques from a national perspective. The published monitoring data of PAHs were gathered and converted into BaP equivalent (BaP(eq)) concentrations. Based on the transformed data, comprehensive risk assessment was performed by considering different age groups and exposure pathways. Monte Carlo simulation and sensitivity analysis were applied to quantify uncertainties of risk estimation. The risk analysis indicated that, the risk values for children and teens were lower than the accepted value (1.00E-05), indicating no significant carcinogenic risk. The probability of risk values above 1.00E-05 was 5.8% and 6.7% for adults and lifetime groups, respectively. Overall, carcinogenic risks of PAHs in source water and drinking water of China were mostly accepted. However, specific regions, such as Yellow river of Lanzhou reach and Qiantang river should be paid more attention. Notwithstanding the uncertainties inherent in the risk assessment, this study is the first attempt to provide information on carcinogenic risk of PAHs in source water and drinking water of China, and might be useful for potential strategies of carcinogenic risk management and reduction. Copyright © 2011 Elsevier B.V. All rights reserved.
Failure analysis of parameter-induced simulation crashes in climate models
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.
2013-01-01
Simulations using IPCC-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We apply support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicts model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures are determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations are the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
Failure analysis of parameter-induced simulation crashes in climate models
NASA Astrophysics Data System (ADS)
Lucas, D. D.; Klein, R.; Tannahill, J.; Ivanova, D.; Brandon, S.; Domyancic, D.; Zhang, Y.
2013-08-01
Simulations using IPCC (Intergovernmental Panel on Climate Change)-class climate models are subject to fail or crash for a variety of reasons. Quantitative analysis of the failures can yield useful insights to better understand and improve the models. During the course of uncertainty quantification (UQ) ensemble simulations to assess the effects of ocean model parameter uncertainties on climate simulations, we experienced a series of simulation crashes within the Parallel Ocean Program (POP2) component of the Community Climate System Model (CCSM4). About 8.5% of our CCSM4 simulations failed for numerical reasons at combinations of POP2 parameter values. We applied support vector machine (SVM) classification from machine learning to quantify and predict the probability of failure as a function of the values of 18 POP2 parameters. A committee of SVM classifiers readily predicted model failures in an independent validation ensemble, as assessed by the area under the receiver operating characteristic (ROC) curve metric (AUC > 0.96). The causes of the simulation failures were determined through a global sensitivity analysis. Combinations of 8 parameters related to ocean mixing and viscosity from three different POP2 parameterizations were the major sources of the failures. This information can be used to improve POP2 and CCSM4 by incorporating correlations across the relevant parameters. Our method can also be used to quantify, predict, and understand simulation crashes in other complex geoscientific models.
Detailed Uncertainty Analysis of the ZEM-3 Measurement System
NASA Technical Reports Server (NTRS)
Mackey, Jon; Sehirlioglu, Alp; Dynys, Fred
2014-01-01
The measurement of Seebeck coefficient and electrical resistivity are critical to the investigation of all thermoelectric systems. Therefore, it stands that the measurement uncertainty must be well understood to report ZT values which are accurate and trustworthy. A detailed uncertainty analysis of the ZEM-3 measurement system has been performed. The uncertainty analysis calculates error in the electrical resistivity measurement as a result of sample geometry tolerance, probe geometry tolerance, statistical error, and multi-meter uncertainty. The uncertainty on Seebeck coefficient includes probe wire correction factors, statistical error, multi-meter uncertainty, and most importantly the cold-finger effect. The cold-finger effect plagues all potentiometric (four-probe) Seebeck measurement systems, as heat parasitically transfers through thermocouple probes. The effect leads to an asymmetric over-estimation of the Seebeck coefficient. A thermal finite element analysis allows for quantification of the phenomenon, and provides an estimate on the uncertainty of the Seebeck coefficient. The thermoelectric power factor has been found to have an uncertainty of +9-14 at high temperature and 9 near room temperature.
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation of issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows in hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations as well as the wall drag incompatibility with the theoretical model followed are discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of balance measured value for the best cases.
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials system refers to new materials that are comprised of multiple traditional constituents but complex microstructure morphologies, which lead to superior properties over the conventional materials. To accelerate the development of new advanced materials system, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructure materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVE) are predicted by stochastically reassembling SVE elements with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization the enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of microstructure design space. This dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that highly impact properties of interest. In uncertainty quantification, a comparative study on data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling/design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials system.
Maximum warming occurs about one decade after carbon dioxide emission
NASA Astrophysics Data System (ADS)
Ricke, K.; Caldeira, K.
2014-12-01
There has been a long tradition of estimating the amount of climate change that would result from various carbon dioxide emission or concentration scenarios but there has been relatively little quantitative analysis of how long it takes to feel the consequences of an individual carbon dioxide emission. Using conjoined results of recent carbon-cycle and physical-climate model intercomparison projects, we find the median time between an emission and maximum warming is 10.1 years, with a 90% probability range of 6.6 to 30.7 years. We evaluate uncertainties in timing and amount of warming, partitioning them into three contributing factors: carbon cycle, climate sensitivity and ocean thermal inertia. To characterize the carbon cycle uncertainty associated with the global temperature response to a carbon dioxide emission today, we use fits to the time series of carbon dioxide concentrations from a CO2-impulse response function model intercomparison project's 15 ensemble members (1). To characterize both the uncertainty in climate sensitivity and in the thermal inertia of the climate system, we use fits to the time series of global temperature change from the Coupled Model Intercomparison Project phase 5 (CMIP5; 2) abrupt4xco2 experiment's 20 ensemble's members separating the effects of each uncertainty factors using one of two simple physical models for each CMIP5 climate model. This yields 6,000 possible combinations of these three factors using a standard convolution integral approach. Our results indicate that benefits of avoided climate damage from avoided CO2 emissions will be manifested within the lifetimes of people who acted to avoid that emission. While the relevant time lags imposed by the climate system are substantially shorter than a human lifetime, they are substantially longer than the typical political election cycle, making the delay and its associated uncertainties both economically and politically significant. References: 1. Joos F et al. (2013) Carbon dioxide and climate impulse response functions for the computation of greenhouse gas metrics: a multi-model analysis. Atmos Chem Phys 13:2793-2825. 2. Taylor KE, Stouffer RJ, Meehl GA (2011) An Overview of CMIP5 and the Experiment Design. Bull Am Meteorol Soc 93:485-498.
NASA Astrophysics Data System (ADS)
Zheng, N.
2017-12-01
Sensible heat flux (H) is one of the driving factors of surface turbulent motion and energy exchange. Therefore, it is particularly important to measure sensible heat flux accurately at the regional scale. However, due to the heterogeneity of the underlying surface, hydrothermal regime, and different weather conditions, it is difficult to estimate the represented flux at the kilometer scale. The scintillometer have been developed into an effective and universal equipment for deriving heat flux at the regional-scale which based on the turbulence effect of light in the atmosphere since the 1980s. The parameter directly obtained by the scintillometer is the structure parameter of the refractive index of air based on the changes of light intensity fluctuation. Combine with parameters such as temperature structure parameter, zero-plane displacement, surface roughness, wind velocity, air temperature and the other meteorological data heat fluxes can be derived. These additional parameters increase the uncertainties of flux because the difference between the actual feature of turbulent motion and the applicable conditions of turbulence theory. Most previous studies often focused on the constant flux layers that are above the rough sub-layers and homogeneous flat surfaces underlying surfaces with suitable weather conditions. Therefore, the criteria and modified forms of key parameters are invariable. In this study, we conduct investment over the hilly area of northern China with different plants, such as cork oak, cedar-black and locust. On the basis of key research on the threshold and modified forms of saturation with different turbulence intensity, modified forms of Bowen ratio with different drying-and-wetting conditions, universal function for the temperature structure parameter under different atmospheric stability, the dominant sources of uncertainty will be determined. The above study is significant to reveal influence mechanism of uncertainty and explore influence degree of uncertainty with quantitative analysis. The study can provide theoretical basis and technical support for accurately measuring sensible heat fluxes of forest ecosystem with scintillometer method, and can also provide work foundation for further study on role of forest ecosystem in energy balance and climate change.
NASA Astrophysics Data System (ADS)
McCall, Keisha C.
Identification and monitoring of sub-tumor targets will be a critical step for optimal design and evaluation of cancer therapies in general and biologically targeted radiotherapy (dose-painting) in particular. Quantitative PET imaging may be an important tool for these applications. Currently radiotherapy planning accounts for tumor motion by applying geometric margins. These margins create a motion envelope to encompass the most probable positions of the tumor, while also maintaining the appropriate tumor control and normal tissue complication probabilities. This motion envelope is effective for uniform dose prescriptions where the therapeutic dose is conformed to the external margins of the tumor. However, much research is needed to establish the equivalent margins for non-uniform fields, where multiple biological targets are present and each target is prescribed its own dose level. Additionally, the size of the biological targets and close proximity make it impractical to apply planning margins on the sub-tumor level. Also, the extent of high dose regions must be limited to avoid excessive dose to the surrounding tissue. As such, this research project is an investigation of the uncertainty within quantitative PET images of moving and displaced dose-painting targets, and an investigation of the residual errors that remain after motion management. This included characterization of the changes in PET voxel-values as objects are moved relative to the discrete sampling interval of PET imaging systems (SPECIFIC AIM 1). Additionally, the repeatability of PET distributions and the delineating dose-painting targets were measured (SPECIFIC AIM 2). The effect of imaging uncertainty on the dose distributions designed using these images (SPECIFIC AIM 3) has also been investigated. This project also included analysis of methods to minimize motion during PET imaging and reduce the dosimetric impact of motion/position-induced imaging uncertainty (SPECIFIC AIM 4).
NASA Astrophysics Data System (ADS)
Lin, G.; Stephan, E.; Elsethagen, T.; Meng, D.; Riihimaki, L. D.; McFarlane, S. A.
2012-12-01
Uncertainty quantification (UQ) is the science of quantitative characterization and reduction of uncertainties in applications. It determines how likely certain outcomes are if some aspects of the system are not exactly known. UQ studies such as the atmosphere datasets greatly increased in size and complexity because they now comprise of additional complex iterative steps, involve numerous simulation runs and can consist of additional analytical products such as charts, reports, and visualizations to explain levels of uncertainty. These new requirements greatly expand the need for metadata support beyond the NetCDF convention and vocabulary and as a result an additional formal data provenance ontology is required to provide a historical explanation of the origin of the dataset that include references between the explanations and components within the dataset. This work shares a climate observation data UQ science use case and illustrates how to reduce climate observation data uncertainty and use a linked science application called Provenance Environment (ProvEn) to enable and facilitate scientific teams to publish, share, link, and discover knowledge about the UQ research results. UQ results include terascale datasets that are published to an Earth Systems Grid Federation (ESGF) repository. Uncertainty exists in observation data sets, which is due to sensor data process (such as time averaging), sensor failure in extreme weather conditions, and sensor manufacture error etc. To reduce the uncertainty in the observation data sets, a method based on Principal Component Analysis (PCA) was proposed to recover the missing values in observation data. Several large principal components (PCs) of data with missing values are computed based on available values using an iterative method. The computed PCs can approximate the true PCs with high accuracy given a condition of missing values is met; the iterative method greatly improve the computational efficiency in computing PCs. Moreover, noise removal is done at the same time during the process of computing missing values by using only several large PCs. The uncertainty quantification is done through statistical analysis of the distribution of different PCs. To record above UQ process, and provide an explanation on the uncertainty before and after the UQ process on the observation data sets, additional data provenance ontology, such as ProvEn, is necessary. In this study, we demonstrate how to reduce observation data uncertainty on climate model-observation test beds and using ProvEn to record the UQ process on ESGF. ProvEn demonstrates how a scientific team conducting UQ studies can discover dataset links using its domain knowledgebase, allowing them to better understand and convey the UQ study research objectives, the experimental protocol used, the resulting dataset lineage, related analytical findings, ancillary literature citations, along with the social network of scientists associated with the study. Climate scientists will not only benefit from understanding a particular dataset within a knowledge context, but also benefit from the cross reference of knowledge among the numerous UQ studies being stored in ESGF.
Wu, Yan; He, Yi; He, Wenyi; Zhang, Yumei; Lu, Jing; Dai, Zhong; Ma, Shuangcheng; Lin, Ruichao
2014-03-01
Quantitative nuclear magnetic resonance spectroscopy (qNMR) has been developed into an important tool in the drug analysis, biomacromolecule detection, and metabolism study. Compared with mass balance method, qNMR method bears some advantages in the calibration of reference standard (RS): it determines the absolute amount of a sample; other chemical compound and its certified reference material (CRM) can be used as internal standard (IS) to obtain the purity of the sample. Protoberberine alkaloids have many biological activities and have been used as reference standards for the control of many herbal drugs. In present study, the qNMR methods were developed for the calibration of berberine hydrochloride, palmatine hydrochloride, tetrahydropalmatine, and phellodendrine hydrochloride with potassium hydrogen phthalate as IS. Method validation was carried out according to the guidelines for the method validation of Chinese Pharmacopoeia. The results of qNMR were compared with those of mass balance method and the differences between the results of two methods were acceptable based on the analysis of estimated measurement uncertainties. Therefore, qNMR is an effective and reliable analysis method for the calibration of RS and can be used as a good complementarity to the mass balance method. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Berthet, Lionel; Marty, Renaud; Bourgin, François; Viatgé, Julie; Piotte, Olivier; Perrin, Charles
2017-04-01
An increasing number of operational flood forecasting centres assess the predictive uncertainty associated with their forecasts and communicate it to the end users. This information can match the end-users needs (i.e. prove to be useful for an efficient crisis management) only if it is reliable: reliability is therefore a key quality for operational flood forecasts. In 2015, the French flood forecasting national and regional services (Vigicrues network; www.vigicrues.gouv.fr) implemented a framework to compute quantitative discharge and water level forecasts and to assess the predictive uncertainty. Among the possible technical options to achieve this goal, a statistical analysis of past forecasting errors of deterministic models has been selected (QUOIQUE method, Bourgin, 2014). It is a data-based and non-parametric approach based on as few assumptions as possible about the forecasting error mathematical structure. In particular, a very simple assumption is made regarding the predictive uncertainty distributions for large events outside the range of the calibration data: the multiplicative error distribution is assumed to be constant, whatever the magnitude of the flood. Indeed, the predictive distributions may not be reliable in extrapolation. However, estimating the predictive uncertainty for these rare events is crucial when major floods are of concern. In order to improve the forecasts reliability for major floods, an attempt at combining the operational strength of the empirical statistical analysis and a simple error modelling is done. Since the heteroscedasticity of forecast errors can considerably weaken the predictive reliability for large floods, this error modelling is based on the log-sinh transformation which proved to reduce significantly the heteroscedasticity of the transformed error in a simulation context, even for flood peaks (Wang et al., 2012). Exploratory tests on some operational forecasts issued during the recent floods experienced in France (major spring floods in June 2016 on the Loire river tributaries and flash floods in fall 2016) will be shown and discussed. References Bourgin, F. (2014). How to assess the predictive uncertainty in hydrological modelling? An exploratory work on a large sample of watersheds, AgroParisTech Wang, Q. J., Shrestha, D. L., Robertson, D. E. and Pokhrel, P (2012). A log-sinh transformation for data normalization and variance stabilization. Water Resources Research, , W05514, doi:10.1029/2011WR010973
Uncertainty in cloud optical depth estimates made from satellite radiance measurements
NASA Technical Reports Server (NTRS)
Pincus, Robert; Szczodrak, Malgorzata; Gu, Jiujing; Austin, Philip
1995-01-01
The uncertainty in optical depths retrieved from satellite measurements of visible wavelength radiance at the top of the atmosphere is quantified. Techniques are briefly reviewed for the estimation of optical depth from measurements of radiance, and it is noted that these estimates are always more uncertain at greater optical depths and larger solar zenith angles. The lack of radiometric calibration for visible wavelength imagers on operational satellites dominates the uncertainty retrievals of optical depth. This is true for both single-pixel retrievals and for statistics calculated from a population of individual retrievals. For individual estimates or small samples, sensor discretization can also be significant, but the sensitivity of the retrieval to the specification of the model atmosphere is less important. The relative uncertainty in calibration affects the accuracy with which optical depth distributions measured by different sensors may be quantitatively compared, while the absolute calibration uncertainty, acting through the nonlinear mapping of radiance to optical depth, limits the degree to which distributions measured by the same sensor may be distinguished.
Knotts, Thomas A.
2017-01-01
Molecular simulation has the ability to predict various physical properties that are difficult to obtain experimentally. For example, we implement molecular simulation to predict the critical constants (i.e., critical temperature, critical density, critical pressure, and critical compressibility factor) for large n-alkanes that thermally decompose experimentally (as large as C48). Historically, molecular simulation has been viewed as a tool that is limited to providing qualitative insight. One key reason for this perceived weakness in molecular simulation is the difficulty to quantify the uncertainty in the results. This is because molecular simulations have many sources of uncertainty that propagate and are difficult to quantify. We investigate one of the most important sources of uncertainty, namely, the intermolecular force field parameters. Specifically, we quantify the uncertainty in the Lennard-Jones (LJ) 12-6 parameters for the CH4, CH3, and CH2 united-atom interaction sites. We then demonstrate how the uncertainties in the parameters lead to uncertainties in the saturated liquid density and critical constant values obtained from Gibbs Ensemble Monte Carlo simulation. Our results suggest that the uncertainties attributed to the LJ 12-6 parameters are small enough that quantitatively useful estimates of the saturated liquid density and the critical constants can be obtained from molecular simulation. PMID:28527455
Nazem-Zadeh, Mohammad-Reza; Schwalb, Jason M.; Elisevich, Kost V.; Bagher-Ebadian, Hassan; Hamidian, Hajar; Akhondi-Asl, Ali-Reza; Jafari-Khouzani, Kourosh; Soltanian-Zadeh, Hamid
2014-01-01
Purpose To analyze the utility of a quantitative uncertainty analysis approach for evaluation and comparison of various MRI findings for lateralization of epileptogenicity in mesial temporal lobe epilepsy (mTLE), including novel diffusion-based analyses. Methods We estimated the hemispheric variation uncertainty (HVU) of hippocampal T1 volumetry and FLAIR (Fluid Attenuated Inversion Recovery) intensity. Using diffusion tensor images of 23 nonepileptic subjects, we estimated the HVU levels of mean diffusivity (MD) in the hippocampus, and fractional anisotropy (FA) in the posteroinferior cingulum and crus of fornix. Imaging from a retrospective cohort of 20 TLE patients who had undergone surgical resection with Engel class I outcomes was analyzed to determine whether asymmetry of preoperative volumetrics, FLAIR intensities, and MD values in hippocampi, as well as FA values in posteroinferior cingula and fornix crura correctly predicted laterality of seizure onset. Ten of the cohort had pathologically proven mesial temporal sclerosis (MTS). Seven of these patients had undergone extra-operative electrocorticography (ECoG) for lateralization or to rule out extra-temporal foci. Results HVU was estimated to be 3.1 × 10−5 for hippocampal MD, 0.027 for FA in posteroinferior cingulum, 0.018 for FA in crus of fornix, 0.069 for hippocampal normalized volume, and 0.099 for hippocampal normalized FLAIR intensity. Using HVU analysis, a higher hippocampal MD value, lower FA within the posteroinferior cingulum and crus of fornix, shrinkage in hippocampal volume, and higher hippocampal FLAIR intensity were observed beyond uncertainty on the side ipsilateral to seizure onset for 10, 10, 9, 9, and 10 out of 10 pathology-proven MTS patients, respectively. Considering all 20 TLE patients, these numbers were 18, 15, 14, 13, and 16, respectively. However, consolidating lateralization results of HVU analysis on these quantities by majority voting detected the epileptogenic side for 19 out of 20 cases with no wrong lateralization. Conclusion The presence of MTS in TLE patients is associated with an elevated MD value in the ipsilateral hippocampus and a reduced FA value in the posteroinferior subregion of the ipsilateral cingulum and crus of ipsilateral fornix. When considering all TLE patients, among the mentioned biomarkers the hippocampal MD had the best performance with true detection rate of 90% without any wrong lateralization. The proposed uncertainty based analyses hold promise for improving decision-making for surgical resection. PMID:24857759
Nazem-Zadeh, Mohammad-Reza; Schwalb, Jason M; Elisevich, Kost V; Bagher-Ebadian, Hassan; Hamidian, Hajar; Akhondi-Asl, Ali-Reza; Jafari-Khouzani, Kourosh; Soltanian-Zadeh, Hamid
2014-07-15
To analyze the utility of a quantitative uncertainty analysis approach for evaluation and comparison of various MRI findings for the lateralization of epileptogenicity in mesial temporal lobe epilepsy (mTLE), including novel diffusion-based analyses. We estimated the hemispheric variation uncertainty (HVU) of hippocampal T1 volumetry and FLAIR (Fluid Attenuated Inversion Recovery) intensity. Using diffusion tensor images of 23 nonepileptic subjects, we estimated the HVU levels of mean diffusivity (MD) in the hippocampus, and fractional anisotropy (FA) in the posteroinferior cingulum and crus of fornix. Imaging from a retrospective cohort of 20 TLE patients who had undergone surgical resection with Engel class I outcomes was analyzed to determine whether asymmetry of preoperative volumetrics, FLAIR intensities, and MD values in hippocampi, as well as FA values in posteroinferior cingula and fornix crura correctly predicted laterality of seizure onset. Ten of the cohort had pathologically proven mesial temporal sclerosis (MTS). Seven of these patients had undergone extraoperative electrocorticography (ECoG) for lateralization or to rule out extra-temporal foci. HVU was estimated to be 3.1×10(-5) for hippocampal MD, 0.027 for FA in posteroinferior cingulum, 0.018 for FA in crus of fornix, 0.069 for hippocampal normalized volume, and 0.099 for hippocampal normalized FLAIR intensity. Using HVU analysis, a higher hippocampal MD value, lower FA within the posteroinferior cingulum and crus of fornix, shrinkage in hippocampal volume, and higher hippocampal FLAIR intensity were observed beyond uncertainty on the side ipsilateral to seizure onset for 10, 10, 9, 9, and 10 out of 10 pathology-proven MTS patients, respectively. Considering all 20 TLE patients, these numbers were 18, 15, 14, 13, and 16, respectively. However, consolidating the lateralization results of HVU analysis on these quantities by majority voting has detected the epileptogenic side for 19 out of 20 cases with no wrong lateralization. The presence of MTS in TLE patients is associated with an elevated MD value in the ipsilateral hippocampus and a reduced FA value in the posteroinferior subregion of the ipsilateral cingulum and crus of ipsilateral fornix. When considering all TLE patients, among the mentioned biomarkers the hippocampal MD had the best performance with true detection rate of 90% without any wrong lateralization. The proposed uncertainty based analyses hold promise for improving decision-making for surgical resection. Copyright © 2014 Elsevier B.V. All rights reserved.
Justify Your Answer: The Role of Written Think Aloud in Script Concordance Testing.
Power, Alyssa; Lemay, Jean-Francois; Cooke, Suzette
2017-01-01
Construct: Clinical reasoning assessment is a growing area of interest in the medical education literature. Script concordance testing (SCT) evaluates clinical reasoning in conditions of uncertainty and has emerged as an innovative tool in the domain of clinical reasoning assessment. SCT quantifies the degree of concordance between a learner and an experienced clinician and attempts to capture the breadth of responses of expert clinicians, acknowledging the significant yet acceptable variation in practice under situations of uncertainty. SCT has been shown to be a valid and reliable clinical reasoning assessment tool. However, as SCT provides only quantitative information, it may not provide a complete assessment of clinical reasoning. Think aloud (TA) is a qualitative research tool used in clinical reasoning assessment in which learners verbalize their thought process around an assigned task. This study explores the use of TA, in the form of written reflection, in SCT to assess resident clinical reasoning, hypothesizing that the information obtained from the written TA would enrich the quantitative data obtained through SCT. Ninety-one pediatric postgraduate trainees and 21 pediatricians from 4 Canadian training centers completed an online test consisting of 24 SCT cases immediately followed by retrospective written TA. Six of 24 cases were selected to gather TA data. These cases were chosen to allow all phases of clinical decision making (diagnosis, investigation, and treatment) to be represented in the TA data. Inductive thematic analysis was employed when systematically reviewing TA responses. Three main benefits of adding written TA to SCT were identified: (a) uncovering instances of incorrect clinical reasoning despite a correct SCT response, (b) revealing sound clinical reasoning in the context of a suboptimal SCT response, and (c) detecting question misinterpretation. Written TA can optimize SCT by demonstrating when correct examinee responses are based on guessing or uncertainty rather than robust clinical rationale. TA can also enhance SCT by allowing examinees to provide justification for responses that otherwise would have been considered incorrect and by identifying questions that are frequently misinterpreted to avoid including them in future examinations. TA also has significant value in differentiating between acceptable variations in expert clinician responses and deviance associated with faulty rationale or question misinterpretation; this could improve SCT reliability. A written TA protocol appears to be a valuable tool to assess trainees' clinical reasoning and can strengthen the quantitative assessment provided by SCT.
Wagner, Monika; Khoury, Hanane; Willet, Jacob; Rindress, Donna; Goetghebeur, Mireille
2016-03-01
The multiplicity of issues, including uncertainty and ethical dilemmas, and policies involved in appraising interventions for rare diseases suggests that multicriteria decision analysis (MCDA) based on a holistic definition of value is uniquely suited for this purpose. The objective of this study was to analyze and further develop a comprehensive MCDA framework (EVIDEM) to address rare disease issues and policies, while maintaining its applicability across disease areas. Specific issues and policies for rare diseases were identified through literature review. Ethical and methodological foundations of the EVIDEM framework v3.0 were systematically analyzed from the perspective of these issues, and policies and modifications of the framework were performed accordingly to ensure their integration. Analysis showed that the framework integrates ethical dilemmas and issues inherent to appraising interventions for rare diseases but required further integration of specific aspects. Modification thus included the addition of subcriteria to further differentiate disease severity, disease-specific treatment outcomes, and economic consequences of interventions for rare diseases. Scoring scales were further developed to include negative scales for all comparative criteria. A methodology was established to incorporate context-specific population priorities and policies, such as those for rare diseases, into the quantitative part of the framework. This design allows making more explicit trade-offs between competing ethical positions of fairness (prioritization of those who are worst off), the goal of benefiting as many people as possible, the imperative to help, and wise use of knowledge and resources. It also allows addressing variability in institutional policies regarding prioritization of specific disease areas, in addition to existing uncertainty analysis available from EVIDEM. The adapted framework measures value in its widest sense, while being responsive to rare disease issues and policies. It provides an operationalizable platform to integrate values, competing ethical dilemmas, and uncertainty in appraising healthcare interventions.
A Benefit-Risk Analysis Approach to Capture Regulatory Decision-Making: Non-Small Cell Lung Cancer.
Raju, G K; Gurumurthi, K; Domike, R; Kazandjian, D; Blumenthal, G; Pazdur, R; Woodcock, J
2016-12-01
Drug regulators around the world make decisions about drug approvability based on qualitative benefit-risk analyses. There is much interest in quantifying regulatory approaches to benefit and risk. In this work the use of a quantitative benefit-risk analysis was applied to regulatory decision-making about new drugs to treat advanced non-small cell lung cancer (NSCLC). Benefits and risks associated with 20 US Food and Drug Administration (FDA) decisions associated with a set of candidate treatments submitted between 2003 and 2015 were analyzed. For benefit analysis, the median overall survival (OS) was used where available. When not available, OS was estimated based on overall response rate (ORR) or progression-free survival (PFS). Risks were analyzed based on magnitude (or severity) of harm and likelihood of occurrence. Additionally, a sensitivity analysis was explored to demonstrate analysis of systematic uncertainty. FDA approval decision outcomes considered were found to be consistent with the benefit-risk logic. © 2016 American Society for Clinical Pharmacology and Therapeutics.
Probabilistic, meso-scale flood loss modelling
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2016-04-01
Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments and even more for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provide a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
Task uncertainty and communication during nursing shift handovers.
Mayor, Eric; Bangerter, Adrian; Aribot, Myriam
2012-09-01
We explore variations in handover duration and communication in nursing units. We hypothesize that duration per patient is higher in units facing high task uncertainty. We expect both topics and functions of communication to vary depending on task uncertainty. Handovers are changing in modern healthcare organizations, where standardized procedures are increasingly advocated for efficiency and reliability reasons. However, redesign of handover should take environmental contingencies of different clinical unit types into account. An important contingency in institutions is task uncertainty, which may affect how communicative routines like handover are accomplished. Nurse unit managers of 80 care units in 18 hospitals were interviewed in 2008 about topics and functions of handover communication and duration in their unit. Interviews were content-analysed. Clinical units were classified into a theory-based typology (unit type) that gradually increases on task uncertainty. Quantitative analyses were performed. Unit type affected resource allocation. Unit types facing higher uncertainty had higher handover duration per patient. As expected, unit type also affected communication content. Clinical units facing higher uncertainty discussed fewer topics, discussing treatment and care and organization of work less frequently. Finally, unit type affected functions of handover: sharing emotions was less often mentioned in unit types facing higher uncertainty. Task uncertainty and its relationship with functions and topics of handover should be taken into account during the design of handover procedures. © 2011 Blackwell Publishing Ltd.
Zenil, Hector; Kiani, Narsis A.; Ball, Gordon; Gomez-Cabrero, David
2016-01-01
Systems in nature capable of collective behaviour are nonlinear, operating across several scales. Yet our ability to account for their collective dynamics differs in physics, chemistry and biology. Here, we briefly review the similarities and differences between mathematical modelling of adaptive living systems versus physico-chemical systems. We find that physics-based chemistry modelling and computational neuroscience have a shared interest in developing techniques for model reductions aiming at the identification of a reduced subsystem or slow manifold, capturing the effective dynamics. By contrast, as relations and kinetics between biological molecules are less characterized, current quantitative analysis under the umbrella of bioinformatics focuses on signal extraction, correlation, regression and machine-learning analysis. We argue that model reduction analysis and the ensuing identification of manifolds bridges physics and biology. Furthermore, modelling living systems presents deep challenges as how to reconcile rich molecular data with inherent modelling uncertainties (formalism, variables selection and model parameters). We anticipate a new generative data-driven modelling paradigm constrained by identified governing principles extracted from low-dimensional manifold analysis. The rise of a new generation of models will ultimately connect biology to quantitative mechanistic descriptions, thereby setting the stage for investigating the character of the model language and principles driving living systems. This article is part of the themed issue ‘Multiscale modelling at the physics–chemistry–biology interface’. PMID:27698038
NASA Astrophysics Data System (ADS)
Babendreier, J. E.
2002-05-01
Evaluating uncertainty and parameter sensitivity in environmental models can be a difficult task, even for low-order, single-media constructs driven by a unique set of site-specific data. The challenge of examining ever more complex, integrated, higher-order models is a formidable one, particularly in regulatory settings applied on a national scale. Quantitative assessment of uncertainty and sensitivity within integrated, multimedia models that simulate hundreds of sites, spanning multiple geographical and ecological regions, will ultimately require a systematic, comparative approach coupled with sufficient computational power. The Multimedia, Multipathway, and Multireceptor Risk Assessment Model (3MRA) is an important code being developed by the United States Environmental Protection Agency for use in site-scale risk assessment (e.g. hazardous waste management facilities). The model currently entails over 700 variables, 185 of which are explicitly stochastic. The 3MRA can start with a chemical concentration in a waste management unit (WMU). It estimates the release and transport of the chemical throughout the environment, and predicts associated exposure and risk. The 3MRA simulates multimedia (air, water, soil, sediments), pollutant fate and transport, multipathway exposure routes (food ingestion, water ingestion, soil ingestion, air inhalation, etc.), multireceptor exposures (resident, gardener, farmer, fisher, ecological habitats and populations), and resulting risk (human cancer and non-cancer effects, ecological population and community effects). The 3MRA collates the output for an overall national risk assessment, offering a probabilistic strategy as a basis for regulatory decisions. To facilitate model execution of 3MRA for purposes of conducting uncertainty and sensitivity analysis, a PC-based supercomputer cluster was constructed. Design of SuperMUSE, a 125 GHz Windows-based Supercomputer for Model Uncertainty and Sensitivity Evaluation is described, along with the conceptual layout of an accompanying java-based paralleling software toolset. Preliminary work is also reported for a scenario involving Benzene disposal that describes the relative importance of the vadose zone in driving risk levels for ecological receptors and human health. Incorporating landfills, waste piles, aerated tanks, surface impoundments, and land application units, the site-based data used in the analysis included 201 national facilities representing 419 site-WMU combinations.
NASA Astrophysics Data System (ADS)
Roobaert, Alizee; Laruelle, Goulven; Landschützer, Peter; Regnier, Pierre
2017-04-01
In lakes, rivers, estuaries and the ocean, the quantification of air-water CO2 exchange (FCO2) is still characterized by large uncertainties partly due to the lack of agreement over the parameterization of the gas exchange velocity (k). Although the ocean is generally regarded as the best constrained system because k is only controlled by the wind speed, numerous formulations are still currently used, leading to potentially large differences in FCO2. Here, a quantitative global spatial analysis of FCO2 is presented using several k-wind speed formulations in order to compare the effect of the choice of parameterization of k on FCO2. This analysis is performed at a 1 degree resolution using a sea surface pCO2 product generated using a two-step artificial neuronal network by Landschützer et al. (2015) over the 1991-2011 period. Four different global wind speed datasets (CCMP, ERA, NCEP 1 and NCEP 2) are also used to assess the effect of the choice of one wind speed product over the other when calculating the global and regional oceanic FCO2. Results indicate that this choice of wind speed product only leads to small discrepancies globally (6 %) except with NCEP 2 which produces a more intense global FCO2 compared to the other wind products. Regionally, theses differences are even more pronounced. For a given wind speed product, the choice of parametrization of k yields global FCO2 differences ranging from 7 % to 16 % depending on the wind product used. We also provide latitudinal profiles of FCO2 and its uncertainty calculated combining all combinations between the different k-relationships and the four wind speed products. Wind speeds >14 m s-1, which only account for 7 % of all observations, contributes disproportionately to the global oceanic FCO2 and, for this range of wind speeds, the uncertainty induced by the choice of formulation for k is maximum ( 50 %).
A convenient method for X-ray analysis in TEM that measures mass thickness and composition
NASA Astrophysics Data System (ADS)
Statham, P.; Sagar, J.; Holland, J.; Pinard, P.; Lozano-Perez, S.
2018-01-01
We consider a new approach for quantitative analysis in transmission electron microscopy (TEM) that offers the same convenience as single-standard quantitative analysis in scanning electron microscopy (SEM). Instead of a bulk standard, a thin film with known mass thickness is used as a reference. The procedure involves recording an X-ray spectrum from the reference film for each session of acquisitions on real specimens. There is no need to measure the beam current; the current only needs to be stable for the duration of the session. A new reference standard with a large (1 mm x 1 mm) area of uniform thickness of 100 nm silicon nitride is used to reveal regions of X-ray detector occlusion that would give misleading results for any X-ray method that measures thickness. Unlike previous methods, the new X-ray method does not require an accurate beam current monitor but delivers equivalent accuracy in mass thickness measurement. Quantitative compositional results are also automatically corrected for specimen self-absorption. The new method is tested using a wedge specimen of Inconel 600 that is used to calibrate the high angle angular dark field (HAADF) signal to provide a thickness reference and results are compared with electron energy-loss spectrometry (EELS) measurements. For the new X-ray method, element composition results are consistent with the expected composition for the alloy and the mass thickness measurement is shown to provide an accurate alternative to EELS for thickness determination in TEM without the uncertainty associated with mean free path estimates.
TRIM.Risk is used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
NASA Technical Reports Server (NTRS)
Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac
1987-01-01
A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
Wong, W K Tim; Kirby, Emma; Broom, Alex; Sibbritt, David; Francis, Kay; Karapetis, Christos S; Karikios, Deme; Harrup, Rosemary; Lwin, Zarnie
2018-01-26
A viable and sustainable medical oncology profession is integral for meeting the increasing demand for quality cancer care. The aim of this study was to explore the workforce-related experiences, perceptions and career expectations of early-career medical oncologists in Australia. A mixed-methods design, including a survey (n = 170) and nested qualitative semistructured interviews (n = 14) with early-career medical oncologists. Recruitment was through the Medical Oncology Group of Australia. Qualitative data were thematically analyzed and for the survey results, logistic regression modeling was conducted. Early-career medical oncologists experienced uncertainty regarding their future employment opportunities. The competitive job market has made them cautious about securing a preferred job leading to a perceived need to improve their qualifications through higher degree training and research activities. The following themes and trends were identified from the qualitative and quantitative analyses: age, career stage and associated early-career uncertainty; locale, professional competition and training preferences; participation in research and evolving professional expectations; and workload and career development opportunities as linked to career uncertainty. Perceived diminished employment opportunities in the medical oncology profession, and shifting expectations to be "more qualified," have increased uncertainty among junior medical oncologists in terms of their future career prospects. Structural factors relating to adequate funding of medical oncology positions may facilitate or inhibit progressive change in the workforce and its sustainability. Workforce planning and strategies informed by findings from this study will be necessary in ensuring that both the needs of cancer patients and of medical oncologists are met. © 2018 John Wiley & Sons Australia, Ltd.
NASA Astrophysics Data System (ADS)
Vergara, H. J.; Kirstetter, P.; Gourley, J. J.; Flamig, Z.; Hong, Y.
2015-12-01
The macro scale patterns of simulated streamflow errors are studied in order to characterize uncertainty in a hydrologic modeling system forced with the Multi-Radar/Multi-Sensor (MRMS; http://mrms.ou.edu) quantitative precipitation estimates for flood forecasting over the Conterminous United States (CONUS). The hydrologic model is centerpiece of the Flooded Locations And Simulated Hydrograph (FLASH; http://flash.ou.edu) real-time system. The hydrologic model is implemented at 1-km/5-min resolution to generate estimates of streamflow. Data from the CONUS-wide stream gauge network of the United States' Geological Survey (USGS) were used as a reference to evaluate the discrepancies with the hydrological model predictions. Streamflow errors were studied at the event scale with particular focus on the peak flow magnitude and timing. A total of 2,680 catchments over CONUS and 75,496 events from a 10-year period are used for the simulation diagnostic analysis. Associations between streamflow errors and geophysical factors were explored and modeled. It is found that hydro-climatic factors and radar coverage could explain significant underestimation of peak flow in regions of complex terrain. Furthermore, the statistical modeling of peak flow errors shows that other geophysical factors such as basin geomorphometry, pedology, and land cover/use could also provide explanatory information. Results from this research demonstrate the utility of uncertainty characterization in providing guidance to improve model adequacy, parameter estimates, and input quality control. Likewise, the characterization of uncertainty enables probabilistic flood forecasting that can be extended to ungauged locations.
NASA Astrophysics Data System (ADS)
Colombo, Ivo; Porta, Giovanni M.; Ruffo, Paolo; Guadagnini, Alberto
2017-03-01
This study illustrates a procedure conducive to a preliminary risk analysis of overpressure development in sedimentary basins characterized by alternating depositional events of sandstone and shale layers. The approach rests on two key elements: (1) forward modeling of fluid flow and compaction, and (2) application of a model-complexity reduction technique based on a generalized polynomial chaos expansion (gPCE). The forward model considers a one-dimensional vertical compaction processes. The gPCE model is then used in an inverse modeling context to obtain efficient model parameter estimation and uncertainty quantification. The methodology is applied to two field settings considered in previous literature works, i.e. the Venture Field (Scotian Shelf, Canada) and the Navarin Basin (Bering Sea, Alaska, USA), relying on available porosity and pressure information for model calibration. It is found that the best result is obtained when porosity and pressure data are considered jointly in the model calibration procedure. Uncertainty propagation from unknown input parameters to model outputs, such as pore pressure vertical distribution, is investigated and quantified. This modeling strategy enables one to quantify the relative importance of key phenomena governing the feedback between sediment compaction and fluid flow processes and driving the buildup of fluid overpressure in stratified sedimentary basins characterized by the presence of low-permeability layers. The results here illustrated (1) allow for diagnosis of the critical role played by the parameters of quantitative formulations linking porosity and permeability in compacted shales and (2) provide an explicit and detailed quantification of the effects of their uncertainty in field settings.
NASA Astrophysics Data System (ADS)
Cucinotta, Francis
Uncertainties in estimating health risks from exposures to galactic cosmic rays (GCR) — comprised of protons and high-energy and charge (HZE) nuclei are an important limitation to long duration space travel. HZE nuclei produce both qualitative and quantitative differences in biological effects compared to terrestrial radiation leading to large uncertainties in predicting risks to humans. Our NASA Space Cancer Risk Model-2012 (NSCR-2012) for estimating lifetime cancer risks from space radiation included several new features compared to earlier models from the National Council on Radiation Protection and Measurements (NCRP) used at NASA. New features of NSCR-2012 included the introduction of NASA defined radiation quality factors based on track structure concepts, a Bayesian analysis of the dose and dose-rate reduction effectiveness factor (DDREF) and its uncertainty, and the use of a never-smoker population to represent astronauts. However, NSCR-2012 did not include estimates of the role of qualitative differences between HZE particles and low LET radiation. In this report we discuss evidence for non-targeted effects increasing cancer risks at space relevant HZE particle absorbed doses in tissue (<0.2 Gy), and for increased tumor lethality due to the propensity for higher rates of metastatic tumors from high LET radiation suggested by animal experiments. The NSCR-2014 model considers how these qualitative differences modify the overall probability distribution functions (PDF) for cancer mortality risk estimates from space radiation. Predictions of NSCR-2014 for International Space Station missions and Mars exploration will be described, and compared to those of our earlier NSCR-2012 model.
Quantifying Errors in TRMM-Based Multi-Sensor QPE Products Over Land in Preparation for GPM
NASA Technical Reports Server (NTRS)
Peters-Lidard, Christa D.; Tian, Yudong
2011-01-01
Determining uncertainties in satellite-based multi-sensor quantitative precipitation estimates over land of fundamental importance to both data producers and hydro climatological applications. ,Evaluating TRMM-era products also lays the groundwork and sets the direction for algorithm and applications development for future missions including GPM. QPE uncertainties result mostly from the interplay of systematic errors and random errors. In this work, we will synthesize our recent results quantifying the error characteristics of satellite-based precipitation estimates. Both systematic errors and total uncertainties have been analyzed for six different TRMM-era precipitation products (3B42, 3B42RT, CMORPH, PERSIANN, NRL and GSMap). For systematic errors, we devised an error decomposition scheme to separate errors in precipitation estimates into three independent components, hit biases, missed precipitation and false precipitation. This decomposition scheme reveals hydroclimatologically-relevant error features and provides a better link to the error sources than conventional analysis, because in the latter these error components tend to cancel one another when aggregated or averaged in space or time. For the random errors, we calculated the measurement spread from the ensemble of these six quasi-independent products, and thus produced a global map of measurement uncertainties. The map yields a global view of the error characteristics and their regional and seasonal variations, reveals many undocumented error features over areas with no validation data available, and provides better guidance to global assimilation of satellite-based precipitation data. Insights gained from these results and how they could help with GPM will be highlighted.
Uncertainty Analysis of NASA Glenn's 8- by 6-Foot Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia E.; Hubbard, Erin P.; Walter, Joel A.; McElroy, Tyler
2016-01-01
An analysis was performed to determine the measurement uncertainty of the Mach Number of the 8- by 6-foot Supersonic Wind Tunnel at the NASA Glenn Research Center. This paper details the analysis process used, including methods for handling limited data and complicated data correlations. Due to the complexity of the equations used, a Monte Carlo Method was utilized for this uncertainty analysis. A summary of the findings are presented as pertains to understanding what the uncertainties are, how they impact various research tests in the facility, and methods of reducing the uncertainties in the future.
Yoo, Kyung Hee
2007-06-01
This study was conducted to investigate the correlation among uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers. Self report questionnaires were used to measure the variables. Variables were uncertainty, mastery and appraisal of uncertainty. In data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was cronbach's alpha=.84~.94. Mastery negatively correlated with uncertainty(r=-.444, p=.000) and danger appraisal of uncertainty(r=-.514, p=.000). In regression of danger appraisal of uncertainty, uncertainty and mastery were significant predictors explaining 39.9%. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in hospitalized children's mothers. Therefore, nursing interventions which improve mastery must be developed for hospitalized children's mothers.
NASA Astrophysics Data System (ADS)
Debski, Wojciech
2015-06-01
The spatial location of sources of seismic waves is one of the first tasks when transient waves from natural (uncontrolled) sources are analysed in many branches of physics, including seismology, oceanology, to name a few. Source activity and its spatial variability in time, the geometry of recording network, the complexity and heterogeneity of wave velocity distribution are all factors influencing the performance of location algorithms and accuracy of the achieved results. Although estimating of the earthquake foci location is relatively simple, a quantitative estimation of the location accuracy is really a challenging task even if the probabilistic inverse method is used because it requires knowledge of statistics of observational, modelling and a priori uncertainties. In this paper, we addressed this task when statistics of observational and/or modelling errors are unknown. This common situation requires introduction of a priori constraints on the likelihood (misfit) function which significantly influence the estimated errors. Based on the results of an analysis of 120 seismic events from the Rudna copper mine operating in southwestern Poland, we propose an approach based on an analysis of Shanon's entropy calculated for the a posteriori distribution. We show that this meta-characteristic of the a posteriori distribution carries some information on uncertainties of the solution found.
Bendeck, Murielle; Serrano-Blanco, Antoni; García-Alonso, Carlos; Bonet, Pere; Jordà, Esther; Sabes-Figuera, Ramon; Salvador-Carulla, Luis
2013-04-01
Cost of illness (COI) studies are carried out under conditions of uncertainty and with incomplete information. There are concerns regarding their generalisability, accuracy and usability in evidence-informed care. A hybrid methodology is used to estimate the regional costs of depression in Catalonia (Spain) following an integrative approach. The cross-design synthesis included nominal groups and quantitative analysis of both top-down and bottom-up studies, and incorporated primary and secondary data from different sources of information in Catalonia. Sensitivity analysis used probabilistic Monte Carlo simulation modelling. A dissemination strategy was planned, including a standard form adapted from cost-effectiveness studies to summarise methods and results. The method used allows for a comprehensive estimate of the cost of depression in Catalonia. Health officers and decision-makers concluded that this methodology provided useful information and knowledge for evidence-informed planning in mental health. The mix of methods, combined with a simulation model, contributed to a reduction in data gaps and, in conditions of uncertainty, supplied more complete information on the costs of depression in Catalonia. This approach to COI should be differentiated from other COI designs to allow like-with-like comparisons. A consensus on COI typology, procedures and dissemination is needed.
Elemental Impurities in Pharmaceutical Excipients.
Li, Gang; Schoneker, Dave; Ulman, Katherine L; Sturm, Jason J; Thackery, Lisa M; Kauffman, John F
2015-12-01
Control of elemental impurities in pharmaceutical materials is currently undergoing a transition from control based on concentrations in components of drug products to control based on permitted daily exposures in drug products. Within the pharmaceutical community, there is uncertainty regarding the impact of these changes on manufactures of drug products. This uncertainty is fueled in part by a lack of publically available information on elemental impurity levels in common pharmaceutical excipients. This paper summarizes a recent survey of elemental impurity levels in common pharmaceutical excipients as well as some drug substances. A widely applicable analytical procedure was developed and was shown to be suitable for analysis of elements that are subject to United States Pharmacopoeia Chapter <232> and International Conference on Harmonization's Q3D Guideline on Elemental Impurities. The procedure utilizes microwave-assisted digestion of pharmaceutical materials and inductively coupled plasma mass spectrometry for quantitative analysis of these elements. The procedure was applied to 190 samples from 31 different excipients and 15 samples from eight drug substances provided through the International Pharmaceutical Excipient Council of the Americas. The results of the survey indicate that, for the materials included in the study, relatively low levels of elemental impurities are present. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association.
Uncertainty Analysis of the NASA Glenn 8x6 Supersonic Wind Tunnel
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin; Walter, Joel; McElroy, Tyler
2016-01-01
This paper presents methods and results of a detailed measurement uncertainty analysis that was performed for the 8- by 6-foot Supersonic Wind Tunnel located at the NASA Glenn Research Center. The statistical methods and engineering judgments used to estimate elemental uncertainties are described. The Monte Carlo method of propagating uncertainty was selected to determine the uncertainty of calculated variables of interest. A detailed description of the Monte Carlo method as applied for this analysis is provided. Detailed uncertainty results for the uncertainty in average free stream Mach number as well as other variables of interest are provided. All results are presented as random (variation in observed values about a true value), systematic (potential offset between observed and true value), and total (random and systematic combined) uncertainty. The largest sources contributing to uncertainty are determined and potential improvement opportunities for the facility are investigated.
[Risk, uncertainty and ignorance in medicine].
Rørtveit, G; Strand, R
2001-04-30
Exploration of healthy patients' risk factors for disease has become a major medical activity. The rationale behind primary prevention through exploration and therapeutic risk reduction is not separated from the theoretical assumption that every form of uncertainty can be expressed as risk. Distinguishing "risk" (as quantitative probabilities in a known sample space), "strict uncertainty" (when the sample space is known, but probabilities of events cannot be quantified) and "ignorance" (when the sample space is not fully known), a typical clinical situation (primary risk of coronary disease) is analysed. It is shown how strict uncertainty and sometimes ignorance can be present, in which case the orthodox decision theoretical rationale for treatment breaks down. For use in such cases, a different ideal model of rationality is proposed, focusing on the patient's considered reasons. This model has profound implications for the current understanding of medical professionalism as well as for the design of clinical guidelines.
NASA Technical Reports Server (NTRS)
Wang, T.; Simon, T. W.
1988-01-01
Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value; but, it became clear that another step, one herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
NASA Astrophysics Data System (ADS)
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty resulting from different sources of errors (e.g., error in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative and meant to improve reliability of modeling results. The uncertainty analysis must solve difficulties in calibration of hydrological models, which further increase in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different source of uncertainties (especially parameter uncertainty) and evaluate their performance. In this study, the Soil and Water Assessment Tools (SWAT) eco-hydrological model was implemented for the watershed in the center of Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using P-factor and Rfactor, coefficient of determination (R 2), the Nash Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO with P-factor>0.83, R-factor <0.56 and R 2>0.91, NSE>0.89, and 0.18
Quantitative Measures for Evaluation of Ultrasound Therapies of the Prostate
NASA Astrophysics Data System (ADS)
Kobelevskiy, Ilya; Burtnyk, Mathieu; Bronskill, Michael; Chopra, Rajiv
2010-03-01
Development of non-invasive techniques for prostate cancer treatment requires implementation of quantitative measures for evaluation of the treatment results. In this paper. we introduce measures that estimate spatial targeting accuracy and potential thermal damage to the structures surrounding the prostate. The measures were developed for the technique of treating prostate cancer with a transurethral ultrasound heating applicators guided by active MR temperature feedback. Variations of ultrasound element length and related MR imaging parameters such as MR slice thickness and update time were investigated by performing numerical simulations of the treatment on a database of ten patient prostate geometries segmented from clinical MR images. Susceptibility of each parameter configuration to uncertainty in MR temperature measurements was studied by adding noise to the temperature measurements. Gaussian noise with zero mean and standard deviation of 0, 1, 3 and 5° C was used to model different levels of uncertainty in MR temperature measurements. Results of simulations for each parameter configuration were averaged over the database of the ten prostate patient geometries studied. Results have shown that for update time of 5 seconds both 3- and 5-mm elements achieve appropriate performance for temperature uncertainty up to 3° C, while temperature uncertainty of 5° C leads to noticeable reduction in spatial accuracy and increased risk of damaging rectal wall. Ten-mm elements lacked spatial accuracy and had higher risk of damaging rectal wall compared to 3- and 5-mm elements, but were less sensitive to the level of temperature uncertainty. The effect of changing update time was studied for 5-mm elements. Simulations showed that update time had minor effects on all aspects of treatment for temperature uncertainty of 0° C and 1° C, while temperature uncertainties of 3° C and 5° C led to reduced spatial accuracy, increased potential damage to the rectal wall, and longer treatment times for update time above 5 seconds. Overall evaluation of results suggested that 5-mm elements showed best performance under physically reachable MR imaging parameters.
Quantifying Uncertainty in Instantaneous Orbital Data Products of TRMM over Indian Subcontinent
NASA Astrophysics Data System (ADS)
Jayaluxmi, I.; Nagesh, D.
2013-12-01
In the last 20 years, microwave radiometers have taken satellite images of earth's weather proving to be a valuable tool for quantitative estimation of precipitation from space. However, along with the widespread acceptance of microwave based precipitation products, it has also been recognized that they contain large uncertainties. While most of the uncertainty evaluation studies focus on the accuracy of rainfall accumulated over time (e.g., season/year), evaluation of instantaneous rainfall intensities from satellite orbital data products are relatively rare. These instantaneous products are known to potentially cause large uncertainties during real time flood forecasting studies at the watershed scale. Especially over land regions, where the highly varying land surface emissivity offer a myriad of complications hindering accurate rainfall estimation. The error components of orbital data products also tend to interact nonlinearly with hydrologic modeling uncertainty. Keeping these in mind, the present study fosters the development of uncertainty analysis using instantaneous satellite orbital data products (version 7 of 1B11, 2A25, 2A23) derived from the passive and active sensors onboard Tropical Rainfall Measuring Mission (TRMM) satellite, namely TRMM microwave imager (TMI) and Precipitation Radar (PR). The study utilizes 11 years of orbital data from 2002 to 2012 over the Indian subcontinent and examines the influence of various error sources on the convective and stratiform precipitation types. Analysis conducted over the land regions of India investigates three sources of uncertainty in detail. These include 1) Errors due to improper delineation of rainfall signature within microwave footprint (rain/no rain classification), 2) Uncertainty offered by the transfer function linking rainfall with TMI low frequency channels and 3) Sampling errors owing to the narrow swath and infrequent visits of TRMM sensors. Case study results obtained during the Indian summer monsoon months of June-September are presented using contingency table statistics, performance diagram, scatter plots and probability density functions. Our study demonstrates that theory of copula can be efficiently used to represent the highly non linear dependency structure of rainfall with respect to TMI low frequency channels of 19, 21, 37 GHz. This questions the exclusive usage of high frequency 85 GHz channel for TMI overland rainfall retrieval algorithms. Further, the PR sampling errors revealed using a statistical bootstrap technique was found to incur relative sampling errors <30% (for 2 degree grids) over India whose magnitudes were biased towards stratiform rainfall type and sampling technique employed. These findings clearly document that proper characterization of error structure offered by TMI and PR has wider implications for decision making prior to incorporating the resulting orbital products for basin scale hydrologic modeling.
Essentiality, toxicity, and uncertainty in the risk assessment of manganese.
Boyes, William K
2010-01-01
Risk assessments of manganese by inhalation or oral routes of exposure typically acknowledge the duality of manganese as an essential element at low doses and a toxic metal at high doses. Previously, however, risk assessors were unable to describe manganese pharmacokinetics quantitatively across dose levels and routes of exposure, to account for mass balance, and to incorporate this information into a quantitative risk assessment. In addition, the prior risk assessment of inhaled manganese conducted by the U.S. Environmental Protection Agency (EPA) identified a number of specific factors that contributed to uncertainty in the risk assessment. In response to a petition regarding the use of a fuel additive containing manganese, methylcyclopentadienyl manganese tricarbonyl (MMT), the U.S. EPA developed a test rule under the U.S. Clean Air Act that required, among other things, the generation of pharmacokinetic information. This information was intended not only to aid in the design of health outcome studies, but also to help address uncertainties in the risk assessment of manganese. To date, the work conducted in response to the test rule has yielded substantial pharmacokinetic data. This information will enable the generation of physiologically based pharmacokinetic (PBPK) models capable of making quantitative predictions of tissue manganese concentrations following inhalation and oral exposure, across dose levels, and accounting for factors such as duration of exposure, different species of manganese, and changes of age, gender, and reproductive status. The work accomplished in response to the test rule, in combination with other scientific evidence, will enable future manganese risk assessments to consider tissue dosimetry more comprehensively than was previously possible.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
NASA Astrophysics Data System (ADS)
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-03-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster-Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty-sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights.
Kettler, Susanne; Kennedy, Marc; McNamara, Cronan; Oberdörfer, Regina; O'Mahony, Cian; Schnabel, Jürgen; Smith, Benjamin; Sprong, Corinne; Faludi, Roland; Tennant, David
2015-08-01
Uncertainty analysis is an important component of dietary exposure assessments in order to understand correctly the strength and limits of its results. Often, standard screening procedures are applied in a first step which results in conservative estimates. If through those screening procedures a potential exceedance of health-based guidance values is indicated, within the tiered approach more refined models are applied. However, the sources and types of uncertainties in deterministic and probabilistic models can vary or differ. A key objective of this work has been the mapping of different sources and types of uncertainties to better understand how to best use uncertainty analysis to generate more realistic comprehension of dietary exposure. In dietary exposure assessments, uncertainties can be introduced by knowledge gaps about the exposure scenario, parameter and the model itself. With this mapping, general and model-independent uncertainties have been identified and described, as well as those which can be introduced and influenced by the specific model during the tiered approach. This analysis identifies that there are general uncertainties common to point estimates (screening or deterministic methods) and probabilistic exposure assessment methods. To provide further clarity, general sources of uncertainty affecting many dietary exposure assessments should be separated from model-specific uncertainties. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.
The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and
Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...
Durability reliability analysis for corroding concrete structures under uncertainty
NASA Astrophysics Data System (ADS)
Zhang, Hao
2018-02-01
This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
Tian, Yuan; Hassmiller Lich, Kristen; Osgood, Nathaniel D; Eom, Kirsten; Matchar, David B
2016-11-01
As health services researchers and decision makers tackle more difficult problems using simulation models, the number of parameters and the corresponding degree of uncertainty have increased. This often results in reduced confidence in such complex models to guide decision making. To demonstrate a systematic approach of linked sensitivity analysis, calibration, and uncertainty analysis to improve confidence in complex models. Four techniques were integrated and applied to a System Dynamics stroke model of US veterans, which was developed to inform systemwide intervention and research planning: Morris method (sensitivity analysis), multistart Powell hill-climbing algorithm and generalized likelihood uncertainty estimation (calibration), and Monte Carlo simulation (uncertainty analysis). Of 60 uncertain parameters, sensitivity analysis identified 29 needing calibration, 7 that did not need calibration but significantly influenced key stroke outcomes, and 24 not influential to calibration or stroke outcomes that were fixed at their best guess values. One thousand alternative well-calibrated baselines were obtained to reflect calibration uncertainty and brought into uncertainty analysis. The initial stroke incidence rate among veterans was identified as the most influential uncertain parameter, for which further data should be collected. That said, accounting for current uncertainty, the analysis of 15 distinct prevention and treatment interventions provided a robust conclusion that hypertension control for all veterans would yield the largest gain in quality-adjusted life years. For complex health care models, a mixed approach was applied to examine the uncertainty surrounding key stroke outcomes and the robustness of conclusions. We demonstrate that this rigorous approach can be practical and advocate for such analysis to promote understanding of the limits of certainty in applying models to current decisions and to guide future data collection. © The Author(s) 2016.
Total Risk Integrated Methodology (TRIM) - TRIM.Risk
TRIM.Riskis used to integrate the information on exposure received from TRIM.FaTE or TRIM.Expo with that on dose-response or hazard assessment and to provide quantitative descriptions of risk or hazard and some of the attendant uncertainties.
NASA Astrophysics Data System (ADS)
Sussmann, Ralf; Reichert, Andreas; Rettinger, Markus
2016-09-01
Quantitative knowledge of water vapor radiative processes in the atmosphere throughout the terrestrial and solar infrared spectrum is still incomplete even though this is crucial input to the radiation codes forming the core of both remote sensing methods and climate simulations. Beside laboratory spectroscopy, ground-based remote sensing field studies in the context of so-called radiative closure experiments are a powerful approach because this is the only way to quantify water absorption under cold atmospheric conditions. For this purpose, we have set up at the Zugspitze (47.42° N, 10.98° E; 2964 m a.s.l.) a long-term radiative closure experiment designed to cover the infrared spectrum between 400 and 7800 cm-1 (1.28-25 µm). As a benefit for such experiments, the atmospheric states at the Zugspitze frequently comprise very low integrated water vapor (IWV; minimum = 0.1 mm, median = 2.3 mm) and very low aerosol optical depth (AOD = 0.0024-0.0032 at 7800 cm-1 at air mass 1). All instruments for radiance measurements and atmospheric-state measurements are described along with their measurement uncertainties. Based on all parameter uncertainties and the corresponding radiance Jacobians, a systematic residual radiance uncertainty budget has been set up to characterize the sensitivity of the radiative closure over the whole infrared spectral range. The dominant uncertainty contribution in the spectral windows used for far-infrared (FIR) continuum quantification is from IWV uncertainties, while T profile uncertainties dominate in the mid-infrared (MIR). Uncertainty contributions to near-infrared (NIR) radiance residuals are dominated by water vapor line parameters in the vicinity of the strong water vapor bands. The window regions in between these bands are dominated by solar Fourier transform infrared (FTIR) calibration uncertainties at low NIR wavenumbers, while uncertainties due to AOD become an increasing and dominant contribution towards higher NIR wavenumbers. Exceptions are methane or nitrous oxide bands in the NIR, where the associated line parameter uncertainties dominate the overall uncertainty. As a first demonstration of the Zugspitze closure experiment, a water vapor continuum quantification in the FIR spectral region (400-580 cm-1) has been performed. The resulting FIR foreign-continuum coefficients are consistent with the MT_CKD 2.5.2 continuum model and also agree with the most recent atmospheric closure study carried out in Antarctica. Results from the first determination of the NIR water vapor continuum in a field experiment are detailed in a companion paper (Reichert and Sussmann, 2016) while a novel NIR calibration scheme for the underlying FTIR measurements of incoming solar radiance is presented in another companion paper (Reichert et al., 2016).
Satellite stratospheric aerosol measurement validation
NASA Technical Reports Server (NTRS)
Russell, P. B.; Mccormick, M. P.
1984-01-01
The validity of the stratospheric aerosol measurements made by the satellite sensors SAM II and SAGE was tested by comparing their results with each other and with results obtained by other techniques (lider, dustsonde, filter, and impactor). The latter type of comparison required the development of special techniques that convert the quantity measured by the correlative sensor (e.g. particle backscatter, number, or mass) to that measured by the satellite sensor (extinction) and quantitatively estimate the uncertainty in the conversion process. The results of both types of comparisons show agreement within the measurement and conversion uncertainties. Moreover, the satellite uncertainty is small compared to aerosol natural variability (caused by seasonal changes, volcanoes, sudden warmings, and vortex structure). It was concluded that the satellite measurements are valid.
Error Estimation and Uncertainty Propagation in Computational Fluid Mechanics
NASA Technical Reports Server (NTRS)
Zhu, J. Z.; He, Guowei; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
Numerical simulation has now become an integral part of engineering design process. Critical design decisions are routinely made based on the simulation results and conclusions. Verification and validation of the reliability of the numerical simulation is therefore vitally important in the engineering design processes. We propose to develop theories and methodologies that can automatically provide quantitative information about the reliability of the numerical simulation by estimating numerical approximation error, computational model induced errors and the uncertainties contained in the mathematical models so that the reliability of the numerical simulation can be verified and validated. We also propose to develop and implement methodologies and techniques that can control the error and uncertainty during the numerical simulation so that the reliability of the numerical simulation can be improved.
Diagnosing Model Errors in Simulations of Solar Radiation on Inclined Surfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit
2016-06-01
Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined PV panels. Following numerous studies comparing the performance of transposition models, this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty. Our results suggest that an isotropic transposition model developed by Badescu substantially underestimates diffuse plane-of-array (POA) irradiances when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of empirical coefficients and land surface albedo can both result in uncertainty in the output. This study can be used as amore » guide for future development of physics-based transposition models.« less
Diagnosing Model Errors in Simulation of Solar Radiation on Inclined Surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Yu; Sengupta, Manajit
2016-11-21
Transposition models have been widely used in the solar energy industry to simulate solar radiation on inclined PV panels. Following numerous studies comparing the performance of transposition models, this paper aims to understand the quantitative uncertainty in the state-of-the-art transposition models and the sources leading to the uncertainty. Our results show significant differences between two highly used isotropic transposition models with one substantially underestimating the diffuse plane-of-array (POA) irradiances when diffuse radiation is perfectly isotropic. In the empirical transposition models, the selection of empirical coefficients and land surface albedo can both result in uncertainty in the output. This study canmore » be used as a guide for future development of physics-based transposition models.« less
Uncertainty in monitoring E. coli concentrations in streams and stormwater runoff
NASA Astrophysics Data System (ADS)
Harmel, R. D.; Hathaway, J. M.; Wagner, K. L.; Wolfe, J. E.; Karthikeyan, R.; Francesconi, W.; McCarthy, D. T.
2016-03-01
Microbial contamination of surface waters, a substantial public health concern throughout the world, is typically identified by fecal indicator bacteria such as Escherichia coli. Thus, monitoring E. coli concentrations is critical to evaluate current conditions, determine restoration effectiveness, and inform model development and calibration. An often overlooked component of these monitoring and modeling activities is understanding the inherent random and systematic uncertainty present in measured data. In this research, a review and subsequent analysis was performed to identify, document, and analyze measurement uncertainty of E. coli data collected in stream flow and stormwater runoff as individual discrete samples or throughout a single runoff event. Data on the uncertainty contributed by sample collection, sample preservation/storage, and laboratory analysis in measured E. coli concentrations were compiled and analyzed, and differences in sampling method and data quality scenarios were compared. The analysis showed that: (1) manual integrated sampling produced the lowest random and systematic uncertainty in individual samples, but automated sampling typically produced the lowest uncertainty when sampling throughout runoff events; (2) sample collection procedures often contributed the highest amount of uncertainty, although laboratory analysis introduced substantial random uncertainty and preservation/storage introduced substantial systematic uncertainty under some scenarios; and (3) the uncertainty in measured E. coli concentrations was greater than that of sediment and nutrients, but the difference was not as great as may be assumed. This comprehensive analysis of uncertainty in E. coli concentrations measured in streamflow and runoff should provide valuable insight for designing E. coli monitoring projects, reducing uncertainty in quality assurance efforts, regulatory and policy decision making, and fate and transport modeling.
Kim, Jung Hyeun; Mulholland, George W.; Kukuck, Scott R.; Pui, David Y. H.
2005-01-01
The slip correction factor has been investigated at reduced pressures and high Knudsen number using polystyrene latex (PSL) particles. Nano-differential mobility analyzers (NDMA) were used in determining the slip correction factor by measuring the electrical mobility of 100.7 nm, 269 nm, and 19.90 nm particles as a function of pressure. The aerosol was generated via electrospray to avoid multiplets for the 19.90 nm particles and to reduce the contaminant residue on the particle surface. System pressure was varied down to 8.27 kPa, enabling slip correction measurements for Knudsen numbers as large as 83. A condensation particle counter was modified for low pressure application. The slip correction factor obtained for the three particle sizes is fitted well by the equation: C = 1 + Kn (α + β exp(−γ/Kn)), with α = 1.165, β = 0.483, and γ = 0.997. The first quantitative uncertainty analysis for slip correction measurements was carried out. The expanded relative uncertainty (95 % confidence interval) in measuring slip correction factor was about 2 % for the 100.7 nm SRM particles, about 3 % for the 19.90 nm PSL particles, and about 2.5 % for the 269 nm SRM particles. The major sources of uncertainty are the diameter of particles, the geometric constant associated with NDMA, and the voltage. PMID:27308102
Fuzzy approaches to supplier selection problem
NASA Astrophysics Data System (ADS)
Ozkok, Beyza Ahlatcioglu; Kocken, Hale Gonce
2013-09-01
Supplier selection problem is a multi-criteria decision making problem which includes both qualitative and quantitative factors. In the selection process many criteria may conflict with each other, therefore decision-making process becomes complicated. In this study, we handled the supplier selection problem under uncertainty. In this context; we used minimum criterion, arithmetic mean criterion, regret criterion, optimistic criterion, geometric mean and harmonic mean. The membership functions created with the help of the characteristics of used criteria, and we tried to provide consistent supplier selection decisions by using these memberships for evaluating alternative suppliers. During the analysis, no need to use expert opinion is a strong aspect of the methodology used in the decision-making.
An uncertainty analysis of wildfire modeling [Chapter 13
Karin Riley; Matthew Thompson
2017-01-01
Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...
Analytic uncertainty and sensitivity analysis of models with input correlations
NASA Astrophysics Data System (ADS)
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters are often happened in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples suggest the effectiveness and validation of our analytic method in the analysis of general models. A practical application of the method is also proposed to the uncertainty and sensitivity analysis of a deterministic HIV model.
Using Aoristic Analysis to Link Remote and Ground-Level Phenological Observations
NASA Astrophysics Data System (ADS)
Henebry, G. M.
2013-12-01
Phenology is about observing events in time and space. With the advent of publically accessible geospatial datastreams and easy to use mapping software, specifying where an event occurs is much less of a challenge than it was just two decades ago. In contrast, specifying when an event occurs remains a nontrivial function of a population of organismal responses, sampling interval, compositing period, and reporting precision. I explore how aoristic analysis can be used to analyzing spatiotemporal events for which the location is known to acceptable levels of precision but for which temporal coordinates are poorly specified or only partially bounded. Aoristic analysis was developed in the late 1990s in the field of quantitative criminology to leverage temporally imprecise geospatial data of crime reports. Here I demonstrate how aoristic analysis can be used to link remotely sensed observations of land surface phenology to ground-level observations of organismal phenophase transitions. Explicit representation of the windows of temporal uncertainty with aoristic weights enables cross-validation exercises and forecasting efforts to avoid false precision.
The fundamental parameter method applied to X-ray fluorescence analysis with synchrotron radiation
NASA Astrophysics Data System (ADS)
Pantenburg, F. J.; Beier, T.; Hennrich, F.; Mommsen, H.
1992-05-01
Quantitative X-ray fluorescence analysis applying the fundamental parameter method is usually restricted to monochromatic excitation sources. It is shown here, that such analyses can be performed as well with a white synchrotron radiation spectrum. To determine absolute elemental concentration values it is necessary to know the spectral distribution of this spectrum. A newly designed and tested experimental setup, which uses the synchrotron radiation emitted from electrons in a bending magnet of ELSA (electron stretcher accelerator of the university of Bonn) is presented. The determination of the exciting spectrum, described by the given electron beam parameters, is limited due to uncertainties in the vertical electron beam size and divergence. We describe a method which allows us to determine the relative and absolute spectral distributions needed for accurate analysis. First test measurements of different alloys and standards of known composition demonstrate that it is possible to determine exact concentration values in bulk and trace element analysis.
The application of decision analysis to life support research and technology development
NASA Technical Reports Server (NTRS)
Ballin, Mark G.
1994-01-01
Applied research and technology development is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Decision making regarding which technologies to advance and what resources to devote to them is a challenging but essential task. In the application of life support technology to future manned space flight, new technology concepts typically are characterized by nonexistent data and rough approximations of technology performance, uncertain future flight program needs, and a complex, time-intensive process to develop technology to a flight-ready status. Decision analysis is a quantitative, logic-based discipline that imposes formalism and structure to complex problems. It also accounts for the limits of knowledge that may be available at the time a decision is needed. The utility of decision analysis to life support technology R & D was evaluated by applying it to two case studies. The methodology was found to provide insight that is not possible from more traditional analysis approaches.
Model parameter uncertainty analysis for an annual field-scale P loss model
NASA Astrophysics Data System (ADS)
Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie
2016-08-01
Phosphorous (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
Uncertainties in s-process nucleosynthesis in massive stars determined by Monte Carlo variations
NASA Astrophysics Data System (ADS)
Nishimura, N.; Hirschi, R.; Rauscher, T.; St. J. Murphy, A.; Cescutti, G.
2017-08-01
The s-process in massive stars produces the weak component of the s-process (nuclei up to A ˜ 90), in amounts that match solar abundances. For heavier isotopes, such as barium, production through neutron capture is significantly enhanced in very metal-poor stars with fast rotation. However, detailed theoretical predictions for the resulting final s-process abundances have important uncertainties caused both by the underlying uncertainties in the nuclear physics (principally neutron-capture reaction and β-decay rates) as well as by the stellar evolution modelling. In this work, we investigated the impact of nuclear-physics uncertainties relevant to the s-process in massive stars. Using a Monte Carlo based approach, we performed extensive nuclear reaction network calculations that include newly evaluated upper and lower limits for the individual temperature-dependent reaction rates. We found that most of the uncertainty in the final abundances is caused by uncertainties in the neutron-capture rates, while β-decay rate uncertainties affect only a few nuclei near s-process branchings. The s-process in rotating metal-poor stars shows quantitatively different uncertainties and key reactions, although the qualitative characteristics are similar. We confirmed that our results do not significantly change at different metallicities for fast rotating massive stars in the very low metallicity regime. We highlight which of the identified key reactions are realistic candidates for improved measurement by future experiments.
Chen, Xing; Zhang, Xing
2016-01-01
Despite the importance of adoption of mobile health services by an organization on the diffusion of mobile technology in the big data era, it has received minimal attention in literature. This study investigates how relative advantage and perceived credibility affect an organization's adoption of mobile health services, as well as how environmental uncertainty changes the relationship of relative advantage and perceived credibility with adoption. A research model that integrates relative advantage, perceived credibility, environmental uncertainty, and an organization's intention to use mobile health service is developed. Quantitative data are collected from senior managers and information systems managers in 320 Chinese healthcare organizations. The empirical findings show that while relative advantage and perceived credibility both have positive effects on an organization's intention to use mobile health services, relative advantage plays a more important role than perceived credibility. Moreover, environmental uncertainty positively moderates the effect of relative advantage on an organization's adoption of mobile health services. Thus, mobile health services in environments characterized with high levels of uncertainty are more likely to be adopted because of relative advantage than in environments with low levels of uncertainty.
Lombardi, A M
2017-09-18
Stochastic models provide quantitative evaluations about the occurrence of earthquakes. A basic component of this type of models are the uncertainties in defining main features of an intrinsically random process. Even if, at a very basic level, any attempting to distinguish between types of uncertainty is questionable, an usual way to deal with this topic is to separate epistemic uncertainty, due to lack of knowledge, from aleatory variability, due to randomness. In the present study this problem is addressed in the narrow context of short-term modeling of earthquakes and, specifically, of ETAS modeling. By mean of an application of a specific version of the ETAS model to seismicity of Central Italy, recently struck by a sequence with a main event of Mw6.5, the aleatory and epistemic (parametric) uncertainty are separated and quantified. The main result of the paper is that the parametric uncertainty of the ETAS-type model, adopted here, is much lower than the aleatory variability in the process. This result points out two main aspects: an analyst has good chances to set the ETAS-type models, but he may retrospectively describe and forecast the earthquake occurrences with still limited precision and accuracy.
Bayesian networks improve causal environmental ...
Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on value
NASA Technical Reports Server (NTRS)
Radespiel, Rolf; Hemsch, Michael J.
2007-01-01
The complexity of modern military systems, as well as the cost and difficulty associated with experimentally verifying system and subsystem design makes the use of high-fidelity based simulation a future alternative for design and development. The predictive ability of such simulations such as computational fluid dynamics (CFD) and computational structural mechanics (CSM) have matured significantly. However, for numerical simulations to be used with confidence in design and development, quantitative measures of uncertainty must be available. The AVT 147 Symposium has been established to compile state-of-the art methods of assessing computational uncertainty, to identify future research and development needs associated with these methods, and to present examples of how these needs are being addressed and how the methods are being applied. Papers were solicited that address uncertainty estimation associated with high fidelity, physics-based simulations. The solicitation included papers that identify sources of error and uncertainty in numerical simulation from either the industry perspective or from the disciplinary or cross-disciplinary research perspective. Examples of the industry perspective were to include how computational uncertainty methods are used to reduce system risk in various stages of design or development.
2016-01-01
Despite the importance of adoption of mobile health services by an organization on the diffusion of mobile technology in the big data era, it has received minimal attention in literature. This study investigates how relative advantage and perceived credibility affect an organization's adoption of mobile health services, as well as how environmental uncertainty changes the relationship of relative advantage and perceived credibility with adoption. A research model that integrates relative advantage, perceived credibility, environmental uncertainty, and an organization's intention to use mobile health service is developed. Quantitative data are collected from senior managers and information systems managers in 320 Chinese healthcare organizations. The empirical findings show that while relative advantage and perceived credibility both have positive effects on an organization's intention to use mobile health services, relative advantage plays a more important role than perceived credibility. Moreover, environmental uncertainty positively moderates the effect of relative advantage on an organization's adoption of mobile health services. Thus, mobile health services in environments characterized with high levels of uncertainty are more likely to be adopted because of relative advantage than in environments with low levels of uncertainty. PMID:28115932
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
NASA Astrophysics Data System (ADS)
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
Fast Paleogene Motion of the Pacific Hotspots from Revised Global Plate Circuit Constraints
NASA Technical Reports Server (NTRS)
Raymond, C.; Stock, J.; Cande, S.
2000-01-01
Major improvements in late Cretaceous-early Tertiary Pacific-Antarctica plate reconstructions, and new East-West Antarctica rotations, allow a more definitive test of the relative motion between hotspots using global plate circuit reconstructions with quantitative uncertainties.
Numerical Uncertainty Quantification for Radiation Analysis Tools
NASA Technical Reports Server (NTRS)
Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha
2007-01-01
Recently a new emphasis has been placed on engineering applications of space radiation analyses and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing, as the number of rays increase the associated uncertainty decreases, but the computational expense increases. Thus, a cost benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is what is the number of thicknesses that is needed to get an accurate result. So convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
On the contribution of PRIDE-JUICE to Jovian system ephemerides
NASA Astrophysics Data System (ADS)
Dirkx, D.; Gurvits, L. I.; Lainey, V.; Lari, G.; Milani, A.; Cimò, G.; Bocanegra-Bahamon, T. M.; Visser, P. N. A. M.
2017-11-01
The Jupiter Icy Moons Explorer (JUICE) mission will perform detailed measurements of the properties of the Galilean moons, with a nominal in-system science-mission duration of about 3.5 years. Using both the radio tracking data, and (Earth- and JUICE-based) optical astrometry, the dynamics of the Galilean moons will be measured to unprecedented accuracy. This will provide crucial input to the determination of the ephemerides and physical properties of the system, most notably the dissipation in Io and Jupiter. The data from Planetary Radio Interferometry and Doppler Experiment (PRIDE) will provide the lateral position of the spacecraft in the International Celestial Reference Frame (ICRF). In this article, we analyze the relative quantitative influence of the JUICE-PRIDE observables to the determination of the ephemerides of the Jovian system and the associated physical parameters. We perform a covariance analysis for a broad range of mission and system characteristics. We analyze the influence of VLBI data quality, observation planning, as well as the influence of JUICE orbit determination quality. This provides key input for the further development of the PRIDE observational planning and ground segment development. Our analysis indicates that the VLBI data are especially important for constraining the dynamics of Ganymede and Callisto perpendicular to their orbital planes. Also, the use of the VLBI data makes the uncertainty in the ephemerides less dependent on the error in the orbit determination of the JUICE spacecraft itself. Furthermore, we find that optical astrometry data of especially Io using the JANUS instrument will be crucial for stabilizing the solution of the normal equations. Knowledge of the dissipation in the Jupiter system cannot be improved using satellite dynamics obtained from JUICE data alone, the uncertainty in Io's dissipation obtained from our simulations is similar to the present level of uncertainty.
Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty
NASA Astrophysics Data System (ADS)
Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.
2012-12-01
Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
NASA Lewis Research Center Futuring Workshop
NASA Technical Reports Server (NTRS)
Boroush, Mark; Stover, John; Thomas, Charles
1987-01-01
On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during this workshop were: information sources; judgmental techniques; quantitative techniques; merger of judgment with quantitative measurement; data collection methods; and dealing with uncertainty.
NASA Astrophysics Data System (ADS)
Han, Feng; Zheng, Yi
2018-06-01
Significant Input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address the input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead the modeling based management decisions, if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
Landslide Risk: Economic Valuation in The North-Eastern Zone of Medellin City
NASA Astrophysics Data System (ADS)
Vega, Johnny Alexander; Hidalgo, César Augusto; Johana Marín, Nini
2017-10-01
Natural disasters of a geodynamic nature can cause enormous economic and human losses. The economic costs of a landslide disaster include relocation of communities and physical repair of urban infrastructure. However, when performing a quantitative risk analysis, generally, the indirect economic consequences of such an event are not taken into account. A probabilistic approach methodology that considers several scenarios of hazard and vulnerability to measure the magnitude of the landslide and to quantify the economic costs is proposed. With this approach, it is possible to carry out a quantitative evaluation of the risk by landslides, allowing the calculation of the economic losses before a potential disaster in an objective, standardized and reproducible way, taking into account the uncertainty of the building costs in the study zone. The possibility of comparing different scenarios facilitates the urban planning process, the optimization of interventions to reduce risk to acceptable levels and an assessment of economic losses according to the magnitude of the damage. For the development and explanation of the proposed methodology, a simple case study is presented, located in north-eastern zone of the city of Medellín. This area has particular geomorphological characteristics, and it is also characterized by the presence of several buildings in bad structural conditions. The proposed methodology permits to obtain an estimative of the probable economic losses by earthquake-induced landslides, taking into account the uncertainty of the building costs in the study zone. The obtained estimative shows that the structural intervention of the buildings produces a reduction the order of 21 % in the total landslide risk.
Proterozoic Milankovitch cycles and the history of the solar system.
Meyers, Stephen R; Malinverno, Alberto
2018-06-19
The geologic record of Milankovitch climate cycles provides a rich conceptual and temporal framework for evaluating Earth system evolution, bestowing a sharp lens through which to view our planet's history. However, the utility of these cycles for constraining the early Earth system is hindered by seemingly insurmountable uncertainties in our knowledge of solar system behavior (including Earth-Moon history), and poor temporal control for validation of cycle periods (e.g., from radioisotopic dates). Here we address these problems using a Bayesian inversion approach to quantitatively link astronomical theory with geologic observation, allowing a reconstruction of Proterozoic astronomical cycles, fundamental frequencies of the solar system, the precession constant, and the underlying geologic timescale, directly from stratigraphic data. Application of the approach to 1.4-billion-year-old rhythmites indicates a precession constant of 85.79 ± 2.72 arcsec/year (2σ), an Earth-Moon distance of 340,900 ± 2,600 km (2σ), and length of day of 18.68 ± 0.25 hours (2σ), with dominant climatic precession cycles of ∼14 ky and eccentricity cycles of ∼131 ky. The results confirm reduced tidal dissipation in the Proterozoic. A complementary analysis of Eocene rhythmites (∼55 Ma) illustrates how the approach offers a means to map out ancient solar system behavior and Earth-Moon history using the geologic archive. The method also provides robust quantitative uncertainties on the eccentricity and climatic precession periods, and derived astronomical timescales. As a consequence, the temporal resolution of ancient Earth system processes is enhanced, and our knowledge of early solar system dynamics is greatly improved.
CCQM-K102: polybrominated diphenyl ethers in sediment
NASA Astrophysics Data System (ADS)
Ricci, Marina; Shegunova, Penka; Conneely, Patrick; Becker, Roland; Maldonado Torres, Mauricio; Arce Osuna, Mariana; On, Tang Po; Man, Lee Ho; Baek, Song-Yee; Kim, Byungjoo; Hopley, Christopher; Liscio, Camilla; Warren, John; Le Diouron, Véronique; Lardy-Fontan, Sophie; Lalere, Béatrice; Mingwu, Shao; Kucklick, John; Vamathevan, Veronica; Matsuyama, Shigetomo; Numata, Masahiko; Brits, Martin; Quinn, Laura; Fernandes-Whaley, Maria; Ceyhan Gören, Ahmet; Binici, Burcu; Konopelko, Leonid; Krylov, Anatoli; Mikheeva, Alena
2017-01-01
The key comparison CCQM-K102: Polybrominated diphenyl ethers in sediment was coordinated by the JRC, Directorate F - Health, Consumers & Reference Materials, Geel (Belgium) under the auspices of the Organic Analysis Working Group (OAWG) of the Comité Consultatif pour la Quantité de Matière (CCQM). Thirteen National Metrology institutes or Designated Institutes and the JRC participated. Participants were requested to report the mass fraction (on a dry mass basis) of BDE 47, 99 and 153 in the freshwater sediment study material. The sediment originated from a river in Belgium and contained PBDEs (and other pollutants) at levels commonly found in environmental samples. The comparison was designed to demonstrate participants' capability of analysing non-polar organic molecules in abiotic dried matrices (approximate range of molecular weights: 100 to 800 g/mol, polarity corresponding to pKow < -2, range of mass fraction: 1-1000 μg/kg). All participants (except one using ultrasonic extraction) applied Pressurised Liquid Extraction or Soxhlet, while the instrumental analysis was performed with GC-MS/MS, GC-MS or GC-HRMS. Isotope Dilution Mass Spectrometry approach was used for quantification (except in one case). The assigned Key Comparison Reference Values (KCRVs) were the medians of thirteen results for BDE 47 and eleven results for BDE 99 and 153, respectively. BDE 47 was assigned a KCRV of 15.60 μg/kg with a combined standard uncertainty of 0.41 μg/kg, BDE 99 was assigned a KCRV of 33.69 μg/kg with a combined standard uncertainty of 0.81 μg/kg and BDE 153 was assigned a KCRV of 6.28 μg/kg with a combined standard uncertainty of 0.28 μg/kg. The k-factor for the estimation of the expanded uncertainty of the KCRVs was chosen as k = 2. The degree of equivalence (with the KCRV) and its uncertainty were calculated for each result. Most of the participants to CCQM-K102 were able to demonstrate or confirm their capabilities in the analysis of non-polar organic molecules in abiotic dried matrices. Throughout the study it became clear that matrix interferences can influence the accurate quantification of the PBDEs, if the analytical methodology applied is not appropriately adapted and optimised. This comparison shows that quantification of PBDEs at the μg/kg low-middle range in a challenging environmental abiotic dried matrix can be achieved with relative expanded uncertainties below 15 % (more than 70 % of participating laboratories), well in line with the best measurement performances in the environmental analysis field. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
Hunt, Randall J.
2012-01-01
Management decisions will often be directly informed by model predictions. However, we now know there can be no expectation of a single ‘true’ model; thus, model results are uncertain. Understandable reporting of underlying uncertainty provides necessary context to decision-makers, as model results are used for management decisions. This, in turn, forms a mechanism by which groundwater models inform a risk-management framework because uncertainty around a prediction provides the basis for estimating the probability or likelihood of some event occurring. Given that the consequences of management decisions vary, it follows that the extent of and resources devoted to an uncertainty analysis may depend on the consequences. For events with low impact, a qualitative, limited uncertainty analysis may be sufficient for informing a decision. For events with a high impact, on the other hand, the risks might be better assessed and associated decisions made using a more robust and comprehensive uncertainty analysis. The purpose of this chapter is to provide guidance on uncertainty analysis through discussion of concepts and approaches, which can vary from heuristic (i.e. the modeller’s assessment of prediction uncertainty based on trial and error and experience) to a comprehensive, sophisticated, statistics-based uncertainty analysis. Most of the material presented here is taken from Doherty et al. (2010) if not otherwise cited. Although the treatment here is necessarily brief, the reader can find citations for the source material and additional references within this chapter.
NASA Astrophysics Data System (ADS)
Rohmer, Jeremy; Verdel, Thierry
2017-04-01
Uncertainty analysis is an unavoidable task of stability analysis of any geotechnical systems. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold), the failure is possible). The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab or in site survey), improving the measurement methods or evaluating calculation procedure with model tests, confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis usually restricts to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e.g., Baudrit et al., 2007) for geo-hazard assessments. A graphical tool is then developed to explore: 1. the contribution of both types of uncertainty, aleatoric and epistemic; 2. the regions of the imprecise or random parameters which contribute the most to the imprecision on the failure probability P. The method is applied on two case studies (a mine pillar and a steep slope stability analysis, Rohmer and Verdel, 2014) to investigate the necessity for extra data acquisition on parameters whose imprecision can hardly be modelled by probabilities due to the scarcity of the available information (respectively the extraction ratio and the cliff geometry). References Baudrit, C., Couso, I., & Dubois, D. (2007). Joint propagation of probability and possibility in risk analysis: Towards a formal framework. International Journal of Approximate Reasoning, 45(1), 82-105. Rohmer, J., & Verdel, T. (2014). Joint exploration of regional importance of possibilistic and probabilistic uncertainty in stability analysis. Computers and Geotechnics, 61, 308-315.
NASA Technical Reports Server (NTRS)
Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.
2012-01-01
There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to -validate the . uncertainty in using CFD. The method will use the state of the art uncertainty analysis applying different turbulence niodels and draw conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.
DECISION-COMPONENTS OF NICE'S TECHNOLOGY APPRAISALS ASSESSMENT FRAMEWORK.
de Folter, Joost; Trusheim, Mark; Jonsson, Pall; Garner, Sarah
2018-01-01
Value assessment frameworks have gained prominence recently in the context of U.S. healthcare. Such frameworks set out a series of factors that are considered in funding decisions. The UK's National Institute of Health and Care Excellence (NICE) is an established health technology assessment (HTA) agency. We present a novel application of text analysis that characterizes NICE's Technology Appraisals in the context of the newer assessment frameworks and present the results in a visual way. A total of 243 documents of NICE's medicines guidance from 2007 to 2016 were analyzed. Text analysis was used to identify a hierarchical set of decision factors considered in the assessments. The frequency of decision factors stated in the documents was determined and their association with terms related to uncertainty. The results were incorporated into visual representations of hierarchical factors. We identified 125 decision factors, and hierarchically grouped these into eight domains: Clinical Effectiveness, Cost Effectiveness, Condition, Current Practice, Clinical Need, New Treatment, Studies, and Other Factors. Textual analysis showed all domains appeared consistently in the guidance documents. Many factors were commonly associated with terms relating to uncertainty. A series of visual representations was created. This study reveals the complexity and consistency of NICE's decision-making processes and demonstrates that cost effectiveness is not the only decision-criteria. The study highlights the importance of processes and methodology that can take both quantitative and qualitative information into account. Visualizations can help effectively communicate this complex information during the decision-making process and subsequently to stakeholders.
NASA Astrophysics Data System (ADS)
Peres, David J.; Cancelliere, Antonino; Greco, Roberto; Bogaard, Thom A.
2018-03-01
Uncertainty in rainfall datasets and landslide inventories is known to have negative impacts on the assessment of landslide-triggering thresholds. In this paper, we perform a quantitative analysis of the impacts of uncertain knowledge of landslide initiation instants on the assessment of rainfall intensity-duration landslide early warning thresholds. The analysis is based on a synthetic database of rainfall and landslide information, generated by coupling a stochastic rainfall generator and a physically based hydrological and slope stability model, and is therefore error-free in terms of knowledge of triggering instants. This dataset is then perturbed according to hypothetical reporting scenarios
that allow simulation of possible errors in landslide-triggering instants as retrieved from historical archives. The impact of these errors is analysed jointly using different criteria to single out rainfall events from a continuous series and two typical temporal aggregations of rainfall (hourly and daily). The analysis shows that the impacts of the above uncertainty sources can be significant, especially when errors exceed 1 day or the actual instants follow the erroneous ones. Errors generally lead to underestimated thresholds, i.e. lower than those that would be obtained from an error-free dataset. Potentially, the amount of the underestimation can be enough to induce an excessive number of false positives, hence limiting possible landslide mitigation benefits. Moreover, the uncertain knowledge of triggering rainfall limits the possibility to set up links between thresholds and physio-geographical factors.
Facility Measurement Uncertainty Analysis at NASA GRC
NASA Technical Reports Server (NTRS)
Stephens, Julia; Hubbard, Erin
2016-01-01
This presentation provides and overview of the measurement uncertainty analysis currently being implemented in various facilities at NASA GRC. This presentation includes examples pertinent to the turbine engine community (mass flow and fan efficiency calculation uncertainties.
ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING
The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...
Ecological periodic tables are an information organizing system. Their elements are categorical habitat types. Their attributes are quantitative, predictably recurring (periodic) properties of a target biotic community. Since they translate habitats as inputs into measures of ...
Strauss, Lukas; Serafin, Stefano; Haimov, Samuel; Grubišić, Vanda
2015-10-01
Atmospheric turbulence generated in flow over mountainous terrain is studied using airborne in situ and cloud radar measurements over the Medicine Bow Mountains in southeast Wyoming, USA. During the NASA Orographic Clouds Experiment (NASA06) in 2006, two complex mountain flow cases were documented by the University of Wyoming King Air research aircraft carrying the Wyoming Cloud Radar. The structure of turbulence and its intensity across the mountain range are described using the variance of vertical velocity σw2 and the cube root of the energy dissipation rate ɛ 1/3 (EDR). For a quantitative analysis of turbulence from the cloud radar, the uncertainties in the Doppler wind retrieval have to be taken into account, such as the variance of hydrometeor fall speed and the contamination of vertical Doppler velocity by the horizontal wind. A thorough analysis of the uncertainties shows that 25% accuracy or better can be achieved in regions of moderate to severe turbulence in the lee of the mountains, while only qualitative estimates of turbulence intensity can be obtained outside the most turbulent regions. Two NASA06 events exhibiting large-amplitude mountain waves, mid-tropospheric wave breaking, and rotor circulations are examined. Moderate turbulence is found in a wave-breaking region with σw2 and EDR reaching 4.8 m 2 s -2 and 0.25 m 2/3 s -1 , respectively. Severe turbulence is measured within the rotor circulations with σw2 and EDR respectively in the ranges of 7.8-16.4 m 2 s -2 and 0.50-0.77 m 2/3 s -1 . A unique result of this study is the quantitative estimation of the intensity of turbulence and its spatial distribution in the interior of atmospheric rotors, provided by the radar-derived turbulence fields.
NASA Astrophysics Data System (ADS)
Miller, S. D.; Combs, C.; Wagner, S.; Viticchiè, B.; Walther, A.; Solbrig, J.
2014-12-01
The VIIRS Day-Night Band provides the first calibrated observations of nocturnal low-light visible/near-infrared (~500-900 nm response, 710 nm central wavelength) radiances, including reflected moonlight down to values of 3 × 10-5 W·m-2·sr-1. These novel measurements afford the first opportunity to attempt nighttime retrievals of optical depth for optically thick clouds when moonlight is available, thereby advancing our ability to observe the diurnal cycle of such structures as marine stratocumuli which are thought to play an important role in determining climate and climate feedbacks. In order to leverage the Day-Night Band measurements in this capacity, we must first convert the upwelling top-of-atmosphere radiances to equivalent values of reflectance. Doing so requires a detailed knowledge of the down-welling top-of-atmosphere lunar spectral irradiance which, unlike sunlight, varies significantly over the course of the ~29.5 day lunar cycle. This research summarizes the ongoing development, validation, and refinement of a lunar irradiance model designed to convert Day-Night Band radiances to equivalent lunar reflectance. Comparisons between daytime and nighttime Day-Night Band reflectance for vicarious calibration targets offering radiometric stability (e.g., White Sands, Salar de Uyuni, Dome-C, and snow fields) confirms the model's performance to within an expected ~10% uncertainty. An observed lunar-phase-dependent trend associated with the model's assumption of a disk-averaged albedo was addressed via analysis of a version of the model adapted for comparison against Meteosat Second Generation SEVIRI lunar measurements. The analysis resulted in a phase-dependent 6th order polynomial correction to the model and expected model uncertainty improvements to within ~5%. Examples of lunar reflectance imagery for operational applications and the provisional quantitative application of Day-Night Band lunar reflectance to nighttime cloud optical property retrievals, bearing relevance to the diurnally resolved global climate data record, are shown.
NASA Astrophysics Data System (ADS)
Morin, Efrat; Marra, Francesco; Peleg, Nadav; Mei, Yiwen; Anagnostou, Emmanouil N.
2017-04-01
Rainfall frequency analysis is used to quantify the probability of occurrence of extreme rainfall and is traditionally based on rain gauge records. The limited spatial coverage of rain gauges is insufficient to sample the spatiotemporal variability of extreme rainfall and to provide the areal information required by management and design applications. Conversely, remote sensing instruments, even if quantitative uncertain, offer coverage and spatiotemporal detail that allow overcoming these issues. In recent years, remote sensing datasets began to be used for frequency analyses, taking advantage of increased record lengths and quantitative adjustments of the data. However, the studies so far made use of concepts and techniques developed for rain gauge (i.e. point or multiple-point) data and have been validated by comparison with gauge-derived analyses. These procedures add further sources of uncertainty and prevent from isolating between data and methodological uncertainties and from fully exploiting the available information. In this study, we step out of the gauge-centered concept presenting a direct comparison between at-site Intensity-Duration-Frequency (IDF) curves derived from different remote sensing datasets on corresponding spatial scales, temporal resolutions and records. We analyzed 16 years of homogeneously corrected and gauge-adjusted C-Band weather radar estimates, high-resolution CMORPH and gauge-adjusted high-resolution CMORPH over the Eastern Mediterranean. Results of this study include: (a) good spatial correlation between radar and satellite IDFs ( 0.7 for 2-5 years return period); (b) consistent correlation and dispersion in the raw and gauge adjusted CMORPH; (c) bias is almost uniform with return period for 12-24 h durations; (d) radar identifies thicker tail distributions than CMORPH and the tail of the distributions depends on the spatial and temporal scales. These results demonstrate the potential of remote sensing datasets for rainfall frequency analysis for management (e.g. warning and early-warning systems) and design (e.g. sewer design, large scale drainage planning)
U.S. Eastern Continental Shelf Carbon Cycling (USECoS): Modeling, Data Assimilation, and Analysis
NASA Technical Reports Server (NTRS)
Mannino, Antonio
2008-01-01
Although the oceans play a major role in the uptake of fossil fuel CO2 from the atmosphere, there is much debate about the contribution from continental shelves, since many key shelf fluxes are not yet well quantified: the exchange of carbon across the land-ocean and shelf-slope interfaces, air-sea exchange of CO2, burial, and biological processes including productivity. Our goal is to quantify these carbon fluxes along the eastern U.S. coast using models quantitatively verified by comparison to observations, and to establish a framework for predicting how these fluxes may be modified as a result of climate and land use change. Our research questions build on those addressed with previous NASA funding for the USECoS (U.S. Eastern Continental Shelf Carbon Cycling) project. We have developed a coupled biogeochemical ocean circulation model configured for this study region and have extensively evaluated this model with both in situ and remotely-sensed data. Results indicate that to further reduce uncertainties in the shelf component of the global carbon cycle, future efforts must be directed towards 1) increasing the resolution of the physical model via nesting and 2) making refinements to the biogeochemical model and quantitatively evaluating these via the assimilation of biogeochemical data (in situ and remotely-sensed). These model improvements are essential for better understanding and reducing estimates of uncertainties in current and future carbon transformations and cycling in continental shelf systems. Our approach and science questions are particularly germane to the carbon cycle science goals of the NASA Earth Science Research Program as well as the U.S. Climate Change Research Program and the North American Carbon Program. Our interdisciplinary research team consists of scientists who have expertise in the physics and biogeochemistry of the U.S. eastern continental shelf, remote-sensing data analysis and data assimilative numerical models.
NASA Technical Reports Server (NTRS)
Ichoku, Charles; Ellison, Luke
2011-01-01
Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate their implications for biomass burning emissions assessment.
Use of Bayesian event trees in semi-quantitative volcano eruption forecasting and hazard analysis
NASA Astrophysics Data System (ADS)
Wright, Heather; Pallister, John; Newhall, Chris
2015-04-01
Use of Bayesian event trees to forecast eruptive activity during volcano crises is an increasingly common practice for the USGS-USAID Volcano Disaster Assistance Program (VDAP) in collaboration with foreign counterparts. This semi-quantitative approach combines conceptual models of volcanic processes with current monitoring data and patterns of occurrence to reach consensus probabilities. This approach allows a response team to draw upon global datasets, local observations, and expert judgment, where the relative influence of these data depends upon the availability and quality of monitoring data and the degree to which the volcanic history is known. The construction of such event trees additionally relies upon existence and use of relevant global databases and documented past periods of unrest. Because relevant global databases may be underpopulated or nonexistent, uncertainty in probability estimations may be large. Our 'hybrid' approach of combining local and global monitoring data and expert judgment facilitates discussion and constructive debate between disciplines: including seismology, gas geochemistry, geodesy, petrology, physical volcanology and technology/engineering, where difference in opinion between response team members contributes to definition of the uncertainty in the probability estimations. In collaboration with foreign colleagues, we have created event trees for numerous areas experiencing volcanic unrest. Event trees are created for a specified time frame and are updated, revised, or replaced as the crisis proceeds. Creation of an initial tree is often prompted by a change in monitoring data, such that rapid assessment of probability is needed. These trees are intended as a vehicle for discussion and a way to document relevant data and models, where the target audience is the scientists themselves. However, the probabilities derived through the event-tree analysis can also be used to help inform communications with emergency managers and the public. VDAP trees evaluate probabilities of: magmatic intrusion, likelihood of eruption, magnitude of eruption, and types of associated hazardous events and their extents. In a few cases, trees have been extended to also assess and communicate vulnerability and relative risk.
Gwinn, Maureen R; Craig, Jeneva; Axelrad, Daniel A; Cook, Rich; Dockins, Chris; Fann, Neal; Fegley, Robert; Guinnup, David E; Helfand, Gloria; Hubbell, Bryan; Mazur, Sarah L; Palma, Ted; Smith, Roy L; Vandenberg, John; Sonawane, Babasaheb
2011-01-01
Quantifying the benefits of reducing hazardous air pollutants (HAPs, or air toxics) has been limited by gaps in toxicological data, uncertainties in extrapolating results from high-dose animal experiments to estimate human effects at lower doses, limited ambient and personal exposure monitoring data, and insufficient economic research to support valuation of the health impacts often associated with exposure to individual air toxics. To address some of these issues, the U.S. Environmental Protection Agency held the Workshop on Estimating the Benefits of Reducing Hazardous Air Pollutants (HAPs) in Washington, DC, from 30 April to 1 May 2009. Experts from multiple disciplines discussed how best to move forward on air toxics benefits assessment, with a focus on developing near-term capability to conduct quantitative benefits assessment. Proposed methodologies involved analysis of data-rich pollutants and application of this analysis to other pollutants, using dose-response modeling of animal data for estimating benefits to humans, determining dose-equivalence relationships for different chemicals with similar health effects, and analysis similar to that used for criteria pollutants. Limitations and uncertainties in economic valuation of benefits assessment for HAPS were discussed as well. These discussions highlighted the complexities in estimating the benefits of reducing air toxics, and participants agreed that alternative methods for benefits assessment of HAPs are needed. Recommendations included clearly defining the key priorities of the Clean Air Act air toxics program to identify the most effective approaches for HAPs benefits analysis, focusing on susceptible and vulnerable populations, and improving dose-response estimation for quantification of benefits.
A framework to quantify uncertainties of seafloor backscatter from swath mapping echosounders
NASA Astrophysics Data System (ADS)
Malik, Mashkoor; Lurton, Xavier; Mayer, Larry
2018-06-01
Multibeam echosounders (MBES) have become a widely used acoustic remote sensing tool to map and study the seafloor, providing co-located bathymetry and seafloor backscatter. Although the uncertainty associated with MBES-derived bathymetric data has been studied extensively, the question of backscatter uncertainty has been addressed only minimally and hinders the quantitative use of MBES seafloor backscatter. This paper explores approaches to identifying uncertainty sources associated with MBES-derived backscatter measurements. The major sources of uncertainty are catalogued and the magnitudes of their relative contributions to the backscatter uncertainty budget are evaluated. These major uncertainty sources include seafloor insonified area (1-3 dB), absorption coefficient (up to > 6 dB), random fluctuations in echo level (5.5 dB for a Rayleigh distribution), and sonar calibration (device dependent). The magnitudes of these uncertainty sources vary based on how these effects are compensated for during data acquisition and processing. Various cases (no compensation, partial compensation and full compensation) for seafloor insonified area, transmission losses and random fluctuations were modeled to estimate their uncertainties in different scenarios. Uncertainty related to the seafloor insonified area can be reduced significantly by accounting for seafloor slope during backscatter processing while transmission losses can be constrained by collecting full water column absorption coefficient profiles (temperature and salinity profiles). To reduce random fluctuations to below 1 dB, at least 20 samples are recommended to be used while computing mean values. The estimation of uncertainty in backscatter measurements is constrained by the fact that not all instrumental components are characterized and documented sufficiently for commercially available MBES. Further involvement from manufacturers in providing this essential information is critically required.
Systematic uncertainties in long-baseline neutrino-oscillation experiments
NASA Astrophysics Data System (ADS)
Ankowski, Artur M.; Mariani, Camillo
2017-05-01
Future neutrino-oscillation experiments are expected to bring definite answers to the questions of neutrino-mass hierarchy and violation of charge-parity symmetry in the lepton-sector. To realize this ambitious program it is necessary to ensure a significant reduction of uncertainties, particularly those related to neutrino-energy reconstruction. In this paper, we discuss different sources of systematic uncertainties, paying special attention to those arising from nuclear effects and detector response. By analyzing nuclear effects we show the importance of developing accurate theoretical models, capable of providing a quantitative description of neutrino cross sections, together with the relevance of their implementation in Monte Carlo generators and extensive testing against lepton-scattering data. We also point out the fundamental role of efforts aiming to determine detector responses in test-beam exposures.
Numerical modelling of instantaneous plate tectonics
NASA Technical Reports Server (NTRS)
Minster, J. B.; Haines, E.; Jordan, T. H.; Molnar, P.
1974-01-01
Assuming lithospheric plates to be rigid, 68 spreading rates, 62 fracture zones trends, and 106 earthquake slip vectors are systematically inverted to obtain a self-consistent model of instantaneous relative motions for eleven major plates. The inverse problem is linearized and solved iteratively by a maximum-likelihood procedure. Because the uncertainties in the data are small, Gaussian statistics are shown to be adequate. The use of a linear theory permits (1) the calculation of the uncertainties in the various angular velocity vectors caused by uncertainties in the data, and (2) quantitative examination of the distribution of information within the data set. The existence of a self-consistent model satisfying all the data is strong justification of the rigid plate assumption. Slow movement between North and South America is shown to be resolvable.
Simon, Aaron B.; Dubowitz, David J.; Blockley, Nicholas P.; Buxton, Richard B.
2016-01-01
Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2′ as a calibration technique. Further, in order to examine the effects of cerebral spinal fluid (CSF) signal contamination on the measurement of apparent R2′, we examined the effects of measuring this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2′-based estimate of the metabolic response to CO2 of 1.4%, and R2′- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2′-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2. PMID:26790354
Simon, Aaron B; Dubowitz, David J; Blockley, Nicholas P; Buxton, Richard B
2016-04-01
Calibrated blood oxygenation level dependent (BOLD) imaging is a multimodal functional MRI technique designed to estimate changes in cerebral oxygen metabolism from measured changes in cerebral blood flow and the BOLD signal. This technique addresses fundamental ambiguities associated with quantitative BOLD signal analysis; however, its dependence on biophysical modeling creates uncertainty in the resulting oxygen metabolism estimates. In this work, we developed a Bayesian approach to estimating the oxygen metabolism response to a neural stimulus and used it to examine the uncertainty that arises in calibrated BOLD estimation due to the presence of unmeasured model parameters. We applied our approach to estimate the CMRO2 response to a visual task using the traditional hypercapnia calibration experiment as well as to estimate the metabolic response to both a visual task and hypercapnia using the measurement of baseline apparent R2' as a calibration technique. Further, in order to examine the effects of cerebral spinal fluid (CSF) signal contamination on the measurement of apparent R2', we examined the effects of measuring this parameter with and without CSF-nulling. We found that the two calibration techniques provided consistent estimates of the metabolic response on average, with a median R2'-based estimate of the metabolic response to CO2 of 1.4%, and R2'- and hypercapnia-calibrated estimates of the visual response of 27% and 24%, respectively. However, these estimates were sensitive to different sources of estimation uncertainty. The R2'-calibrated estimate was highly sensitive to CSF contamination and to uncertainty in unmeasured model parameters describing flow-volume coupling, capillary bed characteristics, and the iso-susceptibility saturation of blood. The hypercapnia-calibrated estimate was relatively insensitive to these parameters but highly sensitive to the assumed metabolic response to CO2. Copyright © 2016 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhang, Bowen; Tian, Hanqin; Lu, Chaoqun; Chen, Guangsheng; Pan, Shufen; Anderson, Christopher; Poulter, Benjamin
2017-09-01
A wide range of estimates on global wetland methane (CH4) fluxes has been reported during the recent two decades. This gives rise to urgent needs to clarify and identify the uncertainty sources, and conclude a reconciled estimate for global CH4 fluxes from wetlands. Most estimates by using bottom-up approach rely on wetland data sets, but these data sets show largely inconsistent in terms of both wetland extent and spatiotemporal distribution. A quantitative assessment of uncertainties associated with these discrepancies among wetland data sets has not been well investigated yet. By comparing the five widely used global wetland data sets (GISS, GLWD, Kaplan, GIEMS and SWAMPS-GLWD), it this study, we found large differences in the wetland extent, ranging from 5.3 to 10.2 million km2, as well as their spatial and temporal distributions among the five data sets. These discrepancies in wetland data sets resulted in large bias in model-estimated global wetland CH4 emissions as simulated by using the Dynamic Land Ecosystem Model (DLEM). The model simulations indicated that the mean global wetland CH4 emissions during 2000-2007 were 177.2 ± 49.7 Tg CH4 yr-1, based on the five different data sets. The tropical regions contributed the largest portion of estimated CH4 emissions from global wetlands, but also had the largest discrepancy. Among six continents, the largest uncertainty was found in South America. Thus, the improved estimates of wetland extent and CH4 emissions in the tropical regions and South America would be a critical step toward an accurate estimate of global CH4 emissions. This uncertainty analysis also reveals an important need for our scientific community to generate a global scale wetland data set with higher spatial resolution and shorter time interval, by integrating multiple sources of field and satellite data with modeling approaches, for cross-scale extrapolation.
NASA Astrophysics Data System (ADS)
Arnold, Jeffrey; Clark, Martyn; Gutmann, Ethan; Wood, Andy; Nijssen, Bart; Rasmussen, Roy
2016-04-01
The United States Army Corps of Engineers (USACE) has had primary responsibility for multi-purpose water resource operations on most of the major river systems in the U.S. for more than 200 years. In that time, the USACE projects and programs making up those operations have proved mostly robust against the range of natural climate variability encountered over their operating life spans. However, in some watersheds and for some variables, climate change now is known to be shifting the hydroclimatic baseline around which that natural variability occurs and changing the range of that variability as well. This makes historical stationarity an inappropriate basis for assessing continued project operations under climate-changed futures. That means new hydroclimatic projections are required at multiple scales to inform decisions about specific threats and impacts, and for possible adaptation responses to limit water-resource vulnerabilities and enhance operational resilience. However, projections of possible future hydroclimatologies have myriad complex uncertainties that require explicit guidance for interpreting and using them to inform those decisions about climate vulnerabilities and resilience. Moreover, many of these uncertainties overlap and interact. Recent work, for example, has shown the importance of assessing the uncertainties from multiple sources including: global model structure [Meehl et al., 2005; Knutti and Sedlacek, 2013]; internal climate variability [Deser et al., 2012; Kay et al., 2014]; climate downscaling methods [Gutmann et al., 2012; Mearns et al., 2013]; and hydrologic models [Addor et al., 2014; Vano et al., 2014; Mendoza et al., 2015]. Revealing, reducing, and representing these uncertainties is essential for defining the plausible quantitative climate change narratives required to inform water-resource decision-making. And to be useful, such quantitative narratives, or storylines, of climate change threats and hydrologic impacts must sample from the full range of uncertainties associated with all parts of the simulation chain, from global climate models with simulations of natural climate variability, through regional climate downscaling, and on to modeling of affected hydrologic processes and downstream water resources impacts. This talk will present part of the work underway now both to reveal and reduce some important uncertainties and to develop explicit guidance for future generation of quantitative hydroclimatic storylines. Topics will include: 1- model structural and parameter-set limitations of some methods widely used to quantify climate impacts to hydrologic processes [Gutmann et al., 2014; Newman et al., 2015]; 2- development and evaluation of new, spatially consistent, U.S. national-scale climate downscaling and hydrologic simulation capabilities directly relevant at the multiple scales of water-resource decision-making [Newman et al., 2015; Mizukami et al., 2015; Gutmann et al., 2016]; and 3- development and evaluation of advanced streamflow forecasting methods to reduce and represent integrated uncertainties in a tractable way [Wood et al., 2014; Wood et al., 2015]. A key focus will be areas where climatologic and hydrologic science is currently under-developed to inform decisions - or is perhaps wrongly scaled or misapplied in practice - indicating the need for additional fundamental science and interpretation.
Rahman, A.; Tsai, F.T.-C.; White, C.D.; Willson, C.S.
2008-01-01
This study investigates capture zone uncertainty that relates to the coupled semivariogram uncertainty of hydrogeological and geophysical data. Semivariogram uncertainty is represented by the uncertainty in structural parameters (range, sill, and nugget). We used the beta distribution function to derive the prior distributions of structural parameters. The probability distributions of structural parameters were further updated through the Bayesian approach with the Gaussian likelihood functions. Cokriging of noncollocated pumping test data and electrical resistivity data was conducted to better estimate hydraulic conductivity through autosemivariograms and pseudo-cross-semivariogram. Sensitivities of capture zone variability with respect to the spatial variability of hydraulic conductivity, porosity and aquifer thickness were analyzed using ANOVA. The proposed methodology was applied to the analysis of capture zone uncertainty at the Chicot aquifer in Southwestern Louisiana, where a regional groundwater flow model was developed. MODFLOW-MODPATH was adopted to delineate the capture zone. The ANOVA results showed that both capture zone area and compactness were sensitive to hydraulic conductivity variation. We concluded that the capture zone uncertainty due to the semivariogram uncertainty is much higher than that due to the kriging uncertainty for given semivariograms. In other words, the sole use of conditional variances of kriging may greatly underestimate the flow response uncertainty. Semivariogram uncertainty should also be taken into account in the uncertainty analysis. ?? 2008 ASCE.
Alderman, Phillip D.; Stanfill, Bryan
2016-10-06
Recent international efforts have brought renewed emphasis on the comparison of different agricultural systems models. Thus far, analysis of model-ensemble simulated results has not clearly differentiated between ensemble prediction uncertainties due to model structural differences per se and those due to parameter value uncertainties. Additionally, despite increasing use of Bayesian parameter estimation approaches with field-scale crop models, inadequate attention has been given to the full posterior distributions for estimated parameters. The objectives of this study were to quantify the impact of parameter value uncertainty on prediction uncertainty for modeling spring wheat phenology using Bayesian analysis and to assess the relativemore » contributions of model-structure-driven and parameter-value-driven uncertainty to overall prediction uncertainty. This study used a random walk Metropolis algorithm to estimate parameters for 30 spring wheat genotypes using nine phenology models based on multi-location trial data for days to heading and days to maturity. Across all cases, parameter-driven uncertainty accounted for between 19 and 52% of predictive uncertainty, while model-structure-driven uncertainty accounted for between 12 and 64%. Here, this study demonstrated the importance of quantifying both model-structure- and parameter-value-driven uncertainty when assessing overall prediction uncertainty in modeling spring wheat phenology. More generally, Bayesian parameter estimation provided a useful framework for quantifying and analyzing sources of prediction uncertainty.« less
Probabilistic flood damage modelling at the meso-scale
NASA Astrophysics Data System (ADS)
Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno
2014-05-01
Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official damage data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of damage estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
The Fourth SeaWiFS HPLC Analysis Round-Robin Experiment (SeaHARRE-4)
NASA Technical Reports Server (NTRS)
Hooker, Stanford B.; Thomas, Crystal S.; van Heukelem, Laurie; Schlueter, louise; Russ, Mary E.; Ras, Josephine; Claustre, Herve; Clementson, Lesley; Canuti, Elisabetta; Berthon, Jean-Francois;
2010-01-01
Ten international laboratories specializing in the determination of marine pigment concentrations using high performance liquid chromatography (HPLC) were intercompared using in situ samples and a mixed pigment sample. Although prior Sea-viewing Wide Field-of-view Sensor (SeaWiFS) High Performance Liquid Chromatography (HPLC) Round-Robin Experiment (SeaHARRE) activities conducted in open-ocean waters covered a wide dynamic range in productivity, and some of the samples were collected in the coastal zone, none of the activities involved exclusively coastal samples. Consequently, SeaHARRE-4 was organized and executed as a strictly coastal activity and the field samples were collected from primarily eutrophic waters within the coastal zone of Denmark. The more restrictive perspective limited the dynamic range in chlorophyll concentration to approximately one and a half orders of magnitude (previous activities covered more than two orders of magnitude). The method intercomparisons were used for the following objectives: a) estimate the uncertainties in quantitating individual pigments and higher-order variables formed from sums and ratios; b) confirm if the chlorophyll a accuracy requirements for ocean color validation activities (approximately 25%, although 15% would allow for algorithm refinement) can be met in coastal waters; c) establish the reduction in uncertainties as a result of applying QA procedures; d) show the importance of establishing a properly defined referencing system in the computation of uncertainties; e) quantify the analytical benefits of performance metrics, and f) demonstrate the utility of a laboratory mix in understanding method performance. In addition, the remote sensing requirements for the in situ determination of total chlorophyll a were investigated to determine whether or not the average uncertainty for this measurement is being satisfied.
Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...
2014-01-01
Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin nearmore » Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT) and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA) and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for Fourier Amplitude Sensitivity Test (FAST) to identity parameter main effect. McKay method needs about 360 samples to evaluate the main effect, more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for McKay method. For the Sobol' method, the minimum samples needed are 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.« less
NASA Astrophysics Data System (ADS)
Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie
2017-09-01
Automotive brake systems are always subjected to various types of uncertainties and two types of random-fuzzy uncertainties may exist in the brakes. In this paper, a unified approach is proposed for squeal instability analysis of disc brakes with two types of random-fuzzy uncertainties. In the proposed approach, two uncertainty analysis models with mixed variables are introduced to model the random-fuzzy uncertainties. The first one is the random and fuzzy model, in which random variables and fuzzy variables exist simultaneously and independently. The second one is the fuzzy random model, in which uncertain parameters are all treated as random variables while their distribution parameters are expressed as fuzzy numbers. Firstly, the fuzziness is discretized by using α-cut technique and the two uncertainty analysis models are simplified into random-interval models. Afterwards, by temporarily neglecting interval uncertainties, the random-interval models are degraded into random models, in which the expectations, variances, reliability indexes and reliability probabilities of system stability functions are calculated. And then, by reconsidering the interval uncertainties, the bounds of the expectations, variances, reliability indexes and reliability probabilities are computed based on Taylor series expansion. Finally, by recomposing the analysis results at each α-cut level, the fuzzy reliability indexes and probabilities can be obtained, by which the brake squeal instability can be evaluated. The proposed approach gives a general framework to deal with both types of random-fuzzy uncertainties that may exist in the brakes and its effectiveness is demonstrated by numerical examples. It will be a valuable supplement to the systematic study of brake squeal considering uncertainty.
Doppler Global Velocimeter Development for the Large Wind Tunnels at Ames Research Center
NASA Technical Reports Server (NTRS)
Reinath, Michael S.
1997-01-01
Development of an optical, laser-based flow-field measurement technique for large wind tunnels is described. The technique uses laser sheet illumination and charged coupled device detectors to rapidly measure flow-field velocity distributions over large planar regions of the flow. Sample measurements are presented that illustrate the capability of the technique. An analysis of measurement uncertainty, which focuses on the random component of uncertainty, shows that precision uncertainty is not dependent on the measured velocity magnitude. For a single-image measurement, the analysis predicts a precision uncertainty of +/-5 m/s. When multiple images are averaged, this uncertainty is shown to decrease. For an average of 100 images, for example, the analysis shows that a precision uncertainty of +/-0.5 m/s can be expected. Sample applications show that vectors aligned with an orthogonal coordinate system are difficult to measure directly. An algebraic transformation is presented which converts measured vectors to the desired orthogonal components. Uncertainty propagation is then used to show how the uncertainty propagates from the direct measurements to the orthogonal components. For a typical forward-scatter viewing geometry, the propagation analysis predicts precision uncertainties of +/-4, +/-7, and +/-6 m/s, respectively, for the U, V, and W components at 68% confidence.
Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty
Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.
2016-09-12
Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as amore » function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.« less
Variance-Based Sensitivity Analysis to Support Simulation-Based Design Under Uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Opgenoord, Max M. J.; Allaire, Douglas L.; Willcox, Karen E.
Sensitivity analysis plays a critical role in quantifying uncertainty in the design of engineering systems. A variance-based global sensitivity analysis is often used to rank the importance of input factors, based on their contribution to the variance of the output quantity of interest. However, this analysis assumes that all input variability can be reduced to zero, which is typically not the case in a design setting. Distributional sensitivity analysis (DSA) instead treats the uncertainty reduction in the inputs as a random variable, and defines a variance-based sensitivity index function that characterizes the relative contribution to the output variance as amore » function of the amount of uncertainty reduction. This paper develops a computationally efficient implementation for the DSA formulation and extends it to include distributions commonly used in engineering design under uncertainty. Application of the DSA method to the conceptual design of a commercial jetliner demonstrates how the sensitivity analysis provides valuable information to designers and decision-makers on where and how to target uncertainty reduction efforts.« less
Factors controlling stream water nitrate and phosphor loads during precipitation events
NASA Astrophysics Data System (ADS)
Rozemeijer, J.; van der Velde, Y.; van Geer, F.; de Rooij, G. H.; Broers, H.; Bierkens, M. F.
2009-12-01
Pollution of surface waters in densely populated areas with intensive land use is a serious threat to their ecological, industrial and recreational utilization. European and national manure policies and several regional and local pilot projects aim at reducing pollution loads to surface waters. For the evaluation of measures, water authorities and environmental research institutes are putting a lot of effort into monitoring surface water quality. Within regional surface water quality monitoring networks, the measurement locations are usually situated in the downstream part of the catchment to represent a larger area. The monitoring frequency is usually low (e.g. monthly), due to the high costs for sampling and analysis. As a consequence, human induced trends in nutrient loads and concentrations in these monitoring data are often concealed by the large variability of surface water quality caused by meteorological variations. Because this natural variability in surface water quality is poorly understood, large uncertainties occur in the estimates of (trends in) nutrient loads or average concentrations. This study aims at uncertainty reduction in the estimates of mean concentrations and loads of N and P from regional monitoring data. For this purpose, we related continuous records of stream water N and P concentrations to easier and cheaper to collect quantitative data on precipitation, discharge, groundwater level and tube drain discharge. A specially designed multi scale experimental setup was installed in an agricultural lowland catchment in The Netherlands. At the catchment outlet, continuous measurements of water quality and discharge were performed from July 2007-January 2009. At an experimental field within the catchment we collected continuous measurements of precipitation, groundwater levels and tube drain discharges. 20 significant rainfall events with a variety of antecedent conditions, durations and intensities were selected for analysis. Singular and multiple regression analysis were used to identify relations between the N and P response to the rainfall events and the quantitative event characteristics. We successfully used these relations to predict the N and P responses to events and to improve the interpolation between low frequency grab sample measurements. Incorporating the predicted concentration changes during high discharge events dramatically improved the precision of our load estimations.
Hamilton, Joshua J.; Dwivedi, Vivek; Reed, Jennifer L.
2013-01-01
Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physiochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations. PMID:23870272
Pelekis, Michael; Nicolich, Mark J; Gauthier, Joseph S
2003-12-01
Human health risk assessments use point values to develop risk estimates and thus impart a deterministic character to risk, which, by definition, is a probability phenomenon. The risk estimates are calculated based on individuals and then, using uncertainty factors (UFs), are extrapolated to the population that is characterized by variability. Regulatory agencies have recommended the quantification of the impact of variability in risk assessments through the application of probabilistic methods. In the present study, a framework that deals with the quantitative analysis of uncertainty (U) and variability (V) in target tissue dose in the population was developed by applying probabilistic analysis to physiologically-based toxicokinetic models. The mechanistic parameters that determine kinetics were described with probability density functions (PDFs). Since each PDF depicts the frequency of occurrence of all expected values of each parameter in the population, the combined effects of multiple sources of U/V were accounted for in the estimated distribution of tissue dose in the population, and a unified (adult and child) intraspecies toxicokinetic uncertainty factor UFH-TK was determined. The results show that the proposed framework accounts effectively for U/V in population toxicokinetics. The ratio of the 95th percentile to the 50th percentile of the annual average concentration of the chemical at the target tissue organ (i.e., the UFH-TK) varies with age. The ratio is equivalent to a unified intraspecies toxicokinetic UF, and it is one of the UFs by which the NOAEL can be divided to obtain the RfC/RfD. The 10-fold intraspecies UF is intended to account for uncertainty and variability in toxicokinetics (3.2x) and toxicodynamics (3.2x). This article deals exclusively with toxicokinetic component of UF. The framework provides an alternative to the default methodology and is advantageous in that the evaluation of toxicokinetic variability is based on the distribution of the effective target tissue dose, rather than applied dose. It allows for the replacement of the default adult and children intraspecies UF with toxicokinetic data-derived values and provides accurate chemical-specific estimates for their magnitude. It shows that proper application of probability and toxicokinetic theories can reduce uncertainties when establishing exposure limits for specific compounds and provide better assurance that established limits are adequately protective. It contributes to the development of a probabilistic noncancer risk assessment framework and will ultimately lead to the unification of cancer and noncancer risk assessment methodologies.
A Two-Step Approach to Uncertainty Quantification of Core Simulators
Yankov, Artem; Collins, Benjamin; Klein, Markus; ...
2012-01-01
For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor andmore » in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.« less
NASA Astrophysics Data System (ADS)
Zhu, Q.; Xu, Y. P.; Gu, H.
2014-12-01
Traditionally, regional frequency analysis methods were developed for stationary environmental conditions. Nevertheless, recent studies have identified significant changes in hydrological records, leading to the 'death' of stationarity. Besides, uncertainty in hydrological frequency analysis is persistent. This study aims to investigate the impact of one of the most important uncertainty sources, parameter uncertainty, together with nonstationarity, on design rainfall depth in Qu River Basin, East China. A spatial bootstrap is first proposed to analyze the uncertainty of design rainfall depth estimated by regional frequency analysis based on L-moments and estimated on at-site scale. Meanwhile, a method combining the generalized additive models with 30-year moving window is employed to analyze non-stationarity existed in the extreme rainfall regime. The results show that the uncertainties of design rainfall depth with 100-year return period under stationary conditions estimated by regional spatial bootstrap can reach 15.07% and 12.22% with GEV and PE3 respectively. On at-site scale, the uncertainties can reach 17.18% and 15.44% with GEV and PE3 respectively. In non-stationary conditions, the uncertainties of maximum rainfall depth (corresponding to design rainfall depth) with 0.01 annual exceedance probability (corresponding to 100-year return period) are 23.09% and 13.83% with GEV and PE3 respectively. Comparing the 90% confidence interval, the uncertainty of design rainfall depth resulted from parameter uncertainty is less than that from non-stationarity frequency analysis with GEV, however, slightly larger with PE3. This study indicates that the spatial bootstrap can be successfully applied to analyze the uncertainty of design rainfall depth on both regional and at-site scales. And the non-stationary analysis shows that the differences between non-stationary quantiles and their stationary equivalents are important for decision makes of water resources management and risk management.
NERL'S QUEST TO REDUCE UNCERTAINTIES IN CHILDREN'S HEALTH AND RISK ASSESSMENTS
Data on children's exposure to chemical contaminants and their activities are very limited. Quantitative assessments rely heavily on major default assumptions as substitutes for missing information (Cohen Hubal et al. 2000a, b). The goal of the EPA/NERL measurement research p...
NASA Technical Reports Server (NTRS)
Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh
1998-01-01
An integrated probabilistic approach has been developed to assess composites for high temperature applications. This approach was used to determine thermal and mechanical properties and their probabilistic distributions of a 5-harness 0/90 Sylramic fiber/CVI-SiC/Mi-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation for its potential application for HSCT combustor liner. This approach quantified the influences of uncertainties inherent in constituent properties called primitive variables on selected key response variables of the CMC at 2200 F. The quantitative information is presented in the form of Cumulative Density Functions (CDFs). Probability Density Functions (PDFS) and primitive variable sensitivities on response. Results indicate that the scatters in response variables were reduced by 30-50% when the uncertainties in the primitive variables, which showed the most influence, were reduced by 50%.
Geuna, S
2000-11-20
Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.
Retzbach, Joachim; Otto, Lukas; Maier, Michaela
2016-08-01
Many scholars have argued for the need to communicate openly not only scientific successes to the public but also limitations, such as the tentativeness of research findings, in order to enhance public trust and engagement. Yet, it has not been quantitatively assessed how the perception of scientific uncertainties relates to engagement with science on an individual level. In this article, we report the development and testing of a new questionnaire in English and German measuring the perceived uncertainty of scientific evidence. Results indicate that the scale is reliable and valid in both language versions and that its two subscales are differentially related to measures of engagement: Science-friendly attitudes were positively related only to 'subjectively' perceived uncertainty, whereas interest in science as well as behavioural engagement actions and intentions were largely uncorrelated. We conclude that perceiving scientific knowledge to be uncertain is only weakly, but positively related to engagement with science. © The Author(s) 2015.
Low Reynolds number wind tunnel measurements - The importance of being earnest
NASA Technical Reports Server (NTRS)
Mueller, Thomas J.; Batill, Stephen M.; Brendel, Michael; Perry, Mark L.; Bloch, Diane R.
1986-01-01
A method for obtaining two-dimensional aerodynamic force coefficients at low Reynolds numbers using a three-component external platform balance is presented. Regardless of method, however, the importance of understanding the possible influence of the test facility and instrumentation on the final results cannot be overstated. There is an uncertainty in the ability of the facility to simulate a two-dimensional flow environment due to the confinement effect of the wind tunnel and the method used to mount the airfoil. Additionally, the ability of the instrumentation to accurately measure forces and pressures has an associated uncertainty. This paper focuses on efforts taken to understand the errors introduced by the techniques and apparatus used at the University of Notre Dame, and, the importance of making an earnest estimate of the uncertainty. Although quantitative estimates of facility induced errors are difficult to obtain, the uncertainty in measured results can be handled in a straightforward manner and provide the experimentalist, and others, with a basis to evaluate experimental results.
CASMO5/TSUNAMI-3D spent nuclear fuel reactivity uncertainty analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferrer, R.; Rhodes, J.; Smith, K.
2012-07-01
The CASMO5 lattice physics code is used in conjunction with the TSUNAMI-3D sequence in ORNL's SCALE 6 code system to estimate the uncertainties in hot-to-cold reactivity changes due to cross-section uncertainty for PWR assemblies at various burnup points. The goal of the analysis is to establish the multiplication factor uncertainty similarity between various fuel assemblies at different conditions in a quantifiable manner and to obtain a bound on the hot-to-cold reactivity uncertainty over the various assembly types and burnup attributed to fundamental cross-section data uncertainty. (authors)
Health impact assessment – A survey on quantifying tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehr, Rainer, E-mail: rainer.fehr@uni-bielefeld.de; Mekel, Odile C.L., E-mail: odile.mekel@lzg.nrw.de; Fintan Hurley, J., E-mail: fintan.hurley@iom-world.org
Integrating human health into prospective impact assessments is known to be challenging. This is true for both approaches: dedicated health impact assessments (HIA) as well as inclusion of health into more general impact assessments. Acknowledging the full range of participatory, qualitative, and quantitative approaches, this study focuses on the latter, especially on computational tools for quantitative health modelling. We conducted a survey among tool developers concerning the status quo of development and availability of such tools; experiences made with model usage in real-life situations; and priorities for further development. Responding toolmaker groups described 17 such tools, most of them beingmore » maintained and reported as ready for use and covering a wide range of topics, including risk & protective factors, exposures, policies, and health outcomes. In recent years, existing models have been improved and were applied in new ways, and completely new models emerged. There was high agreement among respondents on the need to further develop methods for assessment of inequalities and uncertainty. The contribution of quantitative modeling to health foresight would benefit from building joint strategies of further tool development, improving the visibility of quantitative tools and methods, and engaging continuously with actual and potential users. - Highlights: • A survey investigated computational tools for health impact quantification. • Formal evaluation of such tools has been rare. • Handling inequalities and uncertainties are priority areas for further development. • Health foresight would benefit from tool developers and users forming a community. • Joint development strategies across computational tools are needed.« less
Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.
Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K
2000-01-01
Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
Feizizadeh, Bakhtiar; Jankowski, Piotr; Blaschke, Thomas
2014-01-01
GIS multicriteria decision analysis (MCDA) techniques are increasingly used in landslide susceptibility mapping for the prediction of future hazards, land use planning, as well as for hazard preparedness. However, the uncertainties associated with MCDA techniques are inevitable and model outcomes are open to multiple types of uncertainty. In this paper, we present a systematic approach to uncertainty and sensitivity analysis. We access the uncertainty of landslide susceptibility maps produced with GIS-MCDA techniques. A new spatially-explicit approach and Dempster–Shafer Theory (DST) are employed to assess the uncertainties associated with two MCDA techniques, namely Analytical Hierarchical Process (AHP) and Ordered Weighted Averaging (OWA) implemented in GIS. The methodology is composed of three different phases. First, weights are computed to express the relative importance of factors (criteria) for landslide susceptibility. Next, the uncertainty and sensitivity of landslide susceptibility is analyzed as a function of weights using Monte Carlo Simulation and Global Sensitivity Analysis. Finally, the results are validated using a landslide inventory database and by applying DST. The comparisons of the obtained landslide susceptibility maps of both MCDA techniques with known landslides show that the AHP outperforms OWA. However, the OWA-generated landslide susceptibility map shows lower uncertainty than the AHP-generated map. The results demonstrate that further improvement in the accuracy of GIS-based MCDA can be achieved by employing an integrated uncertainty–sensitivity analysis approach, in which the uncertainty of landslide susceptibility model is decomposed and attributed to model's criteria weights. PMID:25843987
Tug-of-war lacunarity—A novel approach for estimating lacunarity
NASA Astrophysics Data System (ADS)
Reiss, Martin A.; Lemmerer, Birgit; Hanslmeier, Arnold; Ahammer, Helmut
2016-11-01
Modern instrumentation provides us with massive repositories of digital images that will likely only increase in the future. Therefore, it has become increasingly important to automatize the analysis of digital images, e.g., with methods from pattern recognition. These methods aim to quantify the visual appearance of captured textures with quantitative measures. As such, lacunarity is a useful multi-scale measure of texture's heterogeneity but demands high computational efforts. Here we investigate a novel approach based on the tug-of-war algorithm, which estimates lacunarity in a single pass over the image. We computed lacunarity for theoretical and real world sample images, and found that the investigated approach is able to estimate lacunarity with low uncertainties. We conclude that the proposed method combines low computational efforts with high accuracy, and that its application may have utility in the analysis of high-resolution images.
Mwakanyamale, Kisa; Day-Lewis, Frederick D.; Slater, Lee D.
2013-01-01
Fiber-optic distributed temperature sensing (FO-DTS) increasingly is used to map zones of focused groundwater/surface-water exchange (GWSWE). Previous studies of GWSWE using FO-DTS involved identification of zones of focused GWSWE based on arbitrary cutoffs of FO-DTS time-series statistics (e.g., variance, cross-correlation between temperature and stage, or spectral power). New approaches are needed to extract more quantitative information from large, complex FO-DTS data sets while concurrently providing an assessment of uncertainty associated with mapping zones of focused GSWSE. Toward this end, we present a strategy combining discriminant analysis (DA) and spectral analysis (SA). We demonstrate the approach using field experimental data from a reach of the Columbia River adjacent to the Hanford 300 Area site. Results of the combined SA/DA approach are shown to be superior to previous results from qualitative interpretation of FO-DTS spectra alone.
NASA Astrophysics Data System (ADS)
Khan, Tanvir R.; Perlinger, Judith A.
2017-10-01
Despite considerable effort to develop mechanistic dry particle deposition parameterizations for atmospheric transport models, current knowledge has been inadequate to propose quantitative measures of the relative performance of available parameterizations. In this study, we evaluated the performance of five dry particle deposition parameterizations developed by Zhang et al. (2001) (Z01), Petroff and Zhang (2010) (PZ10), Kouznetsov and Sofiev (2012) (KS12), Zhang and He (2014) (ZH14), and Zhang and Shao (2014) (ZS14), respectively. The evaluation was performed in three dimensions: model ability to reproduce observed deposition velocities, Vd (accuracy); the influence of imprecision in input parameter values on the modeled Vd (uncertainty); and identification of the most influential parameter(s) (sensitivity). The accuracy of the modeled Vd was evaluated using observations obtained from five land use categories (LUCs): grass, coniferous and deciduous forests, natural water, and ice/snow. To ascertain the uncertainty in modeled Vd, and quantify the influence of imprecision in key model input parameters, a Monte Carlo uncertainty analysis was performed. The Sobol' sensitivity analysis was conducted with the objective to determine the parameter ranking from the most to the least influential. Comparing the normalized mean bias factors (indicators of accuracy), we find that the ZH14 parameterization is the most accurate for all LUCs except for coniferous forest, for which it is second most accurate. From Monte Carlo simulations, the estimated mean normalized uncertainties in the modeled Vd obtained for seven particle sizes (ranging from 0.005 to 2.5 µm) for the five LUCs are 17, 12, 13, 16, and 27 % for the Z01, PZ10, KS12, ZH14, and ZS14 parameterizations, respectively. From the Sobol' sensitivity results, we suggest that the parameter rankings vary by particle size and LUC for a given parameterization. Overall, for dp = 0.001 to 1.0 µm, friction velocity was one of the three most influential parameters in all parameterizations. For giant particles (dp = 10 µm), relative humidity was the most influential parameter. Because it is the least complex of the five parameterizations, and it has the greatest accuracy and least uncertainty, we propose that the ZH14 parameterization is currently superior for incorporation into atmospheric transport models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Emery, Keith
The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence ofmore » Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.« less
Robust Decision Making Approach to Managing Water Resource Risks (Invited)
NASA Astrophysics Data System (ADS)
Lempert, R.
2010-12-01
The IPCC and US National Academies of Science have recommended iterative risk management as the best approach for water management and many other types of climate-related decisions. Such an approach does not rely on a single set of judgments at any one time but rather actively updates and refines strategies as new information emerges. In addition, the approach emphasizes that a portfolio of different types of responses, rather than any single action, often provides the best means to manage uncertainty. Implementing an iterative risk management approach can however prove difficult in actual decision support applications. This talk will suggest that robust decision making (RDM) provides a particularly useful set of quantitative methods for implementing iterative risk management. This RDM approach is currently being used in a wide variety of water management applications. RDM employs three key concepts that differentiate it from most types of probabilistic risk analysis: 1) characterizing uncertainty with multiple views of the future (which can include sets of probability distributions) rather than a single probabilistic best-estimate, 2) employing a robustness rather than an optimality criterion to assess alternative policies, and 3) organizing the analysis with a vulnerability and response option framework, rather than a predict-then-act framework. This talk will summarize the RDM approach, describe its use in several different types of water management applications, and compare the results to those obtained with other methods.
Identification of gene regulation models from single-cell data
NASA Astrophysics Data System (ADS)
Weber, Lisa; Raymond, William; Munsky, Brian
2018-09-01
In quantitative analyses of biological processes, one may use many different scales of models (e.g. spatial or non-spatial, deterministic or stochastic, time-varying or at steady-state) or many different approaches to match models to experimental data (e.g. model fitting or parameter uncertainty/sloppiness quantification with different experiment designs). These different analyses can lead to surprisingly different results, even when applied to the same data and the same model. We use a simplified gene regulation model to illustrate many of these concerns, especially for ODE analyses of deterministic processes, chemical master equation and finite state projection analyses of heterogeneous processes, and stochastic simulations. For each analysis, we employ MATLAB and PYTHON software to consider a time-dependent input signal (e.g. a kinase nuclear translocation) and several model hypotheses, along with simulated single-cell data. We illustrate different approaches (e.g. deterministic and stochastic) to identify the mechanisms and parameters of the same model from the same simulated data. For each approach, we explore how uncertainty in parameter space varies with respect to the chosen analysis approach or specific experiment design. We conclude with a discussion of how our simulated results relate to the integration of experimental and computational investigations to explore signal-activated gene expression models in yeast (Neuert et al 2013 Science 339 584–7) and human cells (Senecal et al 2014 Cell Rep. 8 75–83)5.
NASA Astrophysics Data System (ADS)
Min, Xiangjun; Zhu, Yonghao
1998-08-01
Inflight experiment of Modular Airborne Imaging Spectrometer (MAIS) and ground-based measurements using GER MARK-V spectroradiometer simultaneously with the MAIS overpass were performed during Autumn 1995 at the semiarid area of Inner Mongolia, China. Based on these measurements and MAIS image data, we designed a method for the radiometric calibration of MAIS sensor using 6S and LOWTRAN 7 codes. The results show that the uncertainty of MAIS calibration is about 8% in the visible and near infrared wavelengths (0.4 - 1.2 micrometer). To verify our calibration algorithm, the calibrated results of MAIS sensor was used to derive the ground reflectances. The accuracy of reflectance retrieval is about 8.5% in the spectral range of 0.4 to 1.2 micrometer, i.e., the uncertainty of derived near-nadir reflectances is within 0.01 - 0.05 in reflectance unit at ground reflectance between 3% and 50%. The distinguishing feature of the ground-based measurements, which will be paid special attention in this paper, is that obtaining simultaneously the reflectance factors of the calibration target, atmospheric optical depth, and water vapor abundance from the same one set of measurement data by only one suit of instruments. The analysis indicates that the method presented here is suitable to the quantitative analysis of imaging spectral data in China.
Composition-explicit distillation curves of aviation fuel JP-8 and a coal-based jet fuel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beverly L. Smith; Thomas J. Bruno
2007-09-15
We have recently introduced several important improvements in the measurement of distillation curves for complex fluids. The modifications to the classical measurement provide for (1) a composition explicit data channel for each distillate fraction (for both qualitative and quantitative analysis); (2) temperature measurements that are true thermodynamic state points; (3) temperature, volume, and pressure measurements of low uncertainty suitable for an equation of state development; (4) consistency with a century of historical data; (5) an assessment of the energy content of each distillate fraction; (6) a trace chemical analysis of each distillate fraction; and (7) a corrosivity assessment of eachmore » distillate fraction. The most significant modification is achieved with a new sampling approach that allows precise qualitative as well as quantitative analyses of each fraction, on the fly. We have applied the new method to the measurement of rocket propellant, gasoline, and jet fuels. In this paper, we present the application of the technique to representative batches of the military aviation fuel JP-8, and also to a coal-derived fuel developed as a potential substitute. We present not only the distillation curves but also a chemical characterization of each fraction and discuss the contrasts between the two fluids. 26 refs., 5 figs., 6 tabs.« less
Ji, Qinqin; Salomon, Arthur R.
2015-01-01
The activation of T-lymphocytes through antigen-mediated T-cell receptor (TCR) clustering is vital in regulating the adaptive-immune response. Although T cell receptor signaling has been extensively studied, the fundamental mechanisms for signal initiation are not fully understood. Reduced temperature initiated some of the hallmarks of TCR signaling such as increased phosphorylation and activation on ERK and calcium release from the endoplasmic reticulum as well as coalesce T-cell membrane microdomains. The precise mechanism of TCR signaling initiation due to temperature change remains obscure. One critical question is whether signaling initiated by cold treatment of T cells differs from signaling initiated by crosslinking of the T cell receptor. To address this uncertainty, a wide-scale, quantitative mass spectrometry-based phosphoproteomic analysis was performed on T cells stimulated either by temperature shift or through crosslinking of the TCR. Careful statistical comparison between the two stimulations revealed a striking level of identity between the subset of 339 sites that changed significantly with both stimulations. This study demonstrates for the first time, at unprecedented detail, that T cell cold treatment was sufficient to initiate signaling patterns nearly identical to soluble antibody stimulation, shedding new light on the mechanism of activation of these critically important immune cells. PMID:25839225
Assessment of Radiative Heating Uncertainty for Hyperbolic Earth Entry
NASA Technical Reports Server (NTRS)
Johnston, Christopher O.; Mazaheri, Alireza; Gnoffo, Peter A.; Kleb, W. L.; Sutton, Kenneth; Prabhu, Dinesh K.; Brandis, Aaron M.; Bose, Deepak
2011-01-01
This paper investigates the shock-layer radiative heating uncertainty for hyperbolic Earth entry, with the main focus being a Mars return. In Part I of this work, a baseline simulation approach involving the LAURA Navier-Stokes code with coupled ablation and radiation is presented, with the HARA radiation code being used for the radiation predictions. Flight cases representative of peak-heating Mars or asteroid return are de ned and the strong influence of coupled ablation and radiation on their aerothermodynamic environments are shown. Structural uncertainties inherent in the baseline simulations are identified, with turbulence modeling, precursor absorption, grid convergence, and radiation transport uncertainties combining for a +34% and ..24% structural uncertainty on the radiative heating. A parametric uncertainty analysis, which assumes interval uncertainties, is presented. This analysis accounts for uncertainties in the radiation models as well as heat of formation uncertainties in the flow field model. Discussions and references are provided to support the uncertainty range chosen for each parameter. A parametric uncertainty of +47.3% and -28.3% is computed for the stagnation-point radiative heating for the 15 km/s Mars-return case. A breakdown of the largest individual uncertainty contributors is presented, which includes C3 Swings cross-section, photoionization edge shift, and Opacity Project atomic lines. Combining the structural and parametric uncertainty components results in a total uncertainty of +81.3% and ..52.3% for the Mars-return case. In Part II, the computational technique and uncertainty analysis presented in Part I are applied to 1960s era shock-tube and constricted-arc experimental cases. It is shown that experiments contain shock layer temperatures and radiative ux values relevant to the Mars-return cases of present interest. Comparisons between the predictions and measurements, accounting for the uncertainty in both, are made for a range of experiments. A measure of comparison quality is de ned, which consists of the percent overlap of the predicted uncertainty bar with the corresponding measurement uncertainty bar. For nearly all cases, this percent overlap is greater than zero, and for most of the higher temperature cases (T >13,000 K) it is greater than 50%. These favorable comparisons provide evidence that the baseline computational technique and uncertainty analysis presented in Part I are adequate for Mars-return simulations. In Part III, the computational technique and uncertainty analysis presented in Part I are applied to EAST shock-tube cases. These experimental cases contain wavelength dependent intensity measurements in a wavelength range that covers 60% of the radiative intensity for the 11 km/s, 5 m radius flight case studied in Part I. Comparisons between the predictions and EAST measurements are made for a range of experiments. The uncertainty analysis presented in Part I is applied to each prediction, and comparisons are made using the metrics defined in Part II. The agreement between predictions and measurements is excellent for velocities greater than 10.5 km/s. Both the wavelength dependent and wavelength integrated intensities agree within 30% for nearly all cases considered. This agreement provides confidence in the computational technique and uncertainty analysis presented in Part I, and provides further evidence that this approach is adequate for Mars-return simulations. Part IV of this paper reviews existing experimental data that include the influence of massive ablation on radiative heating. It is concluded that this existing data is not sufficient for the present uncertainty analysis. Experiments to capture the influence of massive ablation on radiation are suggested as future work, along with further studies of the radiative precursor and improvements in the radiation properties of ablation products.
The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.
Here, this work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and the software settings such as subset size and spacing.
How measurement science can improve confidence in research results.
Plant, Anne L; Becker, Chandler A; Hanisch, Robert J; Boisvert, Ronald F; Possolo, Antonio M; Elliott, John T
2018-04-01
The current push for rigor and reproducibility is driven by a desire for confidence in research results. Here, we suggest a framework for a systematic process, based on consensus principles of measurement science, to guide researchers and reviewers in assessing, documenting, and mitigating the sources of uncertainty in a study. All study results have associated ambiguities that are not always clarified by simply establishing reproducibility. By explicitly considering sources of uncertainty, noting aspects of the experimental system that are difficult to characterize quantitatively, and proposing alternative interpretations, the researcher provides information that enhances comparability and reproducibility.
The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation
Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.
2017-11-27
Here, this work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and the software settings such as subset size and spacing.
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated to deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches which can be highly complex. Important methodological advances in the area are not exempted from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
Robustness Analysis and Optimally Robust Control Design via Sum-of-Squares
NASA Technical Reports Server (NTRS)
Dorobantu, Andrei; Crespo, Luis G.; Seiler, Peter J.
2012-01-01
A control analysis and design framework is proposed for systems subject to parametric uncertainty. The underlying strategies are based on sum-of-squares (SOS) polynomial analysis and nonlinear optimization to design an optimally robust controller. The approach determines a maximum uncertainty range for which the closed-loop system satisfies a set of stability and performance requirements. These requirements, de ned as inequality constraints on several metrics, are restricted to polynomial functions of the uncertainty. To quantify robustness, SOS analysis is used to prove that the closed-loop system complies with the requirements for a given uncertainty range. The maximum uncertainty range, calculated by assessing a sequence of increasingly larger ranges, serves as a robustness metric for the closed-loop system. To optimize the control design, nonlinear optimization is used to enlarge the maximum uncertainty range by tuning the controller gains. Hence, the resulting controller is optimally robust to parametric uncertainty. This approach balances the robustness margins corresponding to each requirement in order to maximize the aggregate system robustness. The proposed framework is applied to a simple linear short-period aircraft model with uncertain aerodynamic coefficients.
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
Benchmark dose analysis via nonparametric regression modeling
Piegorsch, Walter W.; Xiong, Hui; Bhattacharya, Rabi N.; Lin, Lizhen
2013-01-01
Estimation of benchmark doses (BMDs) in quantitative risk assessment traditionally is based upon parametric dose-response modeling. It is a well-known concern, however, that if the chosen parametric model is uncertain and/or misspecified, inaccurate and possibly unsafe low-dose inferences can result. We describe a nonparametric approach for estimating BMDs with quantal-response data based on an isotonic regression method, and also study use of corresponding, nonparametric, bootstrap-based confidence limits for the BMD. We explore the confidence limits’ small-sample properties via a simulation study, and illustrate the calculations with an example from cancer risk assessment. It is seen that this nonparametric approach can provide a useful alternative for BMD estimation when faced with the problem of parametric model uncertainty. PMID:23683057
A mathematical model for CTL effect on a latently infected cell inclusive HIV dynamics and treatment
NASA Astrophysics Data System (ADS)
Tarfulea, N. E.
2017-10-01
This paper investigates theoretically and numerically the effect of immune effectors, such as the cytotoxic lymphocyte (CTL), in modeling HIV pathogenesis (via a newly developed mathematical model); our results suggest the significant impact of the immune response on the control of the virus during primary infection. Qualitative aspects (including positivity, boundedness, stability, uncertainty, and sensitivity analysis) are addressed. Additionally, by introducing drug therapy, we analyze numerically the model to assess the effect of treatment consisting of a combination of several antiretroviral drugs. Our results show that the inclusion of the CTL compartment produces a higher rebound for an individual's healthy helper T-cell compartment than drug therapy alone. Furthermore, we quantitatively characterize successful drugs or drug combination scenarios.