Sample records for uncertainty evaluation program

  1. Evaluation of Uncertainty in Constituent Input Parameters for Modeling the Fate of IMX 101 Components

    DTIC Science & Technology

    2017-05-01

    ERDC/EL TR-17-7, Environmental Security Technology Certification Program (ESTCP), May 2017. The Environmental Evaluation and Characterization System (TREECS™) was applied to a groundwater site and a surface water site to evaluate the sensitivity…

  2. Effect of a mind-body therapeutic program for infertile women repeating in vitro fertilization treatment on uncertainty, anxiety, and implantation rate.

    PubMed

    Kim, Miok; Kim, Sue; Chang, Soon-bok; Yoo, Ji-Soo; Kim, Hee Kyung; Cho, Jung Hyun

    2014-03-01

    The study aimed to develop a mind-body therapeutic program and evaluate its effects on the uncertainty, anxiety, and implantation rate of women undergoing a second in vitro fertilization (IVF) attempt. This study employed a nonequivalent control group nonsynchronized design. The conceptual framework and program content were developed from a preliminary survey of eight infertile women and an extensive review of the literature. The program focuses on three uncertainty-induced anxiety responses in infertile women: cognitive, emotional, and biological. To evaluate the effect of the intervention, infertile women with unexplained infertility preparing for a second IVF treatment were sampled by convenience (26 experimental and 24 control). The experimental group showed a greater decrease in uncertainty and anxiety from pre- to post-measurement than the control group did. However, no statistically significant difference in implantation rate between groups was observed. This study is meaningful as the first intervention program for alleviating uncertainty and anxiety during the IVF treatment process. The positive effects of the mind-body therapeutic program in alleviating both uncertainty and anxiety have direct meaning for clinical applications. Copyright © 2014. Published by Elsevier B.V.

  3. Evaluating a multispecies adaptive management framework: Must uncertainty impede effective decision-making?

    USGS Publications Warehouse

    Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.

    2013-01-01

    Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty to achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain, and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar whether the population-generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications. The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to evaluate adaptive management performance and the value of learning. Although natural resource decisions are characterized by uncertainty, not all uncertainty will cause decisions to be altered substantially, as we found in this case. It is important to incorporate uncertainty into the decision framing and to evaluate the effect of reducing that uncertainty on achieving the desired outcomes.
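
    The "model weight updating" step described above is Bayes' rule applied to the competing population models. A minimal sketch of that update follows; the number of models and the likelihood values are hypothetical, not from the paper.

      # Bayesian updating of model weights, as used in active adaptive
      # management (hedged sketch; likelihood numbers are hypothetical).
      import numpy as np

      weights = np.array([0.25, 0.25, 0.25, 0.25])  # prior weights on 4 models

      # Likelihood of this year's monitoring observation under each model
      # (hypothetical values for illustration):
      likelihood = np.array([0.10, 0.30, 0.05, 0.20])

      posterior = weights * likelihood
      posterior /= posterior.sum()   # renormalize to sum to 1
      print(posterior)               # models consistent with data gain weight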

  4. STochastic Analysis of Technical Systems (STATS): A model for evaluating combined effects of multiple uncertainties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kranz, L.; VanKuiken, J.C.; Gillette, J.L.

    1989-12-01

    The STATS model, now modified to run on microcomputers, uses user-defined component uncertainties to calculate composite uncertainty distributions for systems or technologies. The program can be used to investigate uncertainties for a single technology or to compare two technologies. Although the term "technology" is used throughout the program screens, the program can accommodate very broad problem definitions. For example, electrical demand uncertainties, health risks associated with toxic material exposures, or traffic queuing delay times can be estimated. The terminology adopted in this version of STATS reflects the purpose of the earlier version, which was to aid in comparing advanced electrical generating technologies. A comparison of two clean coal technologies in two power plants is given as a case study illustration. 7 refs., 35 figs., 7 tabs.
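
    The calculation described above, combining user-defined component uncertainty distributions into a composite distribution for a system, can be sketched generically with Monte Carlo sampling. This is an illustration of the idea only, not the STATS implementation; the components and distributions are hypothetical.

      # Monte Carlo combination of component uncertainties into a composite
      # distribution (generic sketch; not the actual STATS code).
      import numpy as np

      rng = np.random.default_rng(seed=1)
      n = 100_000

      # Hypothetical component uncertainties for a technology's cost of output:
      fuel_cost   = rng.normal(loc=22.0, scale=2.0, size=n)   # $/MWh
      capital     = rng.triangular(14.0, 16.0, 21.0, size=n)  # $/MWh
      maintenance = rng.uniform(4.0, 7.0, size=n)             # $/MWh

      total = fuel_cost + capital + maintenance  # composite distribution

      print(f"mean = {total.mean():.1f} $/MWh")
      print(f"90% interval = [{np.percentile(total, 5):.1f}, "
            f"{np.percentile(total, 95):.1f}] $/MWh")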

  5. CALiPER Exploratory Study: Accounting for Uncertainty in Lumen Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Rolf; Paget, Maria L.; Richman, Eric E.

    2011-03-31

    With a well-defined and shared understanding of uncertainty in lumen measurements, testing laboratories can better evaluate their processes, contributing to greater consistency and credibility of lighting testing, a key component of the U.S. Department of Energy (DOE) Commercially Available LED Product Evaluation and Reporting (CALiPER) program. Reliable lighting testing is a crucial underlying factor in the success of many energy-efficient lighting efforts, such as the DOE GATEWAY demonstrations, the Lighting Facts Label, ENERGY STAR® energy-efficient lighting programs, and many others. Uncertainty in measurements is inherent to all testing methodologies, including photometric and other lighting-related testing. Uncertainty exists for all equipment, processes, and systems of measurement, individually as well as in combination. A major issue with testing and the resulting accuracy of the tests is the uncertainty of the complete process. Individual equipment uncertainties are typically identified, but their relative value in practice and their combined value with other equipment and processes in the same test are elusive concepts, particularly for complex types of testing such as photometry. The total combined uncertainty of a measurement result is important for repeatable and comparative measurements of light-emitting diode (LED) products against other technologies as well as competing products. This study provides a detailed, step-by-step method for determining uncertainty in lumen measurements, developed in close coordination with related standards efforts and key industry experts. This report uses the structure proposed in the Guide to the Expression of Uncertainty in Measurement (GUM) for evaluating and expressing uncertainty in measurements. The steps of the procedure are described, and a spreadsheet format adapted for integrating sphere and goniophotometric uncertainty measurements is provided for entering parameters, ordering the information, calculating intermediate values, and, finally, obtaining expanded uncertainties. On this basis, and by examining each step of the photometric measurement and calibration methods, mathematical uncertainty models are developed. Determination of estimated values of input variables is discussed. Guidance is provided for evaluating the standard uncertainties of each input estimate, the covariances associated with input estimates, and the calculation of the measurement results. From these, the combined uncertainty of the measurement results and, finally, the expanded uncertainty can be determined.
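
    For reference, the GUM structure that the report follows combines the standard uncertainties u(x_i) of the input estimates of a measurement model y = f(x_1, ..., x_N) through the law of propagation of uncertainty, and scales the result by a coverage factor:

      u_c^2(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^2 u^2(x_i)
                 + 2 \sum_{i=1}^{N-1} \sum_{j=i+1}^{N}
                   \frac{\partial f}{\partial x_i} \frac{\partial f}{\partial x_j} \, u(x_i, x_j),
      \qquad U = k \, u_c(y)

    where u(x_i, x_j) are the covariances between input estimates and a coverage factor of k = 2 corresponds to roughly 95% coverage.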

  6. Data Availability in Appliance Standards and Labeling Program Development and Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romankiewicz, John; Khanna, Nina; Vine, Edward

    2013-05-01

    In this report, we describe the necessary data inputs for both standards development and program evaluation and perform an initial assessment of the availability and uncertainty of those data inputs in China. For standards development, we find that China and its standards and labeling program administrators currently have access to the basic market and technical data needed for conducting market and technology assessments and technological and economic analyses. Some data, such as shipments data, are readily available from the China Energy Label product registration database, while the availability of other data, including average unit energy consumption, prices, and design options, needs improvement. Unlike in some other countries such as the United States, most of the data needed for standards development analyses are not publicly available or compiled in a consolidated data source. In addition, improved data on design and efficiency options as well as cost data (e.g., manufacturing costs, mark-ups, production and product use-phase costs) – key inputs to several techno-economic analyses – are particularly needed given China's unconsolidated manufacturing industry. For program evaluation, we find that while China can conduct simple savings evaluations of its incentive programs with the data currently available from the Ministry of Finance – the program administrator – the savings estimates produced by such an evaluation will carry high uncertainty. As such, China could benefit from increased surveying and metering in the next one to three years to decrease the uncertainty surrounding key data points such as unit energy savings and free ridership.

  7. Uncertainty Measurement for Trace Element Analysis of Uranium and Plutonium Samples by Inductively Coupled Plasma-Atomic Emission Spectrometry (ICP-AES) and Inductively Coupled Plasma-Mass Spectrometry (ICP-MS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallimore, David L.

    2012-06-13

    The measurement uncertainty estimation associated with trace element analysis of impurities in U and Pu was evaluated using the Guide to the Expression of Uncertainty in Measurement (GUM). In this evaluation, the uncertainty sources were identified and the standard uncertainties of the components were categorized as either Type A or Type B. The combined standard uncertainty was calculated, and a coverage factor k = 2 was applied to obtain the expanded uncertainty, U. The ICP-AES and ICP-MS methods used were developed for the multi-element analysis of U and Pu samples. A typical analytical run consists of standards, process blanks, samples, matrix-spiked samples, post-digestion spiked samples, and independent calibration verification standards. The uncertainty estimation was performed on U and Pu samples that had been analyzed previously as part of the U and Pu Sample Exchange Programs. Control chart results and data from the U and Pu metal exchange programs were combined with the GUM approach into a concentration-dependent estimate of the expanded uncertainty. Trace element uncertainties obtained using this model were compared to those obtained for trace element results reported as part of the Exchange programs. This process was completed for all trace elements determined to be above the detection limit for the U and Pu samples.
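
    A minimal sketch of the arithmetic described above: root-sum-of-squares combination of the Type A and Type B components, followed by the k = 2 coverage factor. The component values below are hypothetical.

      # Combined and expanded uncertainty from Type A and Type B components
      # (GUM root-sum-of-squares; component values are hypothetical).
      import math

      type_a = [0.8, 0.3]        # e.g. repeatability, reproducibility (ug/g)
      type_b = [0.5, 0.2, 0.4]   # e.g. calibration, blank, spike recovery (ug/g)

      u_c = math.sqrt(sum(u**2 for u in type_a + type_b))  # combined standard
      U = 2 * u_c                                          # expanded, k = 2
      print(f"u_c = {u_c:.2f} ug/g, U (k=2) = {U:.2f} ug/g")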

  8. Life support technology investment strategies for flight programs: An application of decision analysis

    NASA Technical Reports Server (NTRS)

    Schlater, Nelson J.; Simonds, Charles H.; Ballin, Mark G.

    1993-01-01

    Applied research and technology development (R&TD) is often characterized by uncertainty, risk, and significant delays before tangible returns are obtained. Given the increased awareness of limitations in resources, effective R&TD today needs a method for up-front assessment of competing technologies to help guide technology investment decisions. Such an assessment approach must account for uncertainties in system performance parameters, mission requirements and architectures, and internal and external events influencing a development program. The methodology known as decision analysis has the potential to address these issues. It was evaluated by performing a case study assessment of alternative carbon dioxide removal technologies for NASA's proposed First Lunar Outpost program. An approach was developed that accounts for the uncertainties in each technology's cost and performance parameters as well as programmatic uncertainties such as mission architecture. Life cycle cost savings relative to a baseline, adjusted for the cost of money, was used as a figure of merit to evaluate each of the alternative carbon dioxide removal technology candidates. The methodology was found to provide a consistent decision-making strategy for the development of new life support technology. The case study results provided insight that was not possible from more traditional analysis approaches.

  9. Multi-criteria group decision making for evaluating the performance of e-waste recycling programs under uncertainty.

    PubMed

    Wibowo, Santoso; Deng, Hepu

    2015-06-01

    This paper presents a multi-criteria group decision making approach for effectively evaluating the performance of e-waste recycling programs under uncertainty in an organization. Intuitionistic fuzzy numbers are used to adequately represent the subjective and imprecise assessments of the decision makers in evaluating the relative importance of the evaluation criteria and the performance of individual e-waste recycling programs with respect to individual criteria in a given situation. An interactive fuzzy multi-criteria decision making algorithm is developed for facilitating consensus building in a group decision making environment, to ensure that the interests of all individual decision makers are appropriately considered in evaluating alternative e-waste recycling programs with respect to their corporate sustainability performance. The developed algorithm is then incorporated into a multi-criteria decision support system that makes the overall performance evaluation process effective and simple to use. Such a multi-criteria decision making system provides organizations with a proactive mechanism for incorporating the concept of corporate sustainability into their regular planning decisions and business practices. An example is presented to demonstrate the applicability of the proposed approach in evaluating the performance of e-waste recycling programs in organizations. Copyright © 2015 Elsevier Ltd. All rights reserved.
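
    One standard way to aggregate intuitionistic fuzzy assessments of this kind is the intuitionistic fuzzy weighted averaging (IFWA) operator. The sketch below illustrates that operator on hypothetical ratings and weights; it is not necessarily the authors' exact algorithm.

      # Intuitionistic fuzzy weighted averaging (IFWA) of (mu, nu) ratings,
      # a common aggregation operator; ratings and weights are hypothetical.
      import numpy as np

      mu = np.array([0.7, 0.5, 0.6])   # membership (support) per criterion
      nu = np.array([0.2, 0.3, 0.3])   # non-membership (opposition)
      w  = np.array([0.5, 0.3, 0.2])   # criterion weights, sum to 1

      agg_mu = 1 - np.prod((1 - mu) ** w)   # aggregated membership
      agg_nu = np.prod(nu ** w)             # aggregated non-membership
      print(f"aggregated rating: ({agg_mu:.3f}, {agg_nu:.3f})")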

  10. Development and status of data quality assurance program at NASA Langley Research Center: Toward national standards

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.

    1996-01-01

    As part of a continuing effort to re-engineer the wind tunnel testing process, a comprehensive data quality assurance program is being established at NASA Langley Research Center (LaRC). The ultimate goal of the program is the routine provision of tunnel-to-tunnel reproducibility with total uncertainty levels acceptable for test and evaluation of civilian transports. The operational elements for reaching such levels of reproducibility are: (1) statistical control, which provides long-term measurement uncertainty predictability and a base for continuous improvement; (2) measurement uncertainty prediction, which provides test designs that can meet data quality expectations given the system's predictable variation; and (3) national standards, which provide a means for resolving tunnel-to-tunnel differences. The paper presents the LaRC design for the program and discusses the process of implementation.

  11. Kiwi: An Evaluated Library of Uncertainties in Nuclear Data and Package for Nuclear Sensitivity Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pruet, J

    2007-06-23

    This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. As well, Kiwi provides access to calculations of k-eigenvalues for critical assemblies. This allows the user to check the implications of data modifications against integral experiments for multiplying systems. Kiwi is written in Python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B Division.

  12. Evaluative Research in Corrections: The Uncertain Road.

    ERIC Educational Resources Information Center

    Adams, Stuart

    Martinson's provocative article in Public Interest (Spring, 1974), denying efficacy in prisoner reform, singled out one of the uncertainties in correctional research. In their totality, these uncertainties embrace not only rehabilitative programs but also the method, theory, and organization of correctional research. To comprehend the status and…

  13. Overview of the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications were not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.

  14. An application of multiattribute decision analysis to the Space Station Freedom program. Case study: Automation and robotics technology evaluation

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Levin, Richard R.; Carpenter, Elisabeth J.

    1990-01-01

    An application of multiattribute analysis to the evaluation of high-leverage prototyping technologies in the automation and robotics (A and R) areas that might contribute to the Space Station (SS) Freedom baseline design is described. An implication is that high-leverage prototyping benefits the SS Freedom Program as a means of transferring technology from the advanced development program to the baseline program. The process also highlights the tradeoffs to be made between subsidizing high-value, low-risk technology developments versus high-value, high-risk technology developments. Twenty-one A and R technology tasks spanning a diverse array of technical concepts were evaluated using multiattribute decision analysis. Because of the large uncertainties associated with characterizing the technologies, the methodology was modified to incorporate uncertainty. Eight attributes affected the rankings, among them initial cost, operations cost, crew productivity, safety, resource requirements, growth potential, and spinoff potential. The four attributes of initial cost, operations cost, crew productivity, and safety affected the rankings the most.

  15. Accounting for Uncertainty and Time Lags in Equivalency Calculations for Offsetting in Aquatic Resources Management Programs

    NASA Astrophysics Data System (ADS)

    Bradford, Michael J.

    2017-10-01

    Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.

  16. Accounting for Uncertainty and Time Lags in Equivalency Calculations for Offsetting in Aquatic Resources Management Programs.

    PubMed

    Bradford, Michael J

    2017-10-01

    Biodiversity offset programs attempt to minimize unavoidable environmental impacts of anthropogenic activities by requiring offsetting measures in sufficient quantity to counterbalance losses due to the activity. Multipliers, or offsetting ratios, have been used to increase the amount of offsets to account for uncertainty but those ratios have generally been derived from theoretical or ad-hoc considerations. I analyzed uncertainty in the offsetting process in the context of offsetting for impacts to freshwater fisheries productivity. For aquatic habitats I demonstrate that an empirical risk-based approach for evaluating prediction uncertainty is feasible, and if data are available appropriate adjustments to offset requirements can be estimated. For two data-rich examples I estimate multipliers in the range of 1.5:1 - 2.5:1 are sufficient to account for the uncertainty in the prediction of gains and losses. For aquatic habitats adjustments for time delays in the delivery of offset benefits can also be calculated and are likely smaller than those for prediction uncertainty. However, the success of a biodiversity offsetting program will also depend on the management of the other components of risk not addressed by these adjustments.

  17. Benchmark Evaluation of HTR-PROTEUS Pebble Bed Experimental Program

    DOE PAGES

    Bess, John D.; Montierth, Leland; Köberl, Oliver; ...

    2014-10-09

    Benchmark models were developed to evaluate 11 critical core configurations of the HTR-PROTEUS pebble bed experimental program. Various additional reactor physics measurements were performed as part of this program; currently only a total of 37 absorber rod worth measurements have been evaluated as acceptable benchmark experiments, for Cores 4, 9, and 10. Dominant uncertainties in the experimental keff for all core configurations come from uncertainties in the ²³⁵U enrichment of the fuel, impurities in the moderator pebbles, and the density and impurity content of the radial reflector. Calculations of keff with MCNP5 and ENDF/B-VII.0 neutron nuclear data are greater than the benchmark values but within 1% and also within the 3σ uncertainty, except for Core 4, which is the only randomly packed pebble configuration. Repeated calculations of keff with MCNP6.1 and ENDF/B-VII.1 are lower than the benchmark values and within 1% (~3σ), except for Cores 5 and 9, which calculate lower than the benchmark eigenvalues but within 4σ. The primary difference between the two nuclear data libraries is the adjustment of the absorption cross section of graphite. Simulations of the absorber rod worth measurements are within 3σ of the benchmark experiment values. The complete benchmark evaluation details are available in the 2014 edition of the International Handbook of Evaluated Reactor Physics Benchmark Experiments.

  18. A Conceptual Methodology for Assessing Acquisition Requirements Robustness against Technology Uncertainties

    NASA Astrophysics Data System (ADS)

    Chou, Shuo-Ju

    2011-12-01

    In recent years the United States has shifted from a threat-based acquisition policy that developed systems for countering specific threats to a capabilities-based strategy that emphasizes the acquisition of systems that provide critical national defense capabilities. This shift in policy, in theory, allows for the creation of an "optimal force" that is robust against current and future threats regardless of the tactics and scenarios involved. In broad terms, robustness can be defined as the insensitivity of an outcome to "noise," or non-controlled variables. Within this context, the outcome is the successful achievement of defense strategies, and the noise variables are the tactics and scenarios associated with current and future enemies. Unfortunately, a lack of system capability, budget, and schedule robustness against technology performance and development uncertainties has led to major setbacks in recent acquisition programs. This lack of robustness stems from the fact that immature technologies have uncertainties in their expected performance, development cost, and schedule that cause variations in system effectiveness and in program development budget and schedule requirements. Unfortunately, the Technology Readiness Assessment (TRA) process currently used by acquisition program managers and decision-makers to measure technology uncertainty at critical program decision junctions does not adequately capture the impact of technology performance and development uncertainty on program capability and development metrics. The Technology Readiness Level metric employed by the TRA to describe the uncertainties of program technology elements provides only a qualitative, non-descriptive estimate of those uncertainties. In order to assess program robustness, specifically requirements robustness, against technology performance and development uncertainties, a new process is needed. This process should give acquisition program managers and decision-makers the ability to assess or measure the robustness of program requirements against such uncertainties. A literature review of techniques for forecasting technology performance and development uncertainties and their subsequent impacts on capability, budget, and schedule requirements led to the conclusion that an analysis process coupling a probabilistic technique such as Monte Carlo simulation with quantitative, parametric models of technology performance impact and technology development time and cost requirements would allow the probabilities of meeting specific constraints on these requirements to be established. These probability-of-requirements-success metrics can then be used as a quantitative, probabilistic measure of program requirements robustness against technology uncertainties. Combined with a multi-objective genetic algorithm optimization process and a computer-based decision support system, critical information regarding requirements robustness against technology uncertainties can be captured and quantified for acquisition decision-makers. This results in a more informed and justifiable selection of program technologies during initial program definition, as well as in the formulation of program development and risk management strategies.
    To meet the stated research objective, the ENhanced TEchnology Robustness Prediction and RISk Evaluation (ENTERPRISE) methodology was formulated to provide a structured and transparent process for integrating these enabling techniques into a probabilistic and quantitative assessment of acquisition program requirements robustness against technology performance and development uncertainties. To demonstrate the capabilities of the ENTERPRISE method and test the research hypotheses, a demonstration application of the method was performed on a notional program for acquiring a carrier-based Suppression of Enemy Air Defenses (SEAD) capability using Unmanned Combat Aircraft Systems (UCAS) and their enabling technologies. The results of this implementation provided valuable insights regarding the benefits and inner workings of the methodology, as well as its limitations, which should be addressed in the future to narrow the gap between the current state and the desired state.
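
    The probability-of-requirements-success metric described above can be sketched as a Monte Carlo calculation: sample technology performance, development cost, and schedule from assumed distributions and count the fraction of draws that meets every constraint. All distributions and limits below are hypothetical.

      # Requirements robustness as probability of meeting all constraints,
      # estimated by Monte Carlo (distributions and limits are hypothetical).
      import numpy as np

      rng = np.random.default_rng(seed=7)
      n = 200_000

      perf = rng.normal(1.00, 0.08, n)              # normalized capability delivered
      cost = rng.lognormal(np.log(95), 0.15, n)     # development cost, $M
      time = rng.normal(48, 6, n)                   # development schedule, months

      success = (perf >= 0.95) & (cost <= 110) & (time <= 54)
      print(f"P(requirements met) = {success.mean():.2f}")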

  19. Manpower Evaluations: Vulnerable but Useful

    ERIC Educational Resources Information Center

    Killingsworth, Charles C.

    1975-01-01

    Most of the evaluations of institutional training under the Manpower Development and Training Act are highly favorable. Negative criticisms, however, emphasize the uncertainties in these studies and the displacement effects of the programs. The article answers these criticisms. (MW)

  1. 7 CFR 3415.15 - Evaluation factors.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... AGRICULTURE BIOTECHNOLOGY RISK ASSESSMENT RESEARCH GRANTS PROGRAM Scientific Peer Review of Research Grant... criteria are specified in the annual program solicitation: (a) Scientific merit of the proposal. (1... uncertainty for United States agriculture. (1) Scientific contribution of research in leading to important...

  2. 7 CFR 3415.15 - Evaluation factors.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... AGRICULTURE BIOTECHNOLOGY RISK ASSESSMENT RESEARCH GRANTS PROGRAM Scientific Peer Review of Research Grant... criteria are specified in the annual program solicitation: (a) Scientific merit of the proposal. (1... uncertainty for United States agriculture. (1) Scientific contribution of research in leading to important...

  3. 7 CFR 3415.15 - Evaluation factors.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... AGRICULTURE BIOTECHNOLOGY RISK ASSESSMENT RESEARCH GRANTS PROGRAM Scientific Peer Review of Research Grant... criteria are specified in the annual program solicitation: (a) Scientific merit of the proposal. (1... uncertainty for United States agriculture. (1) Scientific contribution of research in leading to important...

  4. Hierarchical Bayesian Model Averaging for Chance Constrained Remediation Designs

    NASA Astrophysics Data System (ADS)

    Chitsazan, N.; Tsai, F. T.

    2012-12-01

    Groundwater remediation designs rely heavily on simulation models, which are subject to various sources of uncertainty in their predictions. To develop a robust remediation design, it is crucial to understand the effect of the uncertainty sources. In this research, we introduce a hierarchical Bayesian model averaging (HBMA) framework to segregate and prioritize sources of uncertainty in a multi-layer framework, where each layer targets one source of uncertainty. The HBMA framework provides insight into uncertainty priorities and propagation. In addition, HBMA allows evaluating model weights at different hierarchy levels and assessing the relative importance of the models at each level. To account for uncertainty, we employ chance-constrained (CC) programming for stochastic remediation design. Chance-constrained programming has traditionally been implemented to account for parameter uncertainty. Recently, many studies have suggested that model structure uncertainty is not negligible compared to parameter uncertainty. Using chance-constrained programming along with HBMA can provide a rigorous tool for groundwater remediation design under uncertainty. In this research, HBMA-CC was applied to a remediation design in a synthetic aquifer. The design was to develop a scavenger well approach to mitigate saltwater intrusion toward production wells. HBMA was employed to assess uncertainties from model structure, parameter estimation, and kriging interpolation. An improved harmony search optimization method was used to find the optimal location of the scavenger well. We evaluated prediction variances of chloride concentration at the production wells through the HBMA framework. The results showed that choosing the single best model may lead to a significant error in evaluating prediction variances, for two reasons. First, with the single best model, variances that stem from uncertainty in the model structure are ignored. Second, choosing a best model with a non-dominant model weight may underestimate or overestimate prediction variances by ignoring other plausible propositions. Chance constraints allow developing a remediation design with a desired reliability. However, with the single best model, the calculated reliability will differ from the desired reliability. We calculated the reliability of the design for the models at different levels of HBMA. The results showed that, moving toward the top layers of HBMA, the calculated reliability converges to the chosen reliability. We employed chance-constrained optimization along with the HBMA framework to find the optimal location and pumpage for the scavenger well. The results showed that, using models at different levels of the HBMA framework, the optimal location of the scavenger well remained the same, but the optimal extraction rate was altered. Thus, we concluded that the optimal pumping rate was sensitive to the prediction variance. Also, the prediction variance changed with the extraction rate; a very high extraction rate drives the prediction variances of chloride concentration at the production wells toward zero regardless of which HBMA models are used.
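
    In generic form, a chance-constrained design problem of the kind used here requires each uncertain constraint to hold with at least a chosen reliability β (a sketch of the general shape, not the paper's exact formulation):

      \min_x \; \mathrm{cost}(x)
      \quad \text{s.t.} \quad
      \Pr_{\theta} \left[ \, C(x, \theta) \le C_{\max} \, \right] \ge \beta

    where x collects the design variables (e.g., scavenger well location and pumping rate), θ the uncertain model inputs, and C(x, θ) a constrained prediction such as chloride concentration at a production well.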

  5. Rainfall Product Evaluation for the TRMM Ground Validation Program

    NASA Technical Reports Server (NTRS)

    Amitai, E.; Wolff, D. B.; Robinson, M.; Silberstein, D. S.; Marks, D. A.; Kulie, M. S.; Fisher, B.; Einaudi, Franco (Technical Monitor)

    2000-01-01

    Evaluation of the Tropical Rainfall Measuring Mission (TRMM) satellite observations is conducted through a comprehensive Ground Validation (GV) Program. Standardized instantaneous and monthly rainfall products are routinely generated using quality-controlled ground-based radar data from four primary GV sites. As part of the TRMM GV program, an effort is being made to evaluate these GV products and to determine the uncertainties of the rainfall estimates. The evaluation effort is based on comparison to rain gauge data. The variance between the gauge measurement and the true averaged rain amount within the radar pixel is a limiting factor in the evaluation process. While monthly estimates are relatively simple to evaluate, the evaluation of the instantaneous products is much more of a challenge. Scattergrams of point comparisons between radar and rain gauges are extremely noisy for several reasons (e.g., sample volume discrepancies, timing and navigation mismatches, variability of Z_e-R relationships) and are therefore of little use for evaluating the estimates. Several alternative methods, such as analysis of the distribution of rain volume by rain rate as derived from gauge intensities and from reflectivities above the gauge network, will be presented. Alternative procedures to increase the accuracy of the estimates and to reduce their uncertainties will also be discussed.

  6. Conflicting Expertise and Uncertainty: Quality Assurance in High-Level Radioactive Waste Management.

    ERIC Educational Resources Information Center

    Fitzgerald, Michael R.; McCabe, Amy Snyder

    1991-01-01

    Dynamics of a large, expensive, and controversial surface and underground evaluation of a radioactive waste management program at the Yucca Mountain site are reviewed. The use of private contractors in the quality assurance study complicates the evaluation. This case study illustrates high-stakes evaluation problems. (SLD)

  7. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use

    ERIC Educational Resources Information Center

    Patton, Michael Quinn

    2010-01-01

    Developmental evaluation (DE) offers a powerful approach to monitoring and supporting social innovations by working in partnership with program decision makers. In this book, Patton, an eminent authority on evaluation, shows how to conduct evaluations within a DE framework, drawing on insights about complex dynamic systems, uncertainty, nonlinearity, and emergence. He…

  8. Modelling of the X,Y,Z positioning errors and uncertainty evaluation for the LNE’s mAFM using the Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ceria, Paul; Ducourtieux, Sebastien; Boukellal, Younes; Allard, Alexandre; Fischer, Nicolas; Feltin, Nicolas

    2017-03-01

    In order to evaluate the uncertainty budget of the LNE's mAFM, a reference instrument dedicated to the calibration of nanoscale dimensional standards, a numerical model has been developed to evaluate the measurement uncertainty of the metrology loop involved in the XYZ positioning of the tip relative to the sample. The objective of this model is to overcome difficulties experienced when trying to evaluate uncertainty components that cannot be determined experimentally, more specifically those linked to the geometry of the metrology loop. The model is based on object-oriented programming and developed under Matlab. It integrates one hundred parameters that allow control of the geometry of the metrology loop without using analytical formulae. The created objects, mainly the reference and mobile prisms and their mirrors, and the interferometers and their laser beams, can be moved and deformed freely to take several error sources into account. The Monte Carlo method is then used to determine the positioning uncertainty of the instrument by randomly drawing the parameters according to their associated tolerances and probability density functions (PDFs). The whole process follows Supplement 2 of the 'Guide to the Expression of Uncertainty in Measurement' (GUM). Advanced statistical tools such as Morris designs and Sobol indices are also used to provide a sensitivity analysis, identifying the most influential parameters and quantifying their contribution to the XYZ positioning uncertainty. The approach validated in the paper shows that the actual positioning uncertainty is about 6 nm. As the final objective is to reach 1 nm, we discuss the most effective ways to reduce the uncertainty.
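
    Monte Carlo propagation in the spirit of GUM Supplement 2 amounts to drawing the model parameters from their PDFs, evaluating the positioning error for each draw, and reading the standard uncertainty and a coverage interval off the resulting distribution. The toy geometry model below (mirror-tilt cosine error plus interferometer dead path, with hypothetical tolerances) illustrates only the mechanics, not the actual mAFM model.

      # Monte Carlo propagation in the spirit of GUM Supplement 2
      # (toy geometry model; parameters and tolerances are hypothetical).
      import numpy as np

      rng = np.random.default_rng(seed=3)
      n = 1_000_000
      L = 50e-6                                      # nominal displacement, m

      angle = rng.normal(0.0, 50e-6, n)              # mirror tilt, rad
      dead_path = rng.uniform(-2e-9, 2e-9, n)        # interferometer dead path, m

      x_error = L * (1 - np.cos(angle)) + dead_path  # cosine error + dead path
      u = x_error.std()                              # standard uncertainty
      lo, hi = np.percentile(x_error, [2.5, 97.5])   # 95% coverage interval
      print(f"u = {u*1e9:.2f} nm, "
            f"95% interval = [{lo*1e9:.2f}, {hi*1e9:.2f}] nm")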

  9. Impact of Parameter Uncertainty Assessment of Critical SWAT Output Simulations

    USDA-ARS?s Scientific Manuscript database

    Watershed models are increasingly being utilized to evaluate alternate management scenarios for improving water quality. The concern for using these tools in extensive programs such as the National Total Maximum Daily Load (TMDL) program is that the certainty of model results and efficacy of managem...

  10. A Computer Program to Evaluate Timber Production Investments Under Uncertainty

    Treesearch

    Dennis L. Schweitzer

    1968-01-01

    A computer program has been written in Fortran IV to calculate probability distributions of present worths of investments in timber production. Inputs can include both point and probabilistic estimates of future costs, prices, and yields. Distributions of rates of return can also be constructed.
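
    The cited program is Fortran IV, but the underlying calculation, sampling uncertain costs, prices, and yields to build a distribution of present worth, can be sketched in modern terms. All economic figures below are hypothetical.

      # Distribution of present worth of a timber investment via Monte Carlo
      # (modern sketch of the idea; all economic figures are hypothetical).
      import numpy as np

      rng = np.random.default_rng(seed=11)
      n = 100_000
      r, horizon = 0.05, 30                        # discount rate, years

      cost0 = rng.triangular(180, 200, 240, n)     # planting cost, $/acre
      yield_mbf = rng.normal(8.0, 1.5, n)          # harvest yield, MBF/acre
      price = rng.lognormal(np.log(250), 0.25, n)  # stumpage price, $/MBF

      pw = price * yield_mbf / (1 + r) ** horizon - cost0
      print(f"mean PW = ${pw.mean():.0f}/acre, "
            f"P(PW > 0) = {(pw > 0).mean():.2f}")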

  11. Evaluative Research of the Mentoring Process of the PGDT, with Particular Reference to Cluster Centers under Jimma University Facilitation

    ERIC Educational Resources Information Center

    Tegegne, Worku Fentie; Gelaneh, Alebachew Hailu

    2015-01-01

    The objective of the study is to evaluate the mentoring process of the PGDT program, which was under the supervision of Jimma University in the regional states of Oromia and SNNP, Ethiopia. The overall intention was to see whether the program was proceeding as expected, because there was uncertainty regarding its proper running as it was…

  12. A fuzzy chance-constrained programming model with type 1 and type 2 fuzzy sets for solid waste management under uncertainty

    NASA Astrophysics Data System (ADS)

    Ma, Xiaolin; Ma, Chi; Wan, Zhifang; Wang, Kewei

    2017-06-01

    Effective management of municipal solid waste (MSW) is critical for urban planning and development. This study aims to develop an integrated type 1 and type 2 fuzzy sets chance-constrained programming (ITFCCP) model for tackling regional MSW management problems under a fuzzy environment, where waste generation amounts are treated as type 2 fuzzy variables and the treatment capacities of facilities as type 1 fuzzy variables. This evaluation and expression of uncertainty overcomes the drawback of describing fuzzy possibility distributions in oversimplified forms. The fuzzy constraints are converted to their crisp equivalents through chance-constrained programming under the same or different confidence levels. Regional waste management for the City of Dalian, China, was used as a demonstration case study. The solutions under various confidence levels reflect the trade-off between system economy and reliability. It is concluded that the ITFCCP model is capable of helping decision makers generate reasonable waste-allocation alternatives under uncertainty.

  13. Probabilistic simulation of uncertainties in thermal structures

    NASA Technical Reports Server (NTRS)

    Chamis, Christos C.; Shiao, Michael

    1990-01-01

    Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment; and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.

  14. Chapter 6: Residential Lighting Evaluation Protocol. The Uniform Methods Project: Methods for Determining Energy Efficiency Savings for Specific Measures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurnik, Charles W; Dimetrosky, Scott; Parkinson, Katie

    Given new regulations, increased complexity in the market, and the general shift from CFLs to LEDs, this evaluation protocol was updated in 2017 to refocus it on LEDs rather than CFLs and to resolve evaluation uncertainties affecting residential lighting incentive programs.

  15. Recent developments in measurement and evaluation of FAC damage in power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garud, Y.S.; Besuner, P.; Cohn, M.J.

    1999-11-01

    This paper describes some recent developments in the measurement and evaluation of flow-accelerated corrosion (FAC) damage in power plants. The evaluation focuses on data checking and smoothing to account for gross errors, noise, and uncertainty in wall thickness measurements from ultrasonic or pulsed eddy-current data. Also, the evaluation method utilizes advanced regression analysis of the spatial and temporal evolution of the wall loss, providing statistically robust predictions of wear rates and the associated uncertainty. Results of the application of these new tools are presented for several components in actual service. More importantly, the practical implications of using these advances are discussed in relation to the likely impact on the scope and effectiveness of FAC-related inspection programs.

  16. Predicting long-range transport: a systematic evaluation of two multimedia transport models.

    PubMed

    Bennett, D H; Scheringer, M; McKone, T E; Hungerbühler, K

    2001-03-15

    The United Nations Environment Program has recently developed criteria to identify and restrict chemicals with a potential for persistence and long-range transport (persistent organic pollutants or POPs). There are many stakeholders involved, and the issues are not only scientific but also include social, economic, and political factors. This work focuses on one aspect of the POPs debate, the criteria for determining the potential for long-range transport (LRT). Our goal is to determine if current models are reliable enough to support decisions that classify a chemical based on the LRT potential. We examine the robustness of two multimedia fate models for determining the relative ranking and absolute spatial range of various chemicals in the environment. We also consider the effect of parameter uncertainties and the model uncertainty associated with the selection of an algorithm for gas-particle partitioning on the model results. Given the same chemical properties, both models give virtually the same ranking. However, when chemical parameter uncertainties and model uncertainties such as particle partitioning are considered, the spatial range distributions obtained for the individual chemicals overlap, preventing a distinct rank order. The absolute values obtained for the predicted spatial range or travel distance differ significantly between the two models for the uncertainties evaluated. We find that to evaluate a chemical when large and unresolved uncertainties exist, it is more informative to use two or more models and include multiple types of uncertainty. Model differences and uncertainties must be explicitly confronted to determine how the limitations of scientific knowledge impact predictions in the decision-making process.

  17. Cost and Economics for Advanced Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Whitfield, Jeff

    1998-01-01

    Market sensitivity and weight-based cost estimating relationships are key drivers in determining the financial viability of advanced space launch vehicle designs. Due to decreasing space transportation budgets and increasing foreign competition, it has become essential for financial assessments of prospective launch vehicles to be performed during the conceptual design phase. As part of this financial assessment, it is imperative to understand the relationship between market volatility, the uncertainty of weight estimates, and the economic viability of an advanced space launch vehicle program. This paper reports the results of a study that evaluated the economic risk inherent in market variability and the uncertainty of developing weight estimates for an advanced space launch vehicle program. The purpose of this study was to determine the sensitivity of a business case for advanced space flight design with respect to the changing nature of market conditions and the complexity of determining accurate weight estimations during the conceptual design phase. The expected uncertainty associated with these two factors drives the economic risk of the overall program. The study incorporates Monte Carlo simulation techniques to determine the probability of attaining specific levels of economic performance when the market and weight parameters are allowed to vary. This structured approach toward uncertainties allows for the assessment of risks associated with a launch vehicle program's economic performance. This results in the determination of the value of the additional risk placed on the project by these two factors.

  18. 7 CFR 3415.15 - Evaluation factors.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., AND EXTENSION SERVICE, DEPARTMENT OF AGRICULTURE BIOTECHNOLOGY RISK ASSESSMENT RESEARCH GRANTS PROGRAM...) Novelty, uniqueness and originality; and (7) Appropriateness to regulation of biotechnology and risk... solving biotechnology regulatory uncertainty for United States agriculture. (1) Scientific contribution of...

  19. Uncertainty of Acute Stroke Patients: A Cross-sectional Descriptive and Correlational Study.

    PubMed

    Ni, Chunping; Peng, Jing; Wei, Yuanyuan; Hua, Yan; Ren, Xiaoran; Su, Xiangni; Shi, Ruijie

    2018-06-12

    Uncertainty is a chronic and pervasive source of psychological distress for patients and plays an important role in the rehabilitation of stroke survivors. Little is known about the level and correlates of uncertainty among patients in the acute phase of stroke. The purposes of this study were to describe the uncertainty of patients in the acute phase of stroke and to explore the patient characteristics associated with that uncertainty. A cross-sectional descriptive and correlational study was conducted with a convenience sample of 451 consecutive hospitalized acute stroke patients recruited from the neurology departments of 2 general hospitals in China. Uncertainty was measured using the Chinese version of the Mishel Uncertainty in Illness Scale for Adults on the fourth day of the patients' admission. The patients had moderately high Mishel Uncertainty in Illness Scale for Adults scores (mean [SD], 74.37 [9.22]) in the acute phase of stroke. Totals of 95.2% and 2.9% of patients reported moderate and high levels of uncertainty, respectively. The mean (SD) score for ambiguity (3.05 [0.39]) was higher than that for complexity (2.88 [0.52]). Each of the following characteristics was independently associated with greater uncertainty: functional status (P = .000), suffering from other chronic diseases (P = .000), time since the first-ever stroke (P = .000), self-evaluated economic pressure (P = .000), family monthly income (P = .001), educational level (P = .006), and self-evaluated severity of disease (P = .000). Patients experienced persistently and moderately high uncertainty in the acute phase of stroke. Ameliorating uncertainty should be an integral part of the rehabilitation program. A better understanding of uncertainty and its associated characteristics may help nurses identify the patients at highest risk, who may benefit from targeted interventions.

  20. Uncertain programming models for portfolio selection with uncertain returns

    NASA Astrophysics Data System (ADS)

    Zhang, Bo; Peng, Jin; Li, Shengguo

    2015-10-01

    In an indeterminate economic environment, experts' knowledge about the returns of securities involves much uncertainty rather than randomness. This paper discusses the portfolio selection problem in an uncertain environment in which security returns cannot be well reflected by historical data but can be evaluated by experts. In the paper, returns of securities are assumed to be given by uncertain variables. According to various decision criteria, the portfolio selection problem in an uncertain environment is formulated as an expected-variance-chance model and a chance-expected-variance model using uncertain programming. Within the framework of uncertainty theory, and for convenience in solving the models, some crisp equivalents are discussed under different conditions. In addition, a hybrid intelligent algorithm is designed to provide a general method for solving the new models in general cases. Finally, two numerical examples are provided to show the performance and applications of the models and algorithm.
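
    As a sketch of the general shape of such models (not necessarily the paper's exact formulation), an expected-variance-chance portfolio model over uncertain returns ξ_i and portfolio weights x_i can be written:

      \max_{x} \; E\left[ \sum_{i} \xi_i x_i \right]
      \quad \text{s.t.} \quad
      V\left[ \sum_{i} \xi_i x_i \right] \le v, \quad
      \mathcal{M}\left\{ \sum_{i} \xi_i x_i \ge r \right\} \ge \alpha, \quad
      \sum_{i} x_i = 1, \; x_i \ge 0

    where M is the uncertain (chance) measure of the event, v caps the allowed variance, and α is the required confidence that the return reaches the target r.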

  1. Creating Extension Programs for Change: Forest Landowners and Climate Change Communication

    ERIC Educational Resources Information Center

    Krantz, Shelby; Monroe, Martha; Bartels, Wendy-Lin

    2013-01-01

    The Cooperative Extension Service in the United States can play an important role in educating forest landowners to improve forest resilience in the face of climatic uncertainty. Two focus groups in Florida informed the development of a program that was conducted in Leon County; presurveys, postsurveys, and observation provided evaluation data.…

  2. A Structural Evaluation of a Large-Scale Quasi-Experimental Microfinance Initiative

    PubMed Central

    Kaboski, Joseph P.; Townsend, Robert M.

    2010-01-01

    This paper uses a structural model to understand, predict, and evaluate the impact of an exogenous microcredit intervention program, the Thai Million Baht Village Fund program. We model household decisions in the face of borrowing constraints, income uncertainty, and high-yield indivisible investment opportunities. After estimation of parameters using pre-program data, we evaluate the model’s ability to predict and interpret the impact of the village fund intervention. Simulations from the model mirror the data in yielding a greater increase in consumption than credit, which is interpreted as evidence of credit constraints. A cost-benefit analysis using the model indicates that some households value the program much more than its per household cost, but overall the program costs 20 percent more than the sum of these benefits. PMID:22162594

  3. A Structural Evaluation of a Large-Scale Quasi-Experimental Microfinance Initiative.

    PubMed

    Kaboski, Joseph P; Townsend, Robert M

    2011-09-01

    This paper uses a structural model to understand, predict, and evaluate the impact of an exogenous microcredit intervention program, the Thai Million Baht Village Fund program. We model household decisions in the face of borrowing constraints, income uncertainty, and high-yield indivisible investment opportunities. After estimation of parameters using pre-program data, we evaluate the model's ability to predict and interpret the impact of the village fund intervention. Simulations from the model mirror the data in yielding a greater increase in consumption than credit, which is interpreted as evidence of credit constraints. A cost-benefit analysis using the model indicates that some households value the program much more than its per household cost, but overall the program costs 20 percent more than the sum of these benefits.

  4. Risk in fire management decisionmaking: techniques and criteria

    Treesearch

    Gail Blatternberger; William F. Hyde; Thomas J. Mills

    1984-01-01

    In the past, decisionmaking in wildland fire management generally has not included a full consideration of the risk and uncertainty that is inherent in evaluating alternatives. Fire management policies in some Federal land management agencies now require risk evaluation. The model for estimating the economic efficiency of fire program alternatives is the minimization...

  5. Study of aerodynamic technology for single-cruise-engine V/STOL fighter/attack aircraft

    NASA Technical Reports Server (NTRS)

    Hess, J. R.; Bear, R. L.

    1982-01-01

    A viable, single engine, supersonic V/STOL fighter/attack aircraft concept was defined. This vectored thrust, canard wing configuration utilizes an advanced technology separated flow engine with fan stream burning. The aerodynamic characteristics of this configuration were estimated and performance evaluated. Significant aerodynamic and aerodynamic propulsion interaction uncertainties requiring additional investigation were identified. A wind tunnel model concept and test program to resolve these uncertainties and validate the aerodynamic prediction methods were defined.

  6. Reliability and risk assessment of structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.

    1991-01-01

    Development of reliability and risk assessment of structural components and structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) the evaluation of the various uncertainties in terms of cumulative distribution functions for various structural response variables based on known or assumed uncertainties in primitive structural variables; (2) evaluation of the failure probability; (3) reliability and risk-cost assessment; and (4) an outline of an emerging approach for eventual certification of man-rated structures by computational methods. Collectively, the results demonstrate that the structural durability/reliability of man-rated structural components and structures can be effectively evaluated by using formal probabilistic methods.

  7. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  8. Probabilistic structural analysis methods of hot engine structures

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.

  9. JUPITER PROJECT - MERGING INVERSE PROBLEM FORMULATION TECHNOLOGIES

    EPA Science Inventory

    The JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) project seeks to enhance and build on the technology and momentum behind two of the most popular sensitivity analysis, data assessment, calibration, and uncertainty analysis programs used in envi...

  10. Linking trading ratio with TMDL (total maximum daily load) allocation matrix and uncertainty analysis.

    PubMed

    Zhang, H X

    2008-01-01

    An innovative approach for total maximum daily load (TMDL) allocation and implementation is watershed-based pollutant trading. Given the inherent scientific uncertainty in the tradeoffs between point and nonpoint sources, the setting of trading ratios can be a contentious issue and has already been listed as an obstacle by several pollutant trading programs. One of the fundamental reasons that a trading ratio is often set higher (e.g. greater than 2) is to allow for uncertainty in the level of control needed to attain water quality standards, and to provide a buffer in case traded reductions are less effective than expected. However, most of the available studies do not provide an approach that explicitly addresses the determination of the trading ratio, and uncertainty analysis has rarely been linked to it. This paper presents a practical methodology for estimating an "equivalent trading ratio (ETR)" and links uncertainty analysis with trading ratio determination in the TMDL allocation process. Determination of the ETR can provide a preliminary evaluation of tradeoffs between various combinations of point and nonpoint source control strategies on ambient water quality improvement. A greater portion of NPS load reduction in the overall TMDL load reduction generally correlates with greater uncertainty and thus requires a greater trading ratio. Rigorous quantification of the trading ratio will strengthen the scientific basis, and thus public perception, for more informed decisions in an overall watershed-based pollutant trading program. (c) IWA Publishing 2008.
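
    As a hedged arithmetic illustration of the role a trading ratio plays: the linear uncertainty premium below is invented for illustration and is not the paper's ETR derivation.

```python
# Hypothetical illustration: a trading ratio inflates the nonpoint-source (NPS)
# reduction needed to offset a point-source (PS) credit, and the premium grows
# with the NPS share of the overall TMDL load reduction.
def required_nps_reduction(ps_credit_kg, nps_share, base_ratio=1.0, premium=2.0):
    """NPS load reduction (kg) needed to certify a PS credit of ps_credit_kg."""
    trading_ratio = base_ratio + premium * nps_share  # more NPS share -> more uncertainty
    return trading_ratio * ps_credit_kg

for share in (0.2, 0.5, 0.8):
    print(f"NPS share {share:.0%}: {required_nps_reduction(100.0, share):.0f} kg "
          "of NPS reduction per 100 kg PS credit")
```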

  11. Toward Affordable Systems II: Portfolio Management for Army Science and Technology Programs Under Uncertainties

    DTIC Science & Technology

    2011-01-01

    Performing organization: RAND Corporation, Arroyo Center, PO Box 2138, 1776 Main Street, Santa Monica, CA 90407-2138. The report is addressed to managers of research, development, test, and evaluation programs and to those interested in the optimal allocation of funds among different programs.

  12. Quantifying acoustic doppler current profiler discharge uncertainty: A Monte Carlo based tool for moving-boat measurements

    USGS Publications Warehouse

    Mueller, David S.

    2017-01-01

    This paper presents a method using Monte Carlo simulations for assessing uncertainty of moving-boat acoustic Doppler current profiler (ADCP) discharge measurements using a software tool known as QUant, which was developed for this purpose. Analysis was performed on 10 data sets from four Water Survey of Canada gauging stations in order to evaluate the relative contribution of a range of error sources to the total estimated uncertainty. The factors that differed among data sets included the fraction of unmeasured discharge relative to the total discharge, flow nonuniformity, and operator decisions about instrument programming and measurement cross section. As anticipated, it was found that the estimated uncertainty is dominated by uncertainty of the discharge in the unmeasured areas, highlighting the importance of appropriate selection of the site, the instrument, and the user inputs required to estimate the unmeasured discharge. The main contributor to uncertainty was invalid data, but spatial inhomogeneity in water velocity and bottom-track velocity also contributed, as did variation in the edge velocity, uncertainty in the edge distances, edge coefficients, and the top and bottom extrapolation methods. To a lesser extent, spatial inhomogeneity in the bottom depth also contributed to the total uncertainty, as did uncertainty in the ADCP draft at shallow sites. The estimated uncertainties from QUant can be used to assess the adequacy of standard operating procedures. They also provide quantitative feedback to the ADCP operators about the quality of their measurements, indicating which parameters are contributing most to uncertainty, and perhaps even highlighting ways in which uncertainty can be reduced. Additionally, QUant can be used to account for self-dependent error sources such as heading errors, which are a function of heading. The results demonstrate the importance of a Monte Carlo method tool such as QUant for quantifying random and bias errors when evaluating the uncertainty of moving-boat ADCP measurements.
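
    A toy Monte Carlo in the spirit of the approach described above (not the QUant tool itself): total discharge is treated as the sum of measured and estimated portions, each perturbed by an assumed relative standard uncertainty.

```python
# Sketch: propagate assumed component uncertainties to total discharge by
# Monte Carlo, mimicking how the unmeasured (edge/top/bottom) zones dominate.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

q_measured = 85.0 * (1 + rng.normal(0, 0.02, n))   # m^3/s, 2% relative uncertainty
q_edges    =  5.0 * (1 + rng.normal(0, 0.15, n))   # edge estimates, 15%
q_top_bot  = 10.0 * (1 + rng.normal(0, 0.10, n))   # top/bottom extrapolation, 10%

q_total = q_measured + q_edges + q_top_bot
lo, hi = np.percentile(q_total, [2.5, 97.5])
print(f"Q = {q_total.mean():.1f} m^3/s, 95% interval = [{lo:.1f}, {hi:.1f}]")
```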

  13. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
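
    A toy decision-dependent model in the spirit of the abstract (an illustration, not the paper's multistage MILP): the probability that wind technology "matures" rises with the amount of wind capacity built, so the investment decision reshapes the probability distribution it is optimized against. All coefficients are invented.

```python
# Sketch: expected-cost minimization where the scenario probability depends on
# the decision variable itself (decision-dependent uncertainty).
def expected_cost(wind_mw, demand_mw=1000):
    invest = 0.3 * wind_mw                        # capital cost
    integration = 0.0004 * wind_mw**2             # grid-integration cost
    p_mature = min(0.1 + 0.001 * wind_mw, 0.9)    # decision-dependent probability
    thermal = 0.9 * (demand_mw - wind_mw)         # fuel cost for residual demand
    oandm_mature = 0.1 * wind_mw                  # wind O&M if technology matures
    oandm_stagnant = 0.6 * wind_mw                # wind O&M if it stagnates
    return (invest + integration + thermal
            + p_mature * oandm_mature + (1 - p_mature) * oandm_stagnant)

best = min(range(0, 1001, 50), key=expected_cost)
print(f"optimal wind build: {best} MW, expected cost {expected_cost(best):.0f}")
```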

  14. Flexibility evaluation of multiechelon supply chains.

    PubMed

    Almeida, João Flávio de Freitas; Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda

    2018-01-01

    Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution.

  15. Flexibility evaluation of multiechelon supply chains

    PubMed Central

    Conceição, Samuel Vieira; Pinto, Luiz Ricardo; de Camargo, Ricardo Saraiva; Júnior, Gilberto de Miranda

    2018-01-01

    Multiechelon supply chains are complex logistics systems that require flexibility and coordination at a tactical level to cope with environmental uncertainties in an efficient and effective manner. To cope with these challenges, mathematical programming models are developed to evaluate supply chain flexibility. However, under uncertainty, supply chain models become complex and the scope of flexibility analysis is generally reduced. This paper presents a unified approach that can evaluate the flexibility of a four-echelon supply chain via a robust stochastic programming model. The model simultaneously considers the plans of multiple business divisions such as marketing, logistics, manufacturing, and procurement, whose goals are often conflicting. A numerical example with deterministic parameters is presented to introduce the analysis, and then, the model stochastic parameters are considered to evaluate flexibility. The results of the analysis on supply, manufacturing, and distribution flexibility are presented. Tradeoff analysis of demand variability and service levels is also carried out. The proposed approach facilitates the adoption of different management styles, thus improving supply chain resilience. The model can be extended to contexts pertaining to supply chain disruptions; for example, the model can be used to explore operation strategies when subtle events disrupt supply, manufacturing, or distribution. PMID:29584755

  16. Direct heuristic dynamic programming for damping oscillations in a large power system.

    PubMed

    Lu, Chao; Si, Jennie; Xie, Xiaorong

    2008-08-01

    This paper applies a neural-network-based approximate dynamic programming method, namely, the direct heuristic dynamic programming (direct HDP), to a large power system stability control problem. The direct HDP is a learning- and approximation-based approach to addressing nonlinear coordinated control under uncertainty. One of the major design parameters, the controller learning objective function, is formulated to directly account for network-wide low-frequency oscillation in the presence of nonlinearity, uncertainty, and coupling effects among system components. Results include a novel learning control structure based on the direct HDP with applications to two power system problems. The first case involves static VAR compensator supplementary damping control, which is used to provide a comprehensive evaluation of the learning control performance. The second case addresses a difficult complex system challenge by providing a new solution to a large interconnected power network oscillation damping control problem that frequently occurs in the China Southern Power Grid.

  17. Probabilistic dual heuristic programming-based adaptive critic

    NASA Astrophysics Data System (ADS)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct from current approaches, the proposed probabilistic DHP AC method takes the uncertainties of the forward model and the inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  18. The use of social media in dental hygiene programs: a survey of program directors.

    PubMed

    Henry, Rachel K; Pieren, Jennifer A

    2014-08-01

    The use of social media and social networking sites has become increasingly common among the current generation of students. Colleges and universities are using social media and social networking sites to advertise, engage and recruit prospective students. The purpose of this study was to evaluate how social media is being used in dental hygiene program admissions and policy. Researchers developed a survey instrument investigating the use of social media. The survey included questions about demographic information, personal use of social media, program use of social media, social media use in admissions and social media policies. An email was sent to 321 dental hygiene program directors asking them to complete the survey. All participants were provided 4 weeks to complete the survey, and 2 reminder emails were sent. A total of 155 responses were received (48.3% response rate). While 84% of respondents indicated their program had a web page, only 20% had an official Facebook page for the program and 2% had a Twitter page. Thirty-five percent had a program policy specifically addressing the use of social media and 31% indicated that their university or institution had a policy. Only 4% of programs evaluate a potential student's Internet presence, mostly by searching on Facebook. Statistically significant differences (p≤0.05) were noted between those respondents with more personal social media accounts and those with fewer accounts, as those with more accounts were more likely to evaluate a potential student's Internet presence. Open-ended responses included concerns about social media issues, along with some uncertainty about how to handle social media in the program. The concern for social media and professionalism was evident, and more research and discussion in this area are warranted. Social media is currently being used in a variety of ways in dental hygiene programs, but not in the area of admissions. There is some uncertainty about the role social media should play in a professional environment. Copyright © 2014 The American Dental Hygienists’ Association.

  19. Evaluating land cover influences on model uncertainties—A case study of cropland carbon dynamics in the Mid-Continent Intensive Campaign region

    USGS Publications Warehouse

    Li, Zhengpeng; Liu, Shuguang; Zhang, Xuesong; West, Tristram O.; Ogle, Stephen M.; Zhou, Naijun

    2016-01-01

    Quantifying spatial and temporal patterns of carbon sources and sinks and their uncertainties across agriculture-dominated areas remains challenging for understanding regional carbon cycles. Characteristics of local land cover inputs could impact the regional carbon estimates but the effect has not been fully evaluated in the past. Within the North American Carbon Program Mid-Continent Intensive (MCI) Campaign, three models were developed to estimate carbon fluxes on croplands: an inventory-based model, the Environmental Policy Integrated Climate (EPIC) model, and the General Ensemble biogeochemical Modeling System (GEMS) model. They all provided estimates of three major carbon fluxes on cropland: net primary production (NPP), net ecosystem production (NEP), and soil organic carbon (SOC) change. Using data mining and spatial statistics, we studied the spatial distribution of the carbon fluxes uncertainties and the relationships between the uncertainties and the land cover characteristics. Results indicated that uncertainties for all three carbon fluxes were not randomly distributed, but instead formed multiple clusters within the MCI region. We investigated the impacts of three land cover characteristics on the fluxes uncertainties: cropland percentage, cropland richness and cropland diversity. The results indicated that cropland percentage significantly influenced the uncertainties of NPP and NEP, but not on the uncertainties of SOC change. Greater uncertainties of NPP and NEP were found in counties with small cropland percentage than the counties with large cropland percentage. Cropland species richness and diversity also showed negative correlations with the model uncertainties. Our study demonstrated that the land cover characteristics contributed to the uncertainties of regional carbon fluxes estimates. The approaches we used in this study can be applied to other ecosystem models to identify the areas with high uncertainties and where models can be improved to reduce overall uncertainties for regional carbon flux estimates.

  20. Object-oriented software for evaluating measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Hall, B. D.

    2013-05-01

    An earlier publication (Hall 2006 Metrologia 43 L56-61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language.
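
    A minimal sketch of the uncertain-number idea, assuming first-order (GUM-style) propagation and independent inputs; this is not Hall's library or its API, just an illustration of how arithmetic can carry uncertainty along with the value.

```python
# Sketch: values carry a standard uncertainty, propagated to first order
# under an independence assumption.
import math

class UncertainNumber:
    def __init__(self, value, u):
        self.value, self.u = value, u

    def __add__(self, other):
        # uncertainties of a sum add in quadrature
        return UncertainNumber(self.value + other.value,
                               math.hypot(self.u, other.u))

    def __mul__(self, other):
        # relative uncertainties of a product add in quadrature
        v = self.value * other.value
        u = abs(v) * math.hypot(self.u / self.value, other.u / other.value)
        return UncertainNumber(v, u)

    def __repr__(self):
        return f"{self.value:.4g} +/- {self.u:.2g}"

# Power dissipated in a resistor, P = V * I, from two uncertain measurements.
V = UncertainNumber(12.0, 0.05)   # volts
I = UncertainNumber(0.50, 0.01)   # amperes
print(V * I)                      # 6 +/- 0.12 W
```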

  1. TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unkelbach, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  2. RESTSIM: A Simulation Model That Highlights Decision Making under Conditions of Uncertainty.

    ERIC Educational Resources Information Center

    Zinkhan, George M.; Taylor, James R.

    1983-01-01

    Describes RESTSIM, an interactive computer simulation program for graduate and upper-level undergraduate management, marketing, and retailing courses, which introduces naive users to simulation as a decision support technique, and provides a vehicle for studying various statistical procedures for evaluating simulation output. (MBR)

  3. REGIONAL VULNERABILITY ASSESSMENT OF THE MID-ATLANTIC REGION: EVALUATION OF INTEGRATION METHODS AND ASSESSMENTS RESULTS

    EPA Science Inventory

    This report describes methods for quantitative regional assessment developed by the Regional Vulnerability Assessment (ReVA) program. The goal of ReVA is to develop regional-scale assessments of the magnitude, extent, distribution, and uncertainty of current and anticipated envir...

  4. Portfolio evaluation of health programs: a reply to Sendi et al.

    PubMed

    Bridges, John F P; Terris, Darcey D

    2004-05-01

    Sendi et al. (Soc. Sci. Med. 57 (2003) 2207) extend previous research on cost-effectiveness analysis to the evaluation of a portfolio of interventions with risky outcomes using a "second best" approach that can identify improvements in efficiency in the allocation of resources. This method, however, cannot be used to directly identify the optimal solution to the resource allocation problem. Theoretically, a stricter adherence to the foundations of portfolio theory would permit direct optimization in portfolio selection; however, when we include uncertainty in our analysis in addition to the traditional concept of risk (which is often mislabelled uncertainty), complexities are introduced that create significant hurdles in the development of practical applications of portfolio theory for health care policy decision making.

  5. Interim reliability-evaluation program: analysis of the Browns Ferry, Unit 1, nuclear plant. Appendix C - sequence quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1982-07-01

    This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.

  6. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
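
    A hedged sketch of the bootstrap the authors describe, with invented numbers: facility-level costs from a sample are resampled to attach a standard error and confidence interval to the scaled-up total program cost.

```python
# Sketch: bootstrap standard error for a survey-based total program cost,
# using a simple expansion estimator (sample mean times facility count).
import numpy as np

rng = np.random.default_rng(2)
sample_costs = rng.gamma(shape=2.0, scale=1500.0, size=40)  # 40 surveyed facilities
N_FACILITIES = 800                                          # facilities in the program

def total_cost(costs):
    return N_FACILITIES * costs.mean()

boots = np.array([total_cost(rng.choice(sample_costs, size=sample_costs.size))
                  for _ in range(5000)])
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"total = {total_cost(sample_costs):,.0f}, SE = {boots.std(ddof=1):,.0f}, "
      f"95% CI = [{lo:,.0f}, {hi:,.0f}]")
```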

  7. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964

  8. Space tug economic analysis study. Volume 2: Tug concepts analysis. Part 2: Economic analysis

    NASA Technical Reports Server (NTRS)

    1972-01-01

    An economic analysis of space tug operations is presented. The subjects discussed are: (1) cost uncertainties, (2) scenario analysis, (3) economic sensitivities, (4) mixed integer programming formulation of the space tug problem, and (5) critical parameters in the evaluation of a public expenditure.

  9. NGA West 2 | Pacific Earthquake Engineering Research Center

    Science.gov Websites

    A multi-year research program to improve Next Generation Attenuation (NGA) models for active tectonic regions and their use in earthquake engineering, including modeling of directivity and directionality; verification of NGA-West models; treatment of epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors.

  10. Risk Management for Weapon Systems Acquisition: A Decision Support System

    DTIC Science & Technology

    1985-02-28

    includes the program evaluation and review technique (PERT) for network analysis, the PMRM for quantifying risk, an optimization package for generating... Despite the inclusion of uncertainty in time, PERT can at best be considered a tool for quantifying risk with regard to the time element only. Moreover

  11. Environmental Scanning in Educational Planning: Establishing a Strategic Trend Information System.

    ERIC Educational Resources Information Center

    Morrison, James L.

    The systematic evaluation of the macroenvironment is sometimes referred to as a strategic trend information system. Strategic trend intelligence systems are highly developed, systematic intelligence programs that focus on trends and events in the external environment and provide institutions with knowledge to reduce areas of uncertainty and with…

  12. Transmission models and management of lymphatic filariasis elimination.

    PubMed

    Michael, Edwin; Gambhir, Manoj

    2010-01-01

    The planning and evaluation of parasitic control programmes are complicated by the many interacting population dynamic and programmatic factors that determine infection trends under different control options. A key need is quantification about the status of the parasite system state at any one given timepoint and the dynamic change brought upon that state as an intervention program proceeds. Here, we focus on the control and elimination of the vector-borne disease, lymphatic filariasis, to show how mathematical models of parasite transmission can provide a quantitative framework for aiding the design of parasite elimination and monitoring programs by their ability to support (1) conducting rational analysis and definition of endpoints for different programmatic aims or objectives, including transmission endpoints for disease elimination, (2) undertaking strategic analysis to aid the optimal design of intervention programs to meet set endpoints under different endemic settings and (3) providing support for performing informed evaluations of ongoing programs, including aiding the formation of timely adaptive management strategies to correct for any observed deficiencies in program effectiveness. The results also highlight how the use of a model-based framework will be critical to addressing the impacts of ecological complexities, heterogeneities and uncertainties on effective parasite management and thereby guiding the development of strategies to resolve and overcome such real-world complexities. In particular, we underscore how this approach can provide a link between ecological science and policy by revealing novel tools and measures to appraise and enhance the biological controllability or eradicability of parasitic diseases. We conclude by emphasizing an urgent need to develop and apply flexible adaptive management frameworks informed by mathematical models that are based on learning and reducing uncertainty using monitoring data, apply phased or sequential decision-making to address extant uncertainty and focus on developing ecologically resilient management strategies, in ongoing efforts to control or eliminate filariasis and other parasitic diseases in resource-poor communities.

  13. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  14. Uncertainty Considerations for Ballistic Limit Equations

    NASA Technical Reports Server (NTRS)

    Schonberg, W. P.; Evans, H. J.; Williamsen, J. E.; Boyer, R. L.; Nakayama, G. S.

    2005-01-01

    The overall risk for any spacecraft system is typically determined using a Probabilistic Risk Assessment (PRA). A PRA attempts to determine the overall risk associated with a particular mission by factoring in all known risks (and their corresponding uncertainties, if known) to the spacecraft during its mission. The threat to mission and human life posed by the micro-meteoroid & orbital debris (MMOD) environment is one of these risks. NASA uses the BUMPER II program to provide point estimate predictions of MMOD risk for the Space Shuttle and the International Space Station. However, BUMPER II does not provide uncertainty bounds or confidence intervals for its predictions. With so many uncertainties believed to be present in the models used within BUMPER II, providing uncertainty bounds with BUMPER II results would appear to be essential to properly evaluating its predictions of MMOD risk. The uncertainties in BUMPER II come primarily from three areas: damage prediction/ballistic limit equations, environment models, and failure criteria definitions. In order to quantify the overall uncertainty bounds on MMOD risk predictions, the uncertainties in these three areas must be identified. In this paper, possible approaches through which uncertainty bounds can be developed for the various damage prediction and ballistic limit equations encoded within the shuttle and station versions of BUMPER II are presented and discussed. We begin the paper with a review of the current approaches used by NASA to perform a PRA for the Space Shuttle and the International Space Station, followed by a review of the results of a recent sensitivity analysis performed by NASA using the shuttle version of the BUMPER II code. Following a discussion of the various equations that are encoded in BUMPER II, we propose several possible approaches for establishing uncertainty bounds for the equations within BUMPER II. We conclude with an evaluation of these approaches and present a recommendation regarding which of them would be the most appropriate to follow.

  15. Evaluation of Carrying Capacity: Measure 7.1A of the Northwest Power Planning Council's 1994 Fish and Wildlife Program: Report 1 of 4, Final Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neitzel, Duane A.; Johnson, Gary E.

    This report is one of four that the Pacific Northwest National Laboratory (PNNL) staff prepared to address Measure 7.1A in the Northwest Power Planning Council's (Council) Fish and Wildlife Program (Program) dated December 1994 (NPPC 1994). Measure 7.1A calls for the Bonneville Power Administration (BPA) to fund an evaluation of salmon survival, ecology, carrying capacity, and limiting factors in freshwater, estuarine, and marine habitats. Additionally, the Measure asks for development of a study plan based on critical uncertainties and research needs identified during the evaluation. This report deals with the evaluation of carrying capacity. It describes the analysis of different views of capacity as it relates to salmon survival and abundance. The report ends with conclusions and recommendations for studying carrying capacity.

  16. Impact of uncertainty on modeling and testing

    NASA Technical Reports Server (NTRS)

    Coleman, Hugh W.; Brown, Kendall K.

    1995-01-01

    A thorough understanding of the uncertainties associated with the modeling and testing of the Space Shuttle Main Engine (SSME) will greatly aid decisions concerning hardware performance and future development efforts. This report describes the determination of the uncertainties in the modeling and testing of the SSME test program at the Technology Test Bed (TTB) facility at Marshall Space Flight Center. Section 2 presents a summary of the uncertainty analysis methodology used and discusses the specific applications to the TTB SSME test program. Section 3 discusses the application of the uncertainty analysis to the test program and the results obtained. Section 4 presents the results of the analysis of the SSME modeling effort from an uncertainty analysis point of view. The appendices at the end of the report contain a significant amount of information relevant to the analysis, including discussions of venturi flowmeter data reduction and uncertainty propagation, bias uncertainty documentation, technical papers published, the computer code generated to determine the venturi uncertainties, and the venturi data and results used in the analysis.
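
    As a sketch of the standard combination used in such test-program uncertainty analyses, bias (systematic) and precision (random) components can be root-sum-squared into an overall uncertainty; the percentages below are illustrative, not TTB data.

```python
# Sketch: root-sum-square (RSS) combination of bias and precision components
# into an overall measurement uncertainty, all in percent of reading.
import math

def overall_uncertainty(bias_terms, precision_terms):
    b = math.sqrt(sum(x * x for x in bias_terms))       # combined bias limit
    p = math.sqrt(sum(x * x for x in precision_terms))  # combined precision index
    return math.hypot(b, p), b, p

u, b, p = overall_uncertainty(bias_terms=[0.5, 0.3], precision_terms=[0.4])
print(f"bias = {b:.2f}%, precision = {p:.2f}%, overall = {u:.2f}%")
```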

  17. A framework for improving the cost-effectiveness of DSM program evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonnenblick, R.; Eto, J.

    The prudence of utility demand-side management (DSM) investments hinges on their performance, yet evaluating performance is complicated because the energy saved by DSM programs can never be observed directly but only inferred. This study frames and begins to answer the following questions: (1) how well do current evaluation methods perform in improving confidence in the measurement of energy savings produced by DSM programs; (2) in view of this performance, how can limited evaluation resources be best allocated to maximize the value of the information they provide? The authors review three major classes of methods for estimating annual energy savings: tracking database (sometimes called engineering estimates), end-use metering, and billing analysis, and examine them in light of the uncertainties in current estimates of DSM program measure lifetimes. The authors assess the accuracy and precision of each method and construct trade-off curves to examine the costs of increases in accuracy or precision. Several approaches for improving evaluations for the purpose of assessing program cost effectiveness are demonstrated. The methods can be easily generalized to other evaluation objectives, such as shared savings incentive payments.

  18. Interval Predictor Models with a Formal Characterization of Uncertainty and Reliability

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Giesy, Daniel P.; Kenny, Sean P.

    2014-01-01

    This paper develops techniques for constructing empirical predictor models based on observations. By contrast to standard models, which yield a single predicted output at each value of the model's inputs, Interval Predictor Models (IPMs) yield an interval into which the unobserved output is predicted to fall. The IPMs proposed prescribe the output as an interval-valued function of the model's inputs, rendering a formal description of both the uncertainty in the model's parameters and the spread in the predicted output. Uncertainty is prescribed as a hyper-rectangular set in the space of the model's parameters. The propagation of this set through the empirical model yields a range of outputs of minimal spread containing all (or, depending on the formulation, most) of the observations. Optimization-based strategies for calculating IPMs and eliminating the effects of outliers are proposed. Outliers are identified by evaluating the extent to which they degrade the tightness of the prediction. This evaluation can be carried out while the IPM is calculated. When the data satisfy mild stochastic assumptions, and the optimization program used for calculating the IPM is convex (or its solution coincides with the solution to an auxiliary convex program), the model's reliability (that is, the probability that a future observation will fall within the predicted range of outputs) can be bounded rigorously by a non-asymptotic formula.
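
    A minimal IPM-style instance, assuming a fixed-width linear band that must contain every observation, posed as a linear program; the paper's formulations are richer (hyper-rectangular parameter sets, outlier trimming, reliability bounds), so treat this purely as a sketch.

```python
# Sketch: fit y in [a*x + b - h, a*x + b + h] with minimal half-width h,
# subject to every observation lying inside the band (a linear program).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.5 + rng.normal(0, 0.1, x.size)

# Variables z = (a, b, h); minimize h subject to
#   a*x_i + b - h <= y_i  and  y_i <= a*x_i + b + h  for all i.
ones = np.ones_like(x)
A_ub = np.vstack([np.column_stack([-x, -ones, -ones]),   # y_i <= a*x_i + b + h
                  np.column_stack([ x,  ones, -ones])])  # a*x_i + b - h <= y_i
b_ub = np.concatenate([-y, y])
res = linprog(c=[0.0, 0.0, 1.0], A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None), (None, None), (0, None)])
a_hat, b_hat, h_hat = res.x
print(f"band: {a_hat:.2f}*x + {b_hat:.2f} +/- {h_hat:.2f}")
```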

  19. Retrieval of surface temperature by remote sensing. [of earth surface using brightness temperature of air pollutants

    NASA Technical Reports Server (NTRS)

    Gupta, S. K.; Tiwari, S. N.

    1976-01-01

    A simple procedure and computer program were developed for retrieving the surface temperature from the measurement of upwelling infrared radiance in a single spectral region in the atmosphere. The program evaluates the total upwelling radiance at any altitude in the region of the CO fundamental band (2070-2220 1/cm) for several values of surface temperature. Actual surface temperature is inferred by interpolation of the measured upwelling radiance between the computed values of radiance for the same altitude. Sensitivity calculations were made to determine the effect of uncertainty in various surface, atmospheric and experimental parameters on the inferred value of surface temperature. It is found that the uncertainties in water vapor concentration and surface emittance are the most important factors affecting the accuracy of the inferred value of surface temperature.
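
    A sketch of the interpolation step described above: compute radiance for a grid of trial surface temperatures with a forward model, then invert the measured radiance by interpolation. The graybody radiance model below is a stand-in, not the CO-band radiative transfer program, and the measured value is hypothetical.

```python
# Sketch: retrieve surface temperature by interpolating a measured radiance
# against radiances computed for trial surface temperatures.
import numpy as np

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def model_radiance(T_surface, emissivity=0.98):
    """Stand-in monotone forward model (graybody), W m^-2 sr^-1."""
    return emissivity * SIGMA * T_surface**4 / np.pi

T_grid = np.arange(280.0, 325.0, 5.0)
L_grid = model_radiance(T_grid)            # increases with temperature

L_measured = 150.0                         # hypothetical measured radiance
T_retrieved = np.interp(L_measured, L_grid, T_grid)
print(f"retrieved surface temperature = {T_retrieved:.1f} K")
```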

  20. Cost comparisons for the use of nonterrestrial materials in space manufacturing of large structures

    NASA Technical Reports Server (NTRS)

    Bock, E. H.; Risley, R. C.

    1979-01-01

    This paper presents results of a study sponsored by NASA to evaluate the relative merits of constructing solar power satellites (SPS) using resources obtained from the earth and from the moon. Three representative lunar resources utilization (LRU) concepts are developed and compared with a previously defined earth baseline concept. Economic assessment of the alternatives includes cost determination, economic threshold sensitivity to manufacturing cost variations, cost uncertainties, program funding schedule, and present value of costs. Results indicate that LRU for space construction is competitive with the earth baseline approach for a program requiring 100,000 metric tons per year of completed satellites. LRU can reduce earth-launched cargo requirements to less than 10% of that needed to build satellites exclusively from earth materials. LRU is potentially more cost-effective than earth-derived material utilization, due to significant reductions in both transportation and manufacturing costs. Because of uncertainties, cost-effectiveness cannot be ascertained with great confidence. The probability of LRU attaining a lower total program cost within the 30-year program appears to range from 57 to 93%.

  1. Lunar Navigation Architecture Design Considerations

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher; Getchius, Joel; Holt, Greg; Moreau, Michael

    2009-01-01

    The NASA Constellation Program is aiming to establish a long-term presence on the lunar surface. The Constellation elements (Orion, Altair, Earth Departure Stage, and Ares launch vehicles) will require a lunar navigation architecture for navigation state updates during lunar-class missions. Orion in particular has baselined earth-based ground direct tracking as the primary source for much of its absolute navigation needs. However, due to the uncertainty in the lunar navigation architecture, the Orion program has had to make certain assumptions on the capabilities of such architectures in order to adequately scale the vehicle design trade space. The following paper outlines lunar navigation requirements, the Orion program assumptions, and the impacts of these assumptions on the lunar navigation architecture design. The selection of potential sites was based upon geometric baselines, logistical feasibility, redundancy, and abort support capability. Simulated navigation covariances mapped to entry interface flight-path-angle uncertainties were used to evaluate knowledge errors. A minimum ground station architecture was identified consisting of Goldstone, Madrid, Canberra, Santiago, Hartebeeshoek, Dongora, Hawaii, Guam, and Ascension Island (or the geometric equivalent).

  2. MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental de...

  3. MEETING IN TUCSON: MODEL EVALUATION SCIENCE TO MEET TODAY'S QUALITY ASSURANCE REQUIREMENTS FOR REGULATORY USE: ADDRESSING UNCERTAINTY, SENSITIVITY, AND PARAMETERIZATION

    EPA Science Inventory

    The EPA/ORD National Exposure Research Lab's (NERL) UA/SA/PE research program addresses both tactical and strategic needs in direct support of ORD's client base. The design represents an integrated approach in achieving the highest levels of quality assurance in environmental dec...

  4. TOWARD DEVELOPMENT OF A COMMON SOFTWARE APPLICATION PROGRAMMING INTERFACE (API) FOR UNCERTAINTY, SENSITIVITY, AND PARAMETER ESTIMATION METHODS AND TOOLS

    EPA Science Inventory

    The final session of the workshop considered the subject of software technology and how it might be better constructed to support those who develop, evaluate, and apply multimedia environmental models. Two invited presentations were featured along with an extended open discussio...

  5. Constant-Elasticity-of-Substitution Simulation

    NASA Technical Reports Server (NTRS)

    Reiter, G.

    1986-01-01

    Program simulates constant elasticity-of-substitution (CES) production function. CES function used by economic analysts to examine production costs as well as uncertainties in production. User provides such input parameters as price of labor, price of capital, and dispersion levels. CES minimizes expected cost to produce capital-uncertainty pair. By varying capital-value input, one obtains series of capital-uncertainty pairs. Capital-uncertainty pairs then used to generate several cost curves. CES program menu driven and features specific print menu for examining selected output curves. Program written in BASIC for interactive execution and implemented on IBM PC-series computer.
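
    A compact rendering of the CES production function the program simulates; the parameterization follows the standard textbook form, not the original BASIC code.

```python
# Sketch: CES production Q = A*(delta*K^-rho + (1-delta)*L^-rho)^(-1/rho),
# where rho = (1 - sigma)/sigma and sigma is the elasticity of substitution.
def ces_output(K, L, A=1.0, delta=0.4, sigma=0.8):
    if sigma == 1.0:
        return A * K**delta * L**(1 - delta)   # Cobb-Douglas limit
    rho = (1.0 - sigma) / sigma
    return A * (delta * K**(-rho) + (1 - delta) * L**(-rho)) ** (-1.0 / rho)

print(f"Q = {ces_output(K=100.0, L=50.0):.1f}")   # ~65.0
```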

  6. Development of a special-purpose test surface guided by uncertainty analysis - Introduction of a new uncertainty analysis step

    NASA Technical Reports Server (NTRS)

    Wang, T.; Simon, T. W.

    1988-01-01

    Development of a recent experimental program to investigate the effects of streamwise curvature on boundary layer transition required making a bendable, heated and instrumented test wall, a rather nonconventional surface. The present paper describes this surface, the design choices made in its development, and how uncertainty analysis was used, beginning early in the test program, to make such design choices. Published uncertainty analysis techniques were found to be of great value, but it became clear that another step, one herein called the pre-test analysis, would aid the program development. Finally, it is shown how the uncertainty analysis was used to determine whether the test surface was qualified for service.

  7. Achieving Robustness to Uncertainty for Financial Decision-making

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barnum, George M.; Van Buren, Kendra L.; Hemez, Francois M.

    2014-01-10

    This report investigates the concept of robustness analysis to support financial decision-making. Financial models that forecast future stock returns or market conditions depend on assumptions that might be unwarranted and variables that might exhibit large fluctuations from their last-known values. The analysis of robustness explores these sources of uncertainty and recommends model settings such that the forecasts used for decision-making are as insensitive as possible to the uncertainty. A proof-of-concept is presented with the Capital Asset Pricing Model. The robustness of model predictions is assessed using info-gap decision theory. Info-gaps are models of uncertainty that express the “distance,” or gap of information, between what is known and what needs to be known in order to support the decision. The analysis yields a description of worst-case stock returns as a function of increasing gaps in our knowledge. The analyst can then decide on the best course of action by trading off worst-case performance with “risk”, which is how much uncertainty they think needs to be accommodated in the future. The report also discusses the Graphical User Interface, developed using the MATLAB® programming environment, such that the user can control the analysis through an easy-to-navigate interface. Three directions of future work are identified to enhance the present software. First, the code should be re-written using the Python scientific programming software. This change will achieve greater cross-platform compatibility, better portability, allow for a more professional appearance, and render it independent from a commercial license, which MATLAB® requires. Second, a capability should be developed to allow users to quickly implement and analyze their own models. This will facilitate application of the software to the evaluation of proprietary financial models. The third enhancement proposed is to add the ability to evaluate multiple models simultaneously. When two models reflect past data with similar accuracy, the more robust of the two is preferable for decision-making because its predictions are, by definition, less sensitive to the uncertainty.
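
    A toy info-gap robustness calculation under the simplest uncertainty model (every nominal return may err by up to a horizon alpha); the portfolio and numbers are invented, and the report's CAPM-based analysis is not reproduced here.

```python
# Sketch: robustness = largest uncertainty horizon alpha at which the
# worst-case portfolio return still meets a required return.
import numpy as np

r_hat = np.array([0.07, 0.11])      # nominal forecast returns
w     = np.array([0.6, 0.4])        # portfolio weights (sum to 1)

def worst_case_return(alpha):
    return w @ (r_hat - alpha)      # every return may fall by up to alpha

def robustness(required_return):
    # largest alpha with worst_case_return(alpha) >= required_return
    return max(w @ r_hat - required_return, 0.0)

for rc in (0.04, 0.06, 0.08):
    print(f"required return {rc:.2f} -> robustness {robustness(rc):.3f}")
```

    Demanding more performance (a higher required return) leaves less robustness to uncertainty, which is the basic trade-off such an analysis exposes.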

  8. PTV margin determination in conformal SRT of intracranial lesions

    PubMed Central

    Parker, Brent C.; Shiu, Almon S.; Maor, Moshe H.; Lang, Frederick F.; Liu, H. Helen; White, R. Allen; Antolak, John A.

    2002-01-01

    The planning target volume (PTV) includes the clinical target volume (CTV) to be irradiated and a margin to account for uncertainties in the treatment process. Uncertainties in miniature multileaf collimator (mMLC) leaf positioning, CT scanner spatial localization, CT‐MRI image fusion spatial localization, and Gill‐Thomas‐Cosman (GTC) relocatable head frame repositioning were quantified for the purpose of determining a minimum PTV margin that still delivers a satisfactory CTV dose. The measured uncertainties were then incorporated into a simple Monte Carlo calculation for evaluation of various margin and fraction combinations. Satisfactory CTV dosimetric criteria were selected to be a minimum CTV dose of 95% of the PTV dose and at least 95% of the CTV receiving 100% of the PTV dose. The measured uncertainties were assumed to be Gaussian distributions. Systematic errors were added linearly and random errors were added in quadrature assuming no correlation to arrive at the total combined error. The Monte Carlo simulation written for this work examined the distribution of cumulative dose volume histograms for a large patient population using various margin and fraction combinations to determine the smallest margin required to meet the established criteria. The program examined 5 and 30 fraction treatments, since those are the only fractionation schemes currently used at our institution. The fractionation schemes were evaluated using no margin, a margin of just the systematic component of the total uncertainty, and a margin of the systematic component plus one standard deviation of the total uncertainty. It was concluded that (i) a margin of the systematic error plus one standard deviation of the total uncertainty is the smallest PTV margin necessary to achieve the established CTV dose criteria, and (ii) it is necessary to determine the uncertainties introduced by the specific equipment and procedures used at each institution since the uncertainties may vary among locations. PACS number(s): 87.53.Kn, 87.53.Ly PMID:12132939
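
    The error-combination rule quoted above (systematic components added linearly, random components in quadrature) is easy to make concrete. The sketch below uses invented elemental uncertainties, not the institution's measured values, and prints the three margin recipes the simulation examined.

      import math

      # Illustrative error-combination sketch; the four elemental values below
      # are invented stand-ins for the measured uncertainty sources named above.
      systematic = [0.3, 0.4, 0.5, 0.6]  # mm: mMLC, CT, CT-MRI fusion, GTC frame
      random_sd  = [0.2, 0.3, 0.4, 0.5]  # mm: one-SD random components

      Sigma = sum(systematic)                            # systematic: add linearly
      sigma = math.sqrt(sum(s ** 2 for s in random_sd))  # random: add in quadrature

      margins = {
          "no margin":                0.0,
          "systematic only":          Sigma,
          "systematic + 1 SD random": Sigma + sigma,
      }
      for name, m in margins.items():
          print(f"{name:>25s}: {m:.2f} mm")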

  9. Conservation of northern bobwhite on private lands in Georgia, USA under uncertainty about landscape-level habitat effects

    USGS Publications Warehouse

    Howell, J.E.; Moore, C.T.; Conroy, M.J.; Hamrick, R.G.; Cooper, R.J.; Thackston, R.E.; Carroll, J.P.

    2009-01-01

    Large-scale habitat enhancement programs for birds are becoming more widespread; however, most lack the monitoring needed to resolve uncertainties and enhance program impact over time. Georgia's Bobwhite Quail Initiative (BQI) is a competitive, proposal-based system that provides incentives to landowners to establish habitat for northern bobwhites (Colinus virginianus). Using data from monitoring conducted in the program's first years (1999–2001), we developed alternative hierarchical models to predict bobwhite abundance in response to program habitat modifications on local and regional scales. Effects of habitat and habitat management on bobwhite population response varied among geographical scales, but high measurement variability rendered the specific nature of these scaled effects equivocal. Under some models, BQI had positive impact at both local farm scales (1 and 9 km²), particularly when practice acres were clustered, whereas other credible models indicated that bird response did not depend on spatial arrangement of practices. Thus, uncertainty about landscape-level effects of management presents a challenge to program managers who must decide which proposals to accept. We demonstrate that optimal selection decisions can be made despite this uncertainty and that uncertainty can be reduced over time, with consequent improvement in management efficacy. However, such an adaptive approach to BQI program implementation would require the reestablishment of monitoring of bobwhite abundance, an effort for which funding was discontinued in 2002. For landscape-level conservation programs generally, our approach demonstrates the value in assessing multiple scales of impact of habitat modification programs, and it reveals the utility of addressing management uncertainty through multiple decision models and system monitoring.

  10. Probabilistic Methods for Uncertainty Propagation Applied to Aircraft Design

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Lin, Hong-Zong; Khalessi, Mohammad R.

    2002-01-01

    Three methods of probabilistic uncertainty propagation and quantification (the method of moments, Monte Carlo simulation, and a nongradient simulation search method) are applied to an aircraft analysis and conceptual design program to demonstrate design under uncertainty. The chosen example problems appear to have discontinuous design spaces and thus these examples pose difficulties for many popular methods of uncertainty propagation and quantification. However, specific implementation features of the first and third methods chosen for use in this study enable successful propagation of small uncertainties through the program. Input uncertainties in two configuration design variables are considered. Uncertainties in aircraft weight are computed. The effects of specifying required levels of constraint satisfaction with specified levels of input uncertainty are also demonstrated. The results show, as expected, that the designs under uncertainty are typically heavier and more conservative than those in which no input uncertainties exist.
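
    As a minimal illustration of the Monte Carlo option, the sketch below samples two uncertain configuration variables and propagates them through a toy stand-in for the sizing program, including a discontinuity of the kind the abstract warns about. The response function `sizing_code` and all numbers are hypothetical, chosen only to show the mechanics.

      import numpy as np

      rng = np.random.default_rng(0)

      def sizing_code(wing_area, aspect_ratio):
          """Toy gross-weight response with a discontinuous constraint switch
          (hypothetical stand-in for the aircraft design program)."""
          w = 20000.0 + 55.0 * wing_area + 300.0 * aspect_ratio
          if aspect_ratio > 9.0:        # e.g., a structural constraint activates
              w += 1500.0
          return w

      n = 50_000
      wing_area    = rng.normal(150.0, 2.0, n)   # input uncertainty, config var 1
      aspect_ratio = rng.normal(8.8, 0.15, n)    # input uncertainty, config var 2

      weight = np.array([sizing_code(s, ar)
                         for s, ar in zip(wing_area, aspect_ratio)])
      print(f"mean weight: {weight.mean():,.0f} lb")
      print(f"std  weight: {weight.std():,.0f} lb")
      print(f"95th pct   : {np.percentile(weight, 95):,.0f} lb")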

  11. Nuclear thermal rocket nozzle testing and evaluation program

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth O.; Kacynski, Kenneth J.

    1993-01-01

    Performance characteristics of the Nuclear Thermal Rocket can be enhanced through the use of unconventional nozzles as part of the propulsion system. The Nuclear Thermal Rocket nozzle testing and evaluation program being conducted at the NASA Lewis Research Center is outlined and the advantages of a plug nozzle are described. A facility description, experimental designs and schematics are given. Results of pretest performance analyses show that high nozzle performance can be attained despite substantial nozzle length reduction through the use of plug nozzles as compared to a convergent-divergent nozzle. Pretest measurement uncertainty analyses indicate that specific impulse values are expected to be within ±1.17 percent.

  12. ESTIMATION OF INTERNAL EXPOSURE TO URANIUM WITH UNCERTAINTY FROM URINALYSIS DATA USING THE InDEP COMPUTER CODE

    PubMed Central

    Anderson, Jeri L.; Apostoaei, A. Iulian; Thomas, Brian A.

    2015-01-01

    The National Institute for Occupational Safety and Health (NIOSH) is currently studying mortality in a cohort of 6409 workers at a former uranium processing facility. As part of this study, over 220 000 urine samples were used to reconstruct organ doses due to internal exposure to uranium. Most of the available computational programs designed for analysis of bioassay data handle a single case at a time, and thus require a significant outlay of time and resources for the exposure assessment of a large cohort. NIOSH is currently supporting the development of a computer program, InDEP (Internal Dose Evaluation Program), to facilitate internal radiation exposure assessment as part of epidemiological studies of both uranium- and plutonium-exposed cohorts. A novel feature of InDEP is its batch processing capability, which allows for the evaluation of multiple study subjects simultaneously. InDEP analyses bioassay data and derives intakes and organ doses with uncertainty estimates using least-squares regression techniques or Bayes' theorem as applied to internal dosimetry (the Bayesian method). This paper describes the application of the current version of InDEP to formulate assumptions about the characteristics of exposure at the study facility that were used in a detailed retrospective intake and organ dose assessment of the cohort. PMID:22683620

  13. Evaluation of Electric Power Procurement Strategies by Stochastic Dynamic Programming

    NASA Astrophysics Data System (ADS)

    Saisho, Yuichi; Hayashi, Taketo; Fujii, Yasumasa; Yamaji, Kenji

    In deregulated electricity markets, the role of a distribution company is to purchase electricity from the wholesale electricity market at randomly fluctuating prices and to provide it to its customers at a given fixed price. The company, rather than its customers, therefore bears the risk stemming from uncertain electricity prices and/or demand fluctuations. This risk can be hedged by making bilateral contracts with generating companies or by installing the company's own power generation facility, which creates the need for a method of constructing an optimal electric power procurement strategy. In this context, this research proposes a mathematical method, based on stochastic dynamic programming and additionally accounting for the start-up cost of the power generation facility, to evaluate strategies that combine bilateral contracts and auto-generation from the company's own facility for procuring electric power in a deregulated electricity market. We first propose two approaches to solving the stochastic dynamic program: a Monte Carlo simulation method, and a finite difference method that solves a partial differential equation for the total procurement cost of electric power. Finally, we discuss the influence of price uncertainty on optimal procurement strategies.
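
    A stripped-down version of such a stochastic dynamic program can be written as a backward induction. The sketch below is a toy simplification with invented prices and costs, not the authors' formulation; it keeps only the feature that motivates the approach, namely the start-up-cost dependence on the unit's on/off state.

      import numpy as np

      # Each hour, after observing the spot price, the distributor serves 1 MWh
      # either from the spot market or from its own unit; running the unit
      # costs 'fuel', plus 'startup' if the unit was off the previous hour.

      prices  = [20.0, 60.0]   # equally likely spot-price outcomes ($/MWh)
      prob    = [0.5, 0.5]
      fuel    = 35.0           # own-generation running cost ($/MWh)
      startup = 30.0           # start-up cost ($) incurred from the off state
      T       = 24             # planning horizon (hours)

      # V[t][s]: expected cost-to-go from hour t with unit state s (0=off, 1=on).
      V = np.zeros((T + 1, 2))
      for t in range(T - 1, -1, -1):
          for s in (0, 1):
              exp_cost = 0.0
              for p, pr in zip(prices, prob):
                  buy = p + V[t + 1][0]   # buy spot; unit sits off next hour
                  run = fuel + (startup if s == 0 else 0.0) + V[t + 1][1]
                  exp_cost += pr * min(buy, run)
              V[t][s] = exp_cost
      print(f"expected 24 h cost, unit initially off: ${V[0][0]:.2f}")
      print(f"expected 24 h cost, unit initially on : ${V[0][1]:.2f}")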

  14. Measuring Research Data Uncertainty in the 2010 NRC Assessment of Geography Graduate Education

    ERIC Educational Resources Information Center

    Shortridge, Ashton; Goldsberry, Kirk; Weessies, Kathleen

    2011-01-01

    This article characterizes and measures errors in the 2010 National Research Council (NRC) assessment of research-doctorate programs in geography. This article provides a conceptual model for data-based sources of uncertainty and reports on a quantitative assessment of NRC research data uncertainty for a particular geography doctoral program.…

  15. Approaches to highly parameterized inversion: A guide to using PEST for model-parameter and predictive-uncertainty analysis

    USGS Publications Warehouse

    Doherty, John E.; Hunt, Randall J.; Tonkin, Matthew J.

    2010-01-01

    Analysis of the uncertainty associated with parameters used by a numerical model, and with predictions that depend on those parameters, is fundamental to the use of modeling in support of decision-making. Unfortunately, predictive uncertainty analysis with regard to models can be very computationally demanding, due in part to complex constraints on parameters that arise from expert knowledge of system properties on the one hand (knowledge constraints) and from the necessity for the model parameters to assume values that allow the model to reproduce historical system behavior on the other hand (calibration constraints). Enforcement of knowledge and calibration constraints on parameters used by a model does not eliminate the uncertainty in those parameters. In fact, in many cases, enforcement of calibration constraints simply reduces the uncertainties associated with a number of broad-scale combinations of model parameters that collectively describe spatially averaged system properties. The uncertainties associated with other combinations of parameters, especially those that pertain to small-scale parameter heterogeneity, may not be reduced through the calibration process. To the extent that a prediction depends on system-property detail, its postcalibration variability may be reduced very little, if at all, by applying calibration constraints; knowledge constraints remain the only limits on the variability of predictions that depend on such detail. Regrettably, in many common modeling applications, these constraints are weak. Though the PEST software suite was initially developed as a tool for model calibration, recent developments have focused on the evaluation of model-parameter and predictive uncertainty. As a complement to functionality that it provides for highly parameterized inversion (calibration) by means of formal mathematical regularization techniques, the PEST suite provides utilities for linear and nonlinear error-variance and uncertainty analysis in these highly parameterized modeling contexts. Availability of these utilities is particularly important because, in many cases, a significant proportion of the uncertainty associated with model parameters-and the predictions that depend on them-arises from differences between the complex properties of the real world and the simplified representation of those properties that is expressed by the calibrated model. This report is intended to guide intermediate to advanced modelers in the use of capabilities available with the PEST suite of programs for evaluating model predictive error and uncertainty. A brief theoretical background is presented on sources of parameter and predictive uncertainty and on the means for evaluating this uncertainty. Applications of PEST tools are then discussed for overdetermined and underdetermined problems, both linear and nonlinear. PEST tools for calculating contributions to model predictive uncertainty, as well as optimization of data acquisition for reducing parameter and predictive uncertainty, are presented. The appendixes list the relevant PEST variables, files, and utilities required for the analyses described in the document.

  16. Determination of uncertainties associated to the in vivo measurement of iodine-131 in the thyroid.

    PubMed

    Dantas, B M; Lima, F F; Dantas, A L; Lucena, E A; Gontijo, R M G; Carvalho, C B; Hazin, C

    2016-07-01

    Intakes of radionuclides can be estimated through in vivo measurements, and the uncertainties associated with the measured activities should be clearly stated in monitoring program reports. This study aims to evaluate the uncertainties of in vivo monitoring of iodine-131 in the thyroid. The reference values for high-energy photons are based on the IDEAS Guide. Measurements were performed at the In Vivo Monitoring Laboratory of the Institute of Radiation Protection and Dosimetry (IRD) and at the Internal Dosimetry Laboratory of the Regional Center of Nuclear Sciences (CRCN-NE). In both institutions, the experiment was performed using a 3″×3″ NaI(Tl) scintillation detector and a neck-thyroid phantom. Scattering factors were calculated and compared in different counting geometries. The results show that the technique achieves reproducibility equivalent to the values suggested in the IDEAS Guide, and that the measurement uncertainties are comparable to international quality standards for this type of in vivo monitoring. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. To what extent can ecosystem services motivate protecting biodiversity?

    PubMed

    Dee, Laura E; De Lara, Michel; Costello, Christopher; Gaines, Steven D

    2017-08-01

    Society increasingly focuses on managing nature for the services it provides people rather than for the existence of particular species. How much biodiversity protection would result from this modified focus? Although biodiversity contributes to ecosystem services, the details of which species are critical, and whether they will go functionally extinct in the future, are fraught with uncertainty. Explicitly considering this uncertainty, we develop an analytical framework to determine how much biodiversity protection would arise solely from optimising net value from an ecosystem service. Using stochastic dynamic programming, we find that protecting a threshold number of species is optimal, and uncertainty surrounding how biodiversity produces services makes it optimal to protect more species than are presumed critical. We define conditions under which the economically optimal protection strategy is to protect all species, no species, and cases in between. We show how the optimal number of species to protect depends upon different relationships between species and services, including considering multiple services. Our analysis provides simple criteria to evaluate when managing for particular ecosystem services could warrant protecting all species, given uncertainty. Evaluating this criterion with empirical estimates from different ecosystems suggests that optimising some services will be more likely to protect most species than others. © 2017 John Wiley & Sons Ltd/CNRS.

  18. Strategic Technology Investment Analysis: An Integrated System Approach

    NASA Technical Reports Server (NTRS)

    Adumitroaie, V.; Weisbin, C. R.

    2010-01-01

    Complex technology investment decisions within NASA are increasingly difficult to make such that the end results satisfy the technical objectives and all the organizational constraints. Due to a restricted science budget environment and numerous required technology developments, the investment decisions need to take into account not only the functional impact on the program goals, but also development uncertainties and cost variations, along with maintaining a healthy workforce. This paper describes an approach for optimizing and qualifying technology investment portfolios from the perspective of an integrated system model. The methodology encompasses multi-attribute decision theory elements and sensitivity analysis. The evaluation of the degree of robustness of the recommended portfolio provides the decision-maker with an array of viable selection alternatives, which take into account input uncertainties and possibly satisfy nontechnical constraints. The methodology is presented in the context of assessing capability development portfolios for NASA technology programs.

  19. Evaluation of Uncertainty in Precipitation Datasets for New Mexico, USA

    NASA Astrophysics Data System (ADS)

    Besha, A. A.; Steele, C. M.; Fernald, A.

    2014-12-01

    Climate change, population growth and other factors are endangering water availability and sustainability in semiarid/arid areas, particularly in the southwestern United States. Precipitation measurements with wide spatial and temporal coverage are key for regional water budget analysis and hydrological operations, which themselves are valuable tools for water resource planning and management. Rain gauge measurements are usually reliable and accurate at a point. They measure rainfall continuously, but spatial sampling is limited. Ground-based radar and satellite remotely sensed precipitation have wide spatial and temporal coverage. However, these measurements are indirect and subject to errors because of equipment, meteorological variability, the heterogeneity of the land surface itself and lack of regular recording. This study seeks to understand precipitation uncertainty and, in doing so, lessen uncertainty propagation into hydrological applications and operations. We reviewed, compared and evaluated the TRMM (Tropical Rainfall Measuring Mission) precipitation products, NOAA's (National Oceanic and Atmospheric Administration) Global Precipitation Climatology Centre (GPCC) monthly precipitation dataset, PRISM (Parameter elevation Regression on Independent Slopes Model) data and data from individual climate stations including Cooperative Observer Program (COOP), Remote Automated Weather Stations (RAWS), Soil Climate Analysis Network (SCAN) and Snowpack Telemetry (SNOTEL) stations. Though not yet finalized, this study finds that the uncertainty within precipitation datasets is influenced by regional topography, season, climate and precipitation rate. Ongoing work aims to further evaluate precipitation datasets based on the relative influence of these phenomena so that we can identify the optimum datasets for input to statewide water budget analysis.

  20. FIELD ANALYTICAL METHODS: ADVANCED FIELD MONITORING METHODS DEVELOPMENT AND EVALUATION OF NEW AND INNOVATIVE TECHNOLOGIES THAT SUPPORT THE SITE CHARACTERIZATION AND MONITORING REQUIREMENTS OF THE SUPERFUND PROGRAM.

    EPA Science Inventory

    The overall goal of this task is to help reduce the uncertainties in the assessment of environmental health and human exposure by better characterizing hazardous wastes through cost-effective analytical methods. Research projects are directed towards the applied development and ...

  1. Is my bottom-up uncertainty estimation on metal measurement adequate?

    NASA Astrophysics Data System (ADS)

    Marques, J. R.; Faustino, M. G.; Monteiro, L. R.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Is the uncertainty estimated under the GUM recommendations and associated with metal measurement adequate? How can one evaluate whether the measurement uncertainty really covers all the uncertainty associated with the analytical procedure? Considering that many laboratories frequently underestimate, or less frequently overestimate, the uncertainties of their results, this paper presents the evaluation of the uncertainties estimated according to the GUM approach for seven metal measurements in two ICP-OES procedures. The Horwitz function and proficiency-test scaled standard uncertainties were used in this evaluation. Our data show that the expanded uncertainties of most elements were underestimated by a factor of two to four. Possible causes and corrections are discussed herein.
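
    The Horwitz screening used above is straightforward to reproduce: the Horwitz function predicts a reproducibility relative standard deviation RSD_R(%) = 2^(1 - 0.5·log10 C), with C the mass fraction. The sketch below compares invented bottom-up expanded uncertainties against that prediction; the concentrations and laboratory values are illustrative, not the paper's data.

      import math

      def horwitz_rsd_percent(mass_fraction):
          """Horwitz predicted reproducibility, RSD_R in %.
          Example: 1 mg/kg -> mass_fraction = 1e-6 -> 16 %."""
          return 2.0 ** (1.0 - 0.5 * math.log10(mass_fraction))

      # element: (mass fraction, laboratory expanded uncertainty U in %, k = 2);
      # all values invented for illustration.
      budget = {"Cd": (5e-8, 4.0), "Pb": (2e-6, 6.0), "Zn": (1e-4, 5.0)}

      for el, (c, U_pct) in budget.items():
          u_lab = U_pct / 2.0              # standard uncertainty from U (k = 2)
          u_hor = horwitz_rsd_percent(c)
          # heuristic flag: a bottom-up value far below Horwitz suggests
          # the budget may be missing contributions
          flag = "possibly underestimated" if u_lab < 0.5 * u_hor else "plausible"
          print(f"{el}: u_lab = {u_lab:.1f} %, Horwitz = {u_hor:.1f} % -> {flag}")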

  2. The Value of Information in Decision-Analytic Modeling for Malaria Vector Control in East Africa.

    PubMed

    Kim, Dohyeong; Brown, Zachary; Anderson, Richard; Mutero, Clifford; Miranda, Marie Lynn; Wiener, Jonathan; Kramer, Randall

    2017-02-01

    Decision analysis tools and mathematical modeling are increasingly emphasized in malaria control programs worldwide to improve resource allocation and address ongoing challenges with sustainability. However, such tools require substantial scientific evidence, which is costly to acquire. The value of information (VOI) has been proposed as a metric for gauging the value of reduced model uncertainty. We apply this concept to an evidence-based Malaria Decision Analysis Support Tool (MDAST) designed for application in East Africa. In developing MDAST, substantial gaps in the scientific evidence base were identified regarding insecticide resistance in malaria vector control and the effectiveness of alternative mosquito control approaches, including larviciding. We identify four entomological parameters in the model (two for insecticide resistance and two for larviciding) that involve high levels of uncertainty and to which outputs in MDAST are sensitive. We estimate and compare a VOI for combinations of these parameters in evaluating three policy alternatives relative to a status quo policy. We find that having perfect information on the uncertain parameters could improve program net benefits by 5–21%, with the highest VOI associated with jointly eliminating uncertainty about reproductive speed of malaria-transmitting mosquitoes and initial efficacy of larviciding at reducing the emergence of new adult mosquitoes. Future research on parameter uncertainty in decision analysis of malaria control policy should investigate the VOI with respect to other aspects of malaria transmission (such as antimalarial resistance), the costs of reducing uncertainty in these parameters, and the extent to which imperfect information about these parameters can improve payoffs. © 2016 Society for Risk Analysis.
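
    For perfect information, the VOI metric reduces to the standard expected-value-of-perfect-information calculation: the expected payoff when the decision can wait until the uncertain parameters are known, minus the best expected payoff of a single up-front decision. A minimal sketch with invented payoffs (these are not MDAST outputs):

      import numpy as np

      # Rows: policy alternatives; columns: equally likely parameter scenarios.
      # All payoffs are illustrative placeholders.
      net_benefit = np.array([
          [100.0, 40.0, 80.0],   # policy A under theta_1..theta_3
          [ 60.0, 90.0, 70.0],   # policy B
          [ 70.0, 65.0, 95.0],   # policy C
      ])
      p = np.array([1/3, 1/3, 1/3])

      best_now  = (net_benefit @ p).max()          # choose once, ex ante
      best_info = net_benefit.max(axis=0) @ p      # choose per scenario
      print(f"expected net benefit, no information : {best_now:.1f}")
      print(f"expected net benefit, perfect info   : {best_info:.1f}")
      print(f"expected value of perfect information: {best_info - best_now:.1f}")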

  3. Dose evaluation of an NIPAM polymer gel dosimeter using gamma index

    NASA Astrophysics Data System (ADS)

    Chang, Yuan-Jen; Lin, Jing-Quan; Hsieh, Bor-Tsung; Yao, Chun-Hsu; Chen, Chin-Hsing

    2014-11-01

    An N-isopropylacrylamide (NIPAM) polymer gel dosimeter has great potential in clinical applications. However, its three-dimensional dose distribution must be assessed. In this work, a quantitative evaluation of dose distributions was performed to evaluate the NIPAM polymer gel dosimeter using gamma analysis. A cylindrical acrylic phantom filled with NIPAM gel, measuring 10 cm (diameter) by 10 cm (height) by 3 mm (thickness), was irradiated by a 4×4 cm² square light field. The irradiated gel phantom was scanned using an optical computed tomography (optical CT) scanner (OCTOPUS™, MGS Research, Inc., Madison, CT, USA) at 1 mm resolution. The projection data were transferred to an image reconstruction program written in MATLAB (The MathWorks, Natick, MA, USA). The program reconstructed the image of the optical density distribution using a filtered back-projection algorithm. Three batches of replicated gel phantoms were independently measured. The average uncertainty of the measurements was less than 1%. The gel was found to have a high degree of spatial uniformity throughout the dosimeter and good temporal stability. A comparison of the line profiles of the treatment planning system and of the data measured by optical CT showed that the dose was overestimated in the penumbra region because of two factors. The first is light scattering due to changes in the refractive index at the edge of the irradiated field. The second is the edge enhancement caused by free-radical diffusion. However, the effect of edge enhancement on the NIPAM gel dosimeter is not as significant as that on the BANG gel dosimeter. Moreover, the dose uncertainty is affected by inaccuracy in the gel container positioning process. To reduce the uncertainty of the 3D dose distribution, improvements to the gel container holder must be developed.
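
    For reference, gamma analysis combines a distance-to-agreement (DTA) criterion with a dose-difference criterion, and a point passes when gamma ≤ 1. The one-dimensional sketch below uses illustrative 3 mm / 3 % criteria and toy profiles, not the study's measured data; clinical gamma analysis is normally done in 2-D or 3-D.

      import numpy as np

      def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta_mm=3.0, dd_frac=0.03):
          """gamma at each evaluated point: min over reference points of
          sqrt((dx/DTA)^2 + (dD/DD)^2); doses normalized to a max of 1."""
          g = []
          for xe, de in zip(x_eval, d_eval):
              cap = np.sqrt(((x_ref - xe) / dta_mm) ** 2
                            + ((d_ref - de) / dd_frac) ** 2)
              g.append(cap.min())
          return np.array(g)

      x = np.linspace(-30.0, 30.0, 121)                            # mm
      planned  = 1.0 / (1.0 + np.exp((np.abs(x) - 20.0) / 2.0))    # toy 4 cm field
      measured = 1.0 / (1.0 + np.exp((np.abs(x - 0.5) - 20.0) / 2.2))  # shifted

      g = gamma_1d(x, planned, x, measured)
      print(f"gamma pass rate (gamma <= 1): {100 * (g <= 1).mean():.1f} %")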

  4. An Approach to Experimental Design for the Computer Analysis of Complex Phenomenon

    NASA Technical Reports Server (NTRS)

    Rutherford, Brian

    2000-01-01

    The ability to make credible system assessments, predictions and design decisions related to engineered systems and other complex phenomena is key to a successful program for many large-scale investigations in government and industry. Recently, many of these large-scale analyses have turned to computational simulation to provide much of the required information. Addressing specific goals in the computer analysis of these complex phenomena is often accomplished through the use of performance measures that are based on system response models. The response models are constructed using computer-generated responses together with physical test results where possible. They are often based on probabilistically defined inputs and generally require estimation of a set of response modeling parameters. As a consequence, the performance measures are themselves distributed quantities reflecting these variabilities and uncertainties. Uncertainty in the values of the performance measures leads to uncertainties in predicted performance and can cloud the decisions required of the analysis. A specific goal of this research has been to develop methodology that will reduce this uncertainty in an analysis environment where limited resources and system complexity together restrict the number of simulations that can be performed. An approach has been developed that is based on evaluation of the potential information provided for each "intelligently selected" candidate set of computer runs. Each candidate is evaluated by partitioning the performance measure uncertainty into two components: one component that could be explained through the additional computational simulation runs, and a second that would remain uncertain. The portion explained is estimated using a probabilistic evaluation of likely results for the additional computational analyses based on what is currently known about the system. The set of runs indicating the largest potential reduction in uncertainty is then selected and the computational simulations are performed. Examples are provided to demonstrate this approach on small-scale problems. These examples give encouraging results. Directions for further research are indicated.

  5. Uncertainty Quantification in Climate Modeling and Projection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for assessing reliability and uncertainties of climate change information. An alternative approach is to generate similar ensembles by perturbing parameters within a single-model framework. One of the workshop's objectives was to give participants a deeper understanding of these approaches within a Bayesian statistical framework. However, there remain significant challenges still to be resolved before UQ can be applied in a convincing way to climate models and their projections.

  6. Evaluating measurement uncertainty in fluid phase equilibrium calculations

    NASA Astrophysics Data System (ADS)

    van der Veen, Adriaan M. H.

    2018-04-01

    The evaluation of measurement uncertainty in accordance with the ‘Guide to the expression of uncertainty in measurement’ (GUM) has not yet become widespread in physical chemistry. With only the law of the propagation of uncertainty from the GUM, many of these uncertainty evaluations would be cumbersome, as models are often non-linear and require iterative calculations. The methods from GUM supplements 1 and 2 enable the propagation of uncertainties under most circumstances. Experimental data in physical chemistry are used, for example, to derive reference property data and support trade—all applications where measurement uncertainty plays an important role. This paper aims to outline how the methods for evaluating and propagating uncertainty can be applied to some specific cases with a wide impact: deriving reference data from vapour pressure data, a flash calculation, and the use of an equation-of-state to predict the properties of both phases in a vapour-liquid equilibrium. The three uncertainty evaluations demonstrate that the methods of GUM and its supplements are a versatile toolbox that enable us to evaluate the measurement uncertainty of physical chemical measurements, including the derivation of reference data, such as the equilibrium thermodynamical properties of fluids.
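
    The contrast between the law of propagation of uncertainty and the Supplement 1 Monte Carlo method is easiest to see on a small non-linear model. The sketch below uses an illustrative quotient model, not one of the paper's three cases; the MCM coverage interval comes out asymmetric where the LPU interval, by construction, cannot.

      import numpy as np

      rng = np.random.default_rng(42)
      x1, u1 = 10.0, 1.5            # illustrative input estimates and
      x2, u2 = 2.0, 0.4             # standard uncertainties; Y = X1 / X2

      # LPU: u(y)^2 = (dY/dX1 * u1)^2 + (dY/dX2 * u2)^2 for independent inputs.
      y = x1 / x2
      u_lpu = np.sqrt((u1 / x2) ** 2 + (x1 * u2 / x2 ** 2) ** 2)

      # MCM (GUM Supplement 1): propagate full distributions, read off a
      # 95 % coverage interval from the output sample.
      s = rng.normal(x1, u1, 1_000_000) / rng.normal(x2, u2, 1_000_000)
      lo, hi = np.percentile(s, [2.5, 97.5])

      print(f"LPU: y = {y:.2f}, u(y) = {u_lpu:.2f}, "
            f"y +/- 2u = [{y - 2 * u_lpu:.2f}, {y + 2 * u_lpu:.2f}]")
      print(f"MCM: median = {np.median(s):.2f}, "
            f"95 % interval = [{lo:.2f}, {hi:.2f}] (asymmetric)")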

  7. DECIDE: a software for computer-assisted evaluation of diagnostic test performance.

    PubMed

    Chiecchio, A; Bo, A; Manzone, P; Giglioli, F

    1993-05-01

    The evaluation of the performance of clinical tests is a complex problem involving different steps and many statistical tools, not always structured in an organic and rational system. This paper presents software that provides an organized system of statistical tools to help evaluate clinical test performance. The program allows (a) the building and organization of a working database, (b) the selection of the minimal set of tests with the maximum information content, (c) the search for the model best fitting the distribution of the test values, (d) the selection of the optimal diagnostic cut-off value of the test for every positive/negative situation, and (e) the evaluation of the performance of combinations of correlated and uncorrelated tests. The uncertainty associated with all the variables involved is evaluated. The program runs in an MS-DOS environment with an EGA or better graphics card.

  8. Evolution and outcomes of a quality improvement program.

    PubMed

    Thor, Johan; Herrlin, Bo; Wittlöv, Karin; Øvretveit, John; Brommels, Mats

    2010-01-01

    The purpose of this paper is to examine the outcomes and evolution over a five-year period of a Swedish university hospital quality improvement program in light of enduring uncertainty regarding the effectiveness of such programs in healthcare and how best to evaluate it. The paper takes the form of a case study, using data collected as part of the program, including quality indicators from clinical improvement projects and participants' program evaluations. Overall, 58 percent of the program's projects (39/67) demonstrated success. A greater proportion of projects led by female doctors demonstrated success (91 percent, n=11) than projects led by male doctors (51 percent, n=55). Facilitators at the hospital continuously adapted the improvement methods to the local context. A lack of dedicated time for improvement efforts was the participants' biggest difficulty. The dominant benefits included an increased ability to see the "bigger picture" and the improvements achieved for patients and employees. Quality measurement, which is important for conducting and evaluating improvement efforts, was weak with limited reliability. Nevertheless, the present study adds evidence about the effectiveness of healthcare improvement programs. Gender differences in improvement team leadership merit further study. Improvement program evaluation should assess the extent to which improvement methods are locally adapted and applied. This case study reports the outcomes of all improvement projects undertaken in one healthcare organization over a five-year period and provides in-depth insight into an improvement program's changeable nature.

  9. Real-Time Optimal Flood Control Decision Making and Risk Propagation Under Multiple Uncertainties

    NASA Astrophysics Data System (ADS)

    Zhu, Feilin; Zhong, Ping-An; Sun, Yimeng; Yeh, William W.-G.

    2017-12-01

    Multiple uncertainties exist in the optimal flood control decision-making process, presenting risks involving flood control decisions. This paper defines the main steps in optimal flood control decision making that constitute the Forecast-Optimization-Decision Making (FODM) chain. We propose a framework for supporting optimal flood control decision making under multiple uncertainties and evaluate risk propagation along the FODM chain from a holistic perspective. To deal with uncertainties, we employ stochastic models at each link of the FODM chain. We generate synthetic ensemble flood forecasts via the martingale model of forecast evolution. We then establish a multiobjective stochastic programming with recourse model for optimal flood control operation. The Pareto front under uncertainty is derived via the constraint method coupled with a two-step process. We propose a novel SMAA-TOPSIS model for stochastic multicriteria decision making. We then propose a risk assessment model, with the risk of decision-making errors and a rank uncertainty degree, to quantify the risk propagation process along the FODM chain. We conduct numerical experiments to investigate the effects of flood forecast uncertainty on optimal flood control decision making and risk propagation. We apply the proposed methodology to a flood control system in the Daduhe River basin in China. The results indicate that the proposed method can provide valuable risk information in each link of the FODM chain and enable risk-informed decisions with higher reliability.

  10. Nuclear thermal rocket nozzle testing and evaluation program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidian, K.O.; Kacynski, K.J.

    Performance characteristics of the Nuclear Thermal Rocket can be enhanced through the use of unconventional nozzles as part of the propulsion system. In this report, the Nuclear Thermal Rocket nozzle testing and evaluation program being conducted at the NASA Lewis Research Center is outlined and the advantages of a plug nozzle are described. A facility description, experimental designs and schematics are given. Results of pretest performance analyses show that high nozzle performance can be attained despite substantial nozzle length reduction through the use of plug nozzles as compared to a convergent-divergent nozzle. Pretest measurement uncertainty analyses indicate that specific impulse values are expected to be within ±1.17%.

  11. Computer-assisted uncertainty assessment of k0-NAA measurement results

    NASA Astrophysics Data System (ADS)

    Bučar, T.; Smodiš, B.

    2008-10-01

    In quantifying the measurement uncertainty of results obtained by the k0-based neutron activation analysis (k0-NAA), a number of parameters should be considered and appropriately combined in deriving the final budget. To facilitate this process, a program ERON (ERror propagatiON) was developed, which computes uncertainty propagation factors from the relevant formulae and calculates the combined uncertainty. The program calculates the uncertainty of the final result—mass fraction of an element in the measured sample—taking into account the relevant neutron flux parameters such as α and f, including their uncertainties. Nuclear parameters and their uncertainties are taken from the IUPAC database (V.P. Kolotov and F. De Corte, Compilation of k0 and related data for NAA). Furthermore, the program allows for uncertainty calculations of the measured parameters needed in k0-NAA: α (determined with either the Cd-ratio or the Cd-covered multi-monitor method), f (using the Cd-ratio or the bare method), Q0 (using the Cd-ratio or internal comparator method) and k0 (using the Cd-ratio, internal comparator or the Cd subtraction method). The results of calculations can be printed or exported to text or MS Excel format for further analysis. Special care was taken to make the calculation engine portable by allowing its incorporation into other applications (e.g., as a DLL or a WWW server). Theoretical basis and the program are described in detail, and typical results obtained under real measurement conditions are presented.

  12. Linear Programming Problems for Generalized Uncertainty

    ERIC Educational Resources Information Center

    Thipwiwatpotjana, Phantipa

    2010-01-01

    Uncertainty occurs when there is more than one realization that can represent a piece of information. This dissertation concerns only discrete realizations of an uncertainty. Different interpretations of an uncertainty and their relationships are addressed when the uncertainty is not a probability of each realization. A well-known model that can handle…

  13. Strategies for Evaluating Complex Environmental Education Programs

    NASA Astrophysics Data System (ADS)

    Williams, V.

    2011-12-01

    Evidence for the effectiveness of environmental education programs has been difficult to establish for many reasons. Chief among them are the lack of clear program objectives and an inability to conceptualize how environmental education programs work. Both can lead to evaluations that make claims that are difficult to substantiate, such as significant changes in student achievement levels or behavioral changes based on acquisition of knowledge. Many of these challenges can be addressed by establishing the program theory and developing a logic model. However, claims of impact on larger societal outcomes are difficult to attribute solely to program activities. Contribution analysis may offer a promising method for addressing this challenge. Rather than attempt to definitively and causally link a program's activities to desired results, contribution analysis seeks to provide plausible evidence that can reduce uncertainty regarding the 'difference' a program is making to observed outcomes. It sets out to verify the theory of change behind a program and, at the same time, takes into consideration other influencing factors. Contribution analysis is useful in situations where the program is not experimental (there is little or no scope for varying how the program is implemented) and the program has been funded on the basis of a theory of change. In this paper, the author reviews the feasibility of using contribution analysis as a way of evaluating the impact of the GLOBE program, an environmental science and education program. Initially conceptualized by Al Gore in 1995, the program's implementation model is based on worldwide environmental monitoring by students and scientists around the globe. This paper will make a significant and timely contribution to the field of evaluation, and specifically environmental education evaluation, by examining the usefulness of this analysis for developing evidence to assess the impact of environmental education programs.

  14. A detailed description of the uncertainty analysis for high area ratio rocket nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis was performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis is presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.

  15. A detailed description of the uncertainty analysis for High Area Ratio Rocket Nozzle tests at the NASA Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Davidian, Kenneth J.; Dieck, Ronald H.; Chuang, Isaac

    1987-01-01

    A preliminary uncertainty analysis has been performed for the High Area Ratio Rocket Nozzle test program which took place at the altitude test capsule of the Rocket Engine Test Facility at the NASA Lewis Research Center. Results from the study establish the uncertainty of measured and calculated parameters required for the calculation of rocket engine specific impulse. A generalized description of the uncertainty methodology used is provided. Specific equations and a detailed description of the analysis are presented. Verification of the uncertainty analysis model was performed by comparison with results from the experimental program's data reduction code. Final results include an uncertainty for specific impulse of 1.30 percent. The largest contributors to this uncertainty were calibration errors from the test capsule pressure and thrust measurement devices.
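
    The 1.30 percent figure quoted in both records is the kind of result produced by a root-sum-square combination of elemental relative errors. The sketch below reproduces the structure of such a combination; the elemental values are invented to illustrate how the calibration terms dominate, and are not the report's actual error audit.

      import math

      # Isp = F / (mdot * g0), so relative errors in thrust and mass flow pass
      # through with unit sensitivity; values below are illustrative only.
      elemental = {
          "thrust calibration (capsule load cell)": 1.00,  # % of reading
          "capsule pressure calibration":           0.60,
          "propellant mass-flow measurement":       0.50,
          "other (geometry, data reduction)":       0.30,
      }
      u_isp = math.sqrt(sum(u ** 2 for u in elemental.values()))
      print(f"combined specific-impulse uncertainty: +/- {u_isp:.2f} %")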

  16. A methodology for the evaluation of program cost and schedule risk for the SEASAT program

    NASA Technical Reports Server (NTRS)

    Abram, P.; Myers, D.

    1976-01-01

    An interactive computerized project management software package (RISKNET) is designed to analyze the effect of the risk involved in each specific activity on the results of the total SEASAT-A program. Both the time and the cost of each distinct activity can be modeled with an uncertainty interval so as to provide the project manager with not only the expected time and cost for the completion of the total program, but also with the expected range of costs corresponding to any desired level of significance. The nature of the SEASAT-A program is described. The capabilities of RISKNET and the implementation plan of a RISKNET analysis for the development of SEASAT-A are presented.

  17. Improvements to Nuclear Data and Its Uncertainties by Theoretical Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danon, Yaron; Nazarewicz, Witold; Talou, Patrick

    2013-02-18

    This project addresses three important gaps in existing evaluated nuclear data libraries that represent a significant hindrance to highly advanced modeling and simulation capabilities for the Advanced Fuel Cycle Initiative (AFCI). This project will: Develop advanced theoretical tools to compute prompt fission neutron and gamma-ray characteristics well beyond average spectra and multiplicity, and produce new evaluated files of U and Pu isotopes, along with some minor actinides; Perform state-of-the-art fission cross-section modeling and calculations using global and microscopic model input parameters, leading to truly predictive fission cross-section capabilities, with consistent calculations performed for a suite of Pu isotopes; Implement innovative data assimilation tools, which will reflect the nuclear data evaluation process much more accurately, and lead to a new generation of uncertainty quantification files. New covariance matrices will be obtained for Pu isotopes and compared to existing ones. The deployment of a fleet of safe and efficient advanced reactors that minimize radiotoxic waste and are proliferation-resistant is a clear and ambitious goal of AFCI. While in the past the design, construction and operation of a reactor were supported through empirical trials, this new phase in nuclear energy production is expected to rely heavily on advanced modeling and simulation capabilities. To be truly successful, a program for advanced simulations of innovative reactors will have to develop advanced multi-physics capabilities, to be run on massively parallel supercomputers, and to incorporate adequate and precise underlying physics. And all these areas have to be developed simultaneously to achieve those ambitious goals. Of particular interest are reliable fission cross-section uncertainty estimates (including important correlations) and evaluations of prompt fission neutron and gamma-ray spectra and uncertainties.

  18. [The uncertainty evaluation of analytical results of 27 elements in geological samples by X-ray fluorescence spectrometry].

    PubMed

    Wang, Yi-Ya; Zhan, Xiu-Chun

    2014-04-01

    The evaluation of uncertainty for the analytical results of 165 geological samples measured by polarized energy-dispersive X-ray fluorescence spectrometry (P-EDXRF) is reported according to internationally accepted guidelines. One hundred sixty-five pressed pellets of geological samples with similar matrices and reliable assigned values were analyzed by P-EDXRF. For every component, the samples were divided into several concentration sections spanning its concentration range. The relative uncertainties caused by precision and by accuracy were evaluated separately for 27 components. For one element in one concentration section, the relative uncertainty caused by precision was calculated as the average relative standard deviation over the concentration levels in that section, with n = 6 results per concentration level. The relative uncertainty caused by accuracy in one concentration section was evaluated as the relative standard deviation of the relative deviations over the concentration levels in that section. According to error propagation theory, the precision uncertainty and the accuracy uncertainty were combined into a global uncertainty, which served as the method uncertainty. This model of evaluating uncertainty resolves a series of difficulties in the evaluation process, such as uncertainties caused by the complex matrix of geological samples, the calibration procedure, standard samples, unknown samples, matrix correction, overlap correction, sample preparation, instrument condition and the mathematical model. The uncertainty of analytical results obtained by this method can serve as the uncertainty for unknown samples of similar matrix within the same concentration section. This evaluation model is a basic statistical method of practical value, which can provide a strong basis for building the uncertainty evaluation function model that follows. However, the model requires a large number of samples and cannot simply be applied to sample types with different matrices. We will use this study as a basis to establish a reasonable mathematical-statistical function model applicable to different sample types.
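
    The combination rule described above condenses into a short script. In the sketch below, one concentration section holds three samples, each with six replicate results (n = 6) and a reliable reference value; all numbers are invented for illustration.

      import math

      def mean_sd(xs):
          m = sum(xs) / len(xs)
          s = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
          return m, s

      # One concentration section: (six replicate results, reference value);
      # invented values, e.g., SiO2 in wt%.
      section = [
          ([41.2, 40.8, 41.5, 40.9, 41.1, 41.3], 41.6),
          ([38.9, 39.3, 39.1, 39.4, 38.8, 39.2], 39.0),
          ([44.0, 44.5, 44.2, 44.4, 43.9, 44.3], 44.8),
      ]

      rsds, rel_devs = [], []
      for reps, ref in section:
          m, s = mean_sd(reps)
          rsds.append(100.0 * s / m)                # precision of this sample, %
          rel_devs.append(100.0 * (m - ref) / ref)  # relative deviation, %

      u_precision = sum(rsds) / len(rsds)           # average RSD in the section
      _, u_accuracy = mean_sd(rel_devs)             # SD of relative deviations
      u_method = math.sqrt(u_precision ** 2 + u_accuracy ** 2)
      print(f"u(precision) = {u_precision:.2f} %, u(accuracy) = {u_accuracy:.2f} %")
      print(f"section method uncertainty ~ {u_method:.2f} %")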

  19. When Do Simpler Sexual Behavior Data Collection Techniques Suffice?: An Analysis of Consequent Uncertainty in HIV Acquisition Risk Estimates

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Benotsch, Eric G.; Mikytuck, John

    2007-01-01

    The "gold standard" for evaluating human immunodeficiency virus (HIV) prevention programs is a partner-by-partner sexual behavior assessment that elicits information about each sex partner and the activities engaged in with that partner. When collection of detailed partner-by-partner data is not feasible, aggregate data (e.g., total…

  20. FIA Quality Assurance Program: Evaluation of a Tree Matching Algorithm for Paired Forest Inventory Data

    Treesearch

    James E. Pollard; James A. Westfall; Paul A. Patterson; David L. Gartner

    2005-01-01

    The quality of Forest Inventory and Analysis inventory data can be documented by having quality assurance crews remeasure plots originally measured by field crews within 2 to 3 weeks of the initial measurement, and assessing the difference between the original and remeasured data. Estimates of measurement uncertainty for the data are generated using paired data...

  1. Policy uncertainty and corporate performance in government-sponsored voluntary environmental programs.

    PubMed

    Liu, Ning; Tang, Shui-Yan; Zhan, Xueyong; Lo, Carlos Wing-Hung

    2018-08-01

    This study combines insights from the policy uncertainty literature and neo-institutional theory to examine corporate performance in implementing a government-sponsored voluntary environmental program (VEP) during 2004-2012 in Guangzhou, China. In this regulatory context, characterized by rapid policy changes, corporate performance in VEPs is affected by government surveillance, policy uncertainty, and peer pressures. Specifically, if VEP participants have experienced more government surveillance, they tend to perform better in program implementation. Such positive influence of government surveillance is particularly evident among those joining under high and low, rather than moderate uncertainty. Participants also perform better if they belong to an industry with more certified VEP firms, but worse if they are located in a regulatory jurisdiction with more certified VEP firms. At a moderate level of policy uncertainty, within-industry imitation is most likely to occur but within-jurisdiction imitation is least likely to occur. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus.

    PubMed

    Ulfsdotter, Malin; Lindberg, Lene; Månsdotter, Anna

    2015-01-01

    There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. The cost was €326.3 per parent, of which €53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and €272.6 was the operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of €47 290 per gained QALY. The sensitivity analyses resulted in ratios from €41 739 to €55 072. With the common Swedish threshold value of €55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to great uncertainty about the data, the health economic rationale for implementation should be further studied considering a longer time perspective, effects on siblings, and validated measuring techniques, before full scale implementation.

  3. A Cost-Effectiveness Analysis of the Swedish Universal Parenting Program All Children in Focus

    PubMed Central

    Ulfsdotter, Malin

    2015-01-01

    Objective There are few health economic evaluations of parenting programs with quality-adjusted life-years (QALYs) as the outcome measure. The objective of this study was, therefore, to conduct a cost-effectiveness analysis of the universal parenting program All Children in Focus (ABC). The goals were to estimate the costs of program implementation, investigate the health effects of the program, and examine its cost-effectiveness. Methods A cost-effectiveness analysis was conducted. Costs included setup costs and operating costs. A parent proxy Visual Analog Scale was used to measure QALYs in children, whereas the General Health Questionnaire-12 was used for parents. A societal perspective was adopted, and the incremental cost-effectiveness ratio was calculated. To account for uncertainty in the estimate, the probability of cost-effectiveness was investigated, and sensitivity analyses were used to account for the uncertainty in cost data. Results The cost was €326.3 per parent, of which €53.7 represented setup costs under the assumption that group leaders on average run 10 groups, and €272.6 was the operating costs. For health effects, the QALY gain was 0.0042 per child and 0.0027 per parent. These gains resulted in an incremental cost-effectiveness ratio for the base case of €47 290 per gained QALY. The sensitivity analyses resulted in ratios from €41 739 to €55 072. With the common Swedish threshold value of €55 000 per QALY, the probability of the ABC program being cost-effective was 50.8 percent. Conclusion Our analysis of the ABC program demonstrates cost-effectiveness ratios below or just above the QALY threshold in Sweden. However, due to great uncertainty about the data, the health economic rationale for implementation should be further studied considering a longer time perspective, effects on siblings, and validated measuring techniques, before full scale implementation. PMID:26681349
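
    The base-case ratio in both records is consistent with dividing the per-parent cost by the combined child and parent QALY gains, as the following arithmetic check shows (figures taken from the abstracts above):

      # Quick check of the reported base-case ratio:
      # ICER = incremental cost / incremental QALYs, counting the child's and
      # the parent's QALY gains together against the per-parent program cost.
      cost_per_parent = 326.3          # EUR
      qaly_gain = 0.0042 + 0.0027      # child + parent QALYs
      icer = cost_per_parent / qaly_gain
      print(f"ICER ~ EUR {icer:,.0f} per QALY")   # ~ EUR 47,290, as reported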

  4. Constellation Program Lessons Learned in the Quantification and Use of Aerodynamic Uncertainty

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.; Hemsch, Michael J.; Pinier, Jeremy T.; Bibb, Karen L.; Chan, David T.; Hanke, Jeremy L.

    2011-01-01

    The NASA Constellation Program has worked for the past five years to develop a replacement for the current Space Transportation System. Of the elements that form the Constellation Program, only two require databases that define aerodynamic environments and their respective uncertainty: the Ares launch vehicles and the Orion crew and launch abort vehicles. Teams were established within the Ares and Orion projects to provide representative aerodynamic models including both baseline values and quantified uncertainties. A technical team was also formed within the Constellation Program to facilitate integration among the project elements. This paper is a summary of the collective experience of the three teams working with the quantification and use of uncertainty in aerodynamic environments: the Ares and Orion project teams as well as the Constellation integration team. Not all of the lessons learned discussed in this paper could be applied during the course of the program, but they are included in the hope of benefiting future projects.

  5. A general program to compute the multivariable stability margin for systems with parametric uncertainty

    NASA Technical Reports Server (NTRS)

    Sanchez Pena, Ricardo S.; Sideris, Athanasios

    1988-01-01

    A computer program implementing an algorithm for computing the multivariable stability margin to check the robust stability of feedback systems with real parametric uncertainty is proposed. The authors present important aspects of the program in some detail. An example is presented using a lateral-directional control system.

  6. An inexact mixed risk-aversion two-stage stochastic programming model for water resources management under uncertainty.

    PubMed

    Li, W; Wang, B; Xie, Y L; Huang, G H; Liu, L

    2015-02-01

    Uncertainties exist in the water resources system, and traditional two-stage stochastic programming is risk-neutral: it compares the random variables (e.g., total benefit) only in expectation to identify the best decisions. To deal with the risk issues, a risk-aversion inexact two-stage stochastic programming model is developed for water resources management under uncertainty. The model is a hybrid methodology of interval-parameter programming, the conditional value-at-risk measure, and a general two-stage stochastic programming framework. The method extends the traditional two-stage stochastic programming method by enabling uncertainties presented as probability density functions and discrete intervals to be effectively incorporated within the optimization framework. It could not only provide information on the benefits of the allocation plan to the decision makers but also measure the extreme expected loss on the second-stage penalty cost. The developed model was applied to a hypothetical case of water resources management. Results showed that the model could help managers generate feasible and balanced risk-aversion allocation plans, and analyze the trade-offs between system stability and economy.
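
    To make the model structure concrete, the sketch below builds a toy risk-aversion two-stage problem of the same flavor: a first-stage allocation target x is fixed before the random water supply is known, second-stage shortages are penalized, and a conditional value-at-risk (CVaR) term on the penalty cost is added to the expected-cost objective. The scenario data, prices, and the plain LP formulation are illustrative assumptions of ours (the paper's model additionally carries interval parameters), written in Python with scipy:

        import numpy as np
        from scipy.optimize import linprog

        p = np.array([0.3, 0.5, 0.2])        # scenario probabilities
        w = np.array([60.0, 80.0, 100.0])    # available water per scenario
        benefit, penalty = 5.0, 8.0          # per unit allocated / per unit short
        lam, alpha = 1.0, 0.9                # risk weight, CVaR confidence level
        S = len(p)

        # variable vector: [x, y_1..y_S (shortages), eta (VaR), z_1..z_S (losses above VaR)]
        c = np.concatenate(([-benefit], penalty * p, [lam], lam * p / (1 - alpha)))
        A = np.zeros((2 * S, 2 * S + 2)); b = np.zeros(2 * S)
        for s in range(S):
            A[s, 0], A[s, 1 + s], b[s] = 1.0, -1.0, w[s]   # y_s >= x - w_s
            # z_s >= penalty * y_s - eta
            A[S + s, 1 + s], A[S + s, 1 + S], A[S + s, 2 + S + s] = penalty, -1.0, -1.0
        bounds = [(0, 120)] + [(0, None)] * S + [(None, None)] + [(0, None)] * S
        res = linprog(c, A_ub=A, b_ub=b, bounds=bounds)
        print("first-stage allocation x* =", round(res.x[0], 2))

    In this toy instance the CVaR term pulls the optimal allocation down to the driest scenario (x* = 60), whereas the risk-neutral plan (lam = 0) would allocate 80, illustrating the stability-versus-economy trade-off mentioned above.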

  7. Examples of measurement uncertainty evaluations in accordance with the revised GUM

    NASA Astrophysics Data System (ADS)

    Runje, B.; Horvatic, A.; Alar, V.; Medic, S.; Bosnjakovic, A.

    2016-11-01

    The paper presents examples of the evaluation of uncertainty components in accordance with the current and revised Guide to the expression of uncertainty in measurement (GUM). In accordance with the proposed revision of the GUM, a Bayesian approach was conducted for both type A and type B evaluations. The law of propagation of uncertainty (LPU) and the law of propagation of distributions, applied through the Monte Carlo method (MCM), were used to evaluate associated standard uncertainties, expanded uncertainties and coverage intervals. Furthermore, the influence of a non-Gaussian dominant input quantity and an asymmetric distribution of the output quantity y on the evaluation of measurement uncertainty was analyzed. In cases where the coverage interval is not probabilistically symmetric, the coverage interval for the probability P is estimated from the experimental probability density function using the Monte Carlo method. Key highlights of the proposed revision of the GUM were analyzed through a set of examples.
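
    A minimal sketch of the two propagation routes compared above, assuming a toy measurement model Y = X1*X2 with one Gaussian input and a dominant rectangular (non-Gaussian) input; the model and numbers are ours, not the paper's:

        import numpy as np
        rng = np.random.default_rng(1)

        x1, u1 = 10.0, 0.1   # Gaussian input: best estimate, standard uncertainty
        x2, u2 = 2.0, 0.2    # dominant non-Gaussian input: rectangular distribution

        # Law of propagation of uncertainty (first-order GUM):
        # u(y)^2 = (x2*u1)^2 + (x1*u2)^2 for Y = X1*X2
        u_lpu = np.hypot(x2 * u1, x1 * u2)

        # Monte Carlo method: propagate the full distributions
        n = 1_000_000
        X1 = rng.normal(x1, u1, n)
        half = u2 * np.sqrt(3.0)                    # rectangular with std u2
        X2 = rng.uniform(x2 - half, x2 + half, n)
        Y = X1 * X2
        lo, hi = np.percentile(Y, [2.5, 97.5])      # ~95% coverage interval
        print(f"LPU u(y) = {u_lpu:.3f}, MCM u(y) = {Y.std():.3f}")
        print(f"MCM 95% coverage interval: [{lo:.2f}, {hi:.2f}]")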

  8. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    PubMed

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  9. Gap Size Uncertainty Quantification in Advanced Gas Reactor TRISO Fuel Irradiation Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Binh T.; Einerson, Jeffrey J.; Hawkes, Grant L.

    The Advanced Gas Reactor (AGR)-3/4 experiment is the combination of the third and fourth tests conducted within the tristructural isotropic fuel development and qualification research program. The AGR-3/4 test consists of twelve independent capsules containing a fuel stack in the center surrounded by three graphite cylinders and shrouded by a stainless steel shell. This capsule design enables temperature control of both the fuel and the graphite rings by varying the neon/helium gas mixture flowing through the four resulting gaps. Knowledge of fuel and graphite temperatures is crucial for establishing the functional relationship between fission product release and irradiation thermal conditions. These temperatures are predicted for each capsule using the commercial finite-element heat transfer code ABAQUS. Uncertainty quantification reveals that the gap size uncertainties are among the dominant factors contributing to predicted temperature uncertainty due to high input sensitivity and uncertainty. Gap size uncertainty originates from the fact that all gap sizes vary with time due to dimensional changes of the fuel compacts and three graphite rings caused by extended exposure to high temperatures and fast neutron irradiation. Gap sizes are estimated using as-fabricated dimensional measurements at the start of irradiation and post irradiation examination dimensional measurements at the end of irradiation. Uncertainties in these measurements provide a basis for quantifying gap size uncertainty. However, lack of gap size measurements during irradiation and lack of knowledge about the dimension change rates lead to gap size modeling assumptions, which could increase gap size uncertainty. In addition, the dimensional measurements are performed at room temperature, and must be corrected to account for thermal expansion of the materials at high irradiation temperatures. Uncertainty in the thermal expansion coefficients for the graphite materials used in the AGR-3/4 capsules also increases gap size uncertainty. This study focuses on analysis of modeling assumptions and uncertainty sources to evaluate their impacts on the gap size uncertainty.
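
    As an illustration of the correction described above, the sketch below adjusts a room-temperature gap measurement for differential thermal expansion of the fuel compact and the surrounding graphite ring, and combines the measurement and thermal-expansion-coefficient uncertainties in quadrature. All numbers and the simple radial model are hypothetical, not AGR-3/4 values:

        import numpy as np

        gap_rt, u_meas = 0.120, 0.010           # mm: measured gap, its std. uncertainty
        r_ring, r_fuel = 12.70, 12.58           # mm: nominal radii at room temperature
        cte_ring, u_cte_ring = 4.5e-6, 0.5e-6   # 1/K: graphite CTE, its std. uncertainty
        cte_fuel, u_cte_fuel = 5.0e-6, 0.5e-6   # 1/K: fuel compact CTE
        dT = 1000.0                             # K: irradiation minus room temperature

        gap_hot = gap_rt + (r_ring * cte_ring - r_fuel * cte_fuel) * dT
        # combine in quadrature, with r*dT as the sensitivity coefficient of each CTE
        u_gap = np.sqrt(u_meas**2 + (r_ring * dT * u_cte_ring)**2
                        + (r_fuel * dT * u_cte_fuel)**2)
        print(f"hot gap = {gap_hot*1e3:.1f} um +/- {u_gap*1e3:.1f} um (1 sigma)")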

  10. Evaluation of the ²³⁹Pu prompt fission neutron spectrum induced by neutrons of 500 keV and associated covariances

    DOE PAGES

    Neudecker, D.; Talou, P.; Kawano, T.; ...

    2015-08-01

    We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimate model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further away from unity. Spectral indexes for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy threshold (n,2n) reactions compared to ENDF/B-VII.1.

  11. Evaluation of the ²³⁹Pu prompt fission neutron spectrum induced by neutrons of 500 keV and associated covariances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, D.; Talou, P.; Kawano, T.

    2015-08-01

    We present evaluations of the prompt fission neutron spectrum (PFNS) of ²³⁹Pu induced by 500 keV neutrons, and associated covariances. In a previous evaluation by Talou et al. (2010), surprisingly low evaluated uncertainties were obtained, partly due to simplifying assumptions in the quantification of uncertainties from experiment and model. Therefore, special emphasis is placed here on a thorough uncertainty quantification of experimental data and of the Los Alamos model predicted values entering the evaluation. In addition, the Los Alamos model was extended and an evaluation technique was employed that takes into account the qualitative differences between normalized model predicted values and experimental shape data. These improvements lead to changes in the evaluated PFNS and overall larger evaluated uncertainties than in the previous work. However, these evaluated uncertainties are still smaller than those obtained in a statistical analysis using experimental information only, due to strong model correlations. Hence, suggestions to estimate model defect uncertainties are presented, which lead to more reasonable evaluated uncertainties. The calculated k_eff of selected criticality benchmarks obtained with these new evaluations agree with each other within their uncertainties despite the different approaches to estimate model defect uncertainties. The k_eff one-standard-deviation intervals overlap with some of those obtained using ENDF/B-VII.1, albeit their mean values are further away from unity. Spectral indexes for the Jezebel critical assembly calculated with the newly evaluated PFNS agree with the experimental data for selected (n,γ) and (n,f) reactions, and show improvements for high-energy threshold (n,2n) reactions compared to ENDF/B-VII.1.

  12. Nuclear Data Uncertainty Quantification: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  13. Uncertainty Evaluation of Computational Model Used to Support the Integrated Powerhead Demonstration Project

    NASA Technical Reports Server (NTRS)

    Steele, W. G.; Molder, K. J.; Hudson, S. T.; Vadasy, K. V.; Rieder, P. T.; Giel, T.

    2005-01-01

    NASA and the U.S. Air Force are working on a joint project to develop a new hydrogen-fueled, full-flow, staged combustion rocket engine. The initial testing and modeling work for the Integrated Powerhead Demonstrator (IPD) project is being performed by NASA Marshall and Stennis Space Centers. A key factor in the testing of this engine is the ability to predict and measure the transient fluid flow during engine start and shutdown phases of operation. A model built by NASA Marshall in the ROCket Engine Transient Simulation (ROCETS) program is used to predict transient engine fluid flows. The model is initially calibrated to data from previous tests on the Stennis E1 test stand. The model is then used to predict the next run. Data from this run can then be used to recalibrate the model, providing a tool to guide the test program in incremental steps to reduce the risk to the prototype engine. In this paper, such a model is termed a calibrated model. This paper proposes a method to estimate the uncertainty of a model calibrated to a set of experimental test data. The method is similar to that used in the calibration of experiment instrumentation. For the IPD example used in this paper, the model uncertainty is determined for both LOX and LH flow rates using previous data. The model is then shown to successfully predict another similar test run within the uncertainty bounds. The paper summarizes the uncertainty methodology when a model is continually recalibrated with new test data. The methodology is general and can be applied to other calibrated models.
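
    A minimal sketch of the calibration-style uncertainty estimate described above: treat the calibrated model like an instrument and take the scatter of its residuals against test data, in percent of reading, as the model uncertainty. The flow-rate numbers are illustrative placeholders, not IPD data:

        import numpy as np

        measured = np.array([10.2, 12.1, 14.0, 15.8, 18.1])    # kg/s, test data
        predicted = np.array([10.0, 12.3, 13.7, 16.1, 17.9])   # kg/s, calibrated model
        resid_pct = 100 * (predicted - measured) / measured    # residuals, % of reading
        u_model = resid_pct.std(ddof=1)                        # 1-sigma model uncertainty
        print(f"model uncertainty ~ +/-{2*u_model:.1f}% of reading (~95%, k=2)")

    A subsequent test run whose data fall inside these bounds confirms the calibrated model; one that falls outside signals the need to recalibrate with the new data.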

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savy, J.

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered to be an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for the PPPL are in the modeling of the strong ground motion at the site, and the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  15. [Evaluation of measurement uncertainty of welding fume in welding workplace of a shipyard].

    PubMed

    Ren, Jie; Wang, Yanrang

    2015-12-01

    To evaluate the measurement uncertainty of welding fume in the air of the welding workplace of a shipyard, and to provide quality assurance for the measurement. According to GBZ/T 192.1-2007 "Determination of dust in the air of workplace-Part 1: Total dust concentration" and JJF 1059-1999 "Evaluation and expression of measurement uncertainty", the uncertainty for determination of welding fume was evaluated and the measurement results were completely described. The concentration of welding fume was 3.3 mg/m³, and the expanded uncertainty was 0.24 mg/m³. The repeatability for determination of dust concentration introduced an uncertainty of 1.9%, the measurement using an electronic balance introduced a standard uncertainty of 0.3%, and the measurement of sample quality introduced a standard uncertainty of 3.2%. During the determination of welding fume, the standard uncertainty introduced by the measurement of sample quality is the dominant uncertainty. In the process of sampling and measurement, quality control should be focused on the collection efficiency of dust, air humidity, sample volume, and measuring instruments.
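
    The reported budget can be checked by combining the three relative standard uncertainties in quadrature and expanding with a coverage factor k = 2 (assumed here, as is usual for an approximately 95% level):

        import math

        conc = 3.3                           # mg/m3, measured concentration
        u_rel = [0.019, 0.003, 0.032]        # repeatability, balance, sample mass
        u_c_rel = math.sqrt(sum(u**2 for u in u_rel))
        U = 2 * u_c_rel * conc               # expanded uncertainty, k = 2
        print(f"combined relative u = {u_c_rel:.1%}, U = {U:.2f} mg/m3")
        # -> ~0.25 mg/m3, consistent with the reported 0.24 mg/m3 given that the
        #    published component values are themselves rounded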

  16. Enhancing the ecological risk assessment process.

    PubMed

    Dale, Virginia H; Biddinger, Gregory R; Newman, Michael C; Oris, James T; Suter, Glenn W; Thompson, Timothy; Armitage, Thomas M; Meyer, Judith L; Allen-King, Richelle M; Burton, G Allen; Chapman, Peter M; Conquest, Loveday L; Fernandez, Ivan J; Landis, Wayne G; Master, Lawrence L; Mitsch, William J; Mueller, Thomas C; Rabeni, Charles F; Rodewald, Amanda D; Sanders, James G; van Heerden, Ivor L

    2008-07-01

    The Ecological Processes and Effects Committee of the US Environmental Protection Agency Science Advisory Board conducted a self-initiated study and convened a public workshop to characterize the state of ecological risk assessment (ERA), with a view toward advancing the science and application of the process. That survey and analysis of ERA in decision making shows that such assessments have been most effective when clear management goals were included in the problem formulation; translated into information needs; and developed in collaboration with decision makers, assessors, scientists, and stakeholders. This process is best facilitated when risk managers, risk assessors, and stakeholders are engaged in an ongoing dialogue about problem formulation. Identification and acknowledgment of uncertainties that have the potential to profoundly affect the results and outcome of risk assessments also improve assessment effectiveness. Thus, we suggest that 1) thorough peer review of ERAs be conducted at the problem formulation stage and 2) the predictive power of risk-based decision making be expanded to reduce uncertainties through analytical and methodological approaches like life cycle analysis. Risk assessment and monitoring programs need better integration to reduce uncertainty and to evaluate risk management decision outcomes. Postdecision audit programs should be initiated to evaluate the environmental outcomes of risk-based decisions. In addition, a process should be developed to demonstrate how monitoring data can be used to reduce uncertainties. Ecological risk assessments should include the effects of chemical and nonchemical stressors at multiple levels of biological organization and spatial scale, and the extent and resolution of the pertinent scales and levels of organization should be explicitly considered during problem formulation. An approach to interpreting lines of evidence and weight of evidence is critically needed for complex assessments, and it would be useful to develop case studies and/or standards of practice for interpreting lines of evidence. In addition, tools for cumulative risk assessment should be developed because contaminants are often released into stressed environments.

  17. Uncertainty during breast diagnostic evaluation: state of the science.

    PubMed

    Montgomery, Mariann

    2010-01-01

    To present the state of the science on uncertainty in relation to the experiences of women undergoing diagnostic evaluation for suspected breast cancer. Published articles from Medline, CINAHL, PubMed, and PsycINFO from 1983 to 2008 using the following key words: breast biopsy, mammography, uncertainty, reframing, inner strength, and disruption. Fifty research studies were examined, all reporting the presence of anxiety persisting throughout the diagnostic evaluation until certitude is achieved through the establishment of a definitive diagnosis. Indirect determinants of uncertainty for women undergoing breast diagnostic evaluation include measures of anxiety, depression, social support, emotional responses, defense mechanisms, and the psychological impact of events. Understanding and influencing the uncertainty experience have been suggested to be key in relieving psychosocial distress and positively influencing future screening behaviors. Several studies examine correlational relationships among anxiety, selection of coping methods, and demographic factors that influence uncertainty. A gap exists in the literature with regard to the relationship between inner strength and uncertainty. Nurses can be invaluable in assisting women to cope with the uncertainty experience by providing positive communication and support. Nursing interventions should be designed and tested for their effects on the uncertainty experienced by women undergoing a breast diagnostic evaluation.

  18. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    NASA Astrophysics Data System (ADS)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the overall uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the overall uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of the uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior distribution satisfies the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the overall uncertainty of the geological model and represents the combined impact of all the uncertain factors on its spatial structure. The framework provides a solution for evaluating the combined impact of multi-source uncertainties on a geological model and an approach for studying the uncertainty propagation mechanism in geological modeling.
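
    As a one-dimensional stand-in for the updating step in such a framework, the sketch below carries a Gaussian prior on an uncertain structural attribute through a conjugate normal-normal update with noisy observations; the attribute, numbers, and likelihood are our own illustrative assumptions:

        import numpy as np

        mu0, s0 = 100.0, 10.0                # prior mean/std of, e.g., a horizon depth (m)
        obs = np.array([92.0, 95.0, 97.0])   # borehole observations (m)
        s_obs = 5.0                          # observation error, std (m)

        # precision-weighted (conjugate normal-normal) update
        prec = 1 / s0**2 + len(obs) / s_obs**2
        mu_post = (mu0 / s0**2 + obs.sum() / s_obs**2) / prec
        s_post = prec ** -0.5
        print(f"posterior: {mu_post:.1f} +/- {s_post:.1f} m")   # -> 95.1 +/- 2.8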

  19. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D. L.

    2015-01-01

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  20. Nuclear Data Uncertainty Quantification: Past, Present and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, D.L., E-mail: Donald.L.Smith@anl.gov

    2015-01-15

    An historical overview is provided of the mathematical foundations of uncertainty quantification and the roles played in the more recent past by nuclear data uncertainties in nuclear data evaluations and nuclear applications. Significant advances that have established the mathematical framework for contemporary nuclear data evaluation methods, as well as the use of uncertainty information in nuclear data evaluation and nuclear applications, are described. This is followed by a brief examination of the current status concerning nuclear data evaluation methodology, covariance data generation, and the application of evaluated nuclear data uncertainties in contemporary nuclear technology. A few possible areas for future investigation of this subject are also suggested.

  1. Uncertainty evaluation of thickness and warp of a silicon wafer measured by a spectrally resolved interferometer

    NASA Astrophysics Data System (ADS)

    Praba Drijarkara, Agustinus; Gergiso Gebrie, Tadesse; Lee, Jae Yong; Kang, Chu-Shik

    2018-06-01

    Evaluation of the uncertainty of the thickness and gravity-compensated warp of a silicon wafer measured by a spectrally resolved interferometer is presented. The evaluation is performed in a rigorous manner, by analysing the propagation of uncertainty from the input quantities through all the steps of the measurement functions, in accordance with the ISO Guide to the Expression of Uncertainty in Measurement. In the evaluation, the correlation between input quantities, as well as the uncertainty attributed to thermal effects, which were not included in earlier publications, are taken into account. The temperature dependence of the group refractive index of silicon was found to be nonlinear and to vary widely within a wafer and also between different wafers. The uncertainty evaluation described here can be applied to other spectral interferometry applications based on similar principles.

  2. Blast Load Simulator Experiments for Computational Model Validation: Report 2

    DTIC Science & Technology

    2017-02-01

    … repeatability. The uncertainty in the experimental pressures and impulses was evaluated by computing 95% confidence intervals on the results. … Experiment uncertainty: The uncertainty in the experimental pressure and impulse was evaluated for the five replicate experiments for which, as closely as … comparisons were made among the replicated experiments to evaluate repeatability. The uncertainty in the experimental pressures and impulses was …

  3. Future directions for LDEF ionizing radiation modeling and assessments

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1993-01-01

    A calculational program utilizing data from radiation dosimetry measurements aboard the Long Duration Exposure Facility (LDEF) satellite to reduce the uncertainties in current models defining the ionizing radiation environment is in progress. Most of the effort to date has been on using LDEF radiation dose measurements to evaluate models defining the geomagnetically trapped radiation, which has provided results applicable to radiation design assessments being performed for Space Station Freedom. Plans for future data comparisons, model evaluations, and assessments using additional LDEF data sets (LET spectra, induced radioactivity, and particle spectra) are discussed.

  4. Data-free and data-driven spectral perturbations for RANS UQ

    NASA Astrophysics Data System (ADS)

    Edeling, Wouter; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    Despite recent developments in high-fidelity turbulent flow simulations, RANS modeling is still vastly used by industry, due to its inherent low cost. Since accuracy is a concern in RANS modeling, model-form UQ is an essential tool for assessing the impacts of this uncertainty on quantities of interest. Applying the spectral decomposition to the modeled Reynolds-Stress Tensor (RST) allows for the introduction of decoupled perturbations into the baseline intensity (kinetic energy), shape (eigenvalues), and orientation (eigenvectors). This constitutes a natural methodology to evaluate the model-form uncertainty associated with different aspects of RST modeling. In a predictive setting, one frequently encounters an absence of any relevant reference data. To make data-free predictions with quantified uncertainty we employ physical bounds to define maximum spectral perturbations a priori. When propagated, these perturbations yield intervals of engineering utility. High-fidelity data opens up the possibility of inferring a distribution of uncertainty, by means of various data-driven machine-learning techniques. We will demonstrate our framework on a number of flow problems where RANS models are prone to failure. This research was partially supported by the Defense Advanced Research Projects Agency under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo), and the DOE PSAAP-II program.

  5. Uncertainty evaluation of dead zone of diagnostic ultrasound equipment

    NASA Astrophysics Data System (ADS)

    Souza, R. M.; Alvarenga, A. V.; Braz, D. S.; Petrella, L. I.; Costa-Felix, R. P. B.

    2016-07-01

    This paper presents a model for evaluating the measurement uncertainty of a feature used in the assessment of ultrasound images: the dead zone. The dead zone was measured by two technicians of INMETRO's Laboratory of Ultrasound using a phantom, following the standard IEC/TS 61390. The uncertainty model was proposed based on the Guide to the Expression of Uncertainty in Measurement. For the tested equipment, results indicate a dead zone of 1.01 mm and, based on the proposed model, an expanded uncertainty of 0.17 mm. The proposed uncertainty model provides a novel way of metrologically evaluating diagnostic imaging by ultrasound.

  6. Pressurized thermal shock evaluation of the Calvert Cliffs Unit 1 Nuclear Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abbott, L

    1985-09-01

    An evaluation of the risk to the Calvert Cliffs Unit 1 nuclear power plant due to pressurized thermal shock (PTS) has been completed by Oak Ridge National Laboratory (ORNL) with the assistance of several other organizations. This evaluation was part of a Nuclear Regulatory Commission program designed to study the PTS risk to three nuclear plants, the other two plants being Oconee Unit 1 and H.B. Robinson Unit 2. The specific objectives of the program were to (1) provide a best estimate of the frequency of a through-the-wall crack in the pressure vessel at each of the three plants, together with the uncertainty in the estimated frequency and its sensitivity to the variables used in the evaluation; (2) determine the dominant overcooling sequences contributing to the estimated frequency and the associated failures in the plant systems or in operator actions; and (3) evaluate the effectiveness of potential corrective measures.

  7. Computer programs for generation and evaluation of near-optimum vertical flight profiles

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Waters, M. H.; Patmore, L. C.

    1983-01-01

    Two extensive computer programs were developed. The first, called OPTIM, generates a reference near-optimum vertical profile, and it contains control options so that the effects of various flight constraints on cost performance can be examined. The second, called TRAGEN, is used to simulate an aircraft flying along an optimum or any other vertical reference profile. TRAGEN is used to verify OPTIM's output, examine the effects of uncertainty in the values of parameters (such as prevailing wind) which govern the optimum profile, or compare the cost performance of profiles generated by different techniques. A general description of these programs, the efforts to add special features to them, and sample results of their usage are presented.

  8. Water resources in the twenty-first century; a study of the implications of climate uncertainty

    USGS Publications Warehouse

    Moss, Marshall E.; Lins, Harry F.

    1989-01-01

    The interactions of the water resources on and within the surface of the Earth with the atmosphere that surrounds it are exceedingly complex. Increased uncertainty can be attached to the availability of water of usable quality in the 21st century, therefore, because of potential anthropogenic changes in the global climate system. For the U.S. Geological Survey to continue to fulfill its mission with respect to assessing the Nation's water resources, an expanded program to study the hydrologic implications of climate uncertainty will be required. The goal for this program is to develop knowledge and information concerning the potential water-resources implications for the United States of uncertainties in climate that may result from both anthropogenic and natural changes of the Earth's atmosphere. Like most past and current water-resources programs of the Geological Survey, the climate-uncertainty program should be composed of three elements: (1) research, (2) data collection, and (3) interpretive studies. However, unlike most other programs, the climate-uncertainty program necessarily will be dominated by its research component during its early years. Critical new concerns to be addressed by the research component are (1) areal estimates of evapotranspiration, (2) hydrologic resolution within atmospheric (climatic) models at the global scale and at mesoscales, (3) linkages between hydrology and climatology, and (4) methodology for the design of data networks that will help to track the impacts of climate change on water resources. Other ongoing activities in U.S. Geological Survey research programs will be enhanced to make them more compatible with climate-uncertainty research needs. The existing hydrologic data base of the Geological Survey serves as a key element in assessing hydrologic and climatologic change. However, this data base has evolved in response to other needs for hydrologic information and probably is not as sensitive to climate change as is desirable. Therefore, as measurement and network-design methodologies are improved to account for climate-change potential, new data-collection activities will be added to the existing programs. One particular area of data-collection concern pertains to the phenomenon of evapotranspiration. Interpretive studies of the hydrologic implications of climate uncertainty will be initiated by establishing several studies at the river-basin scale in diverse hydroclimatic and demographic settings. These studies will serve as tests of the existing methodologies for studying the impacts of climate change and also will help to define subsequent research priorities. A prototype for these studies was initiated in early 1988 in the Delaware River basin.

  9. Sustainable infrastructure system modeling under uncertainties and dynamics

    NASA Astrophysics Data System (ADS)

    Huang, Yongxi

    Infrastructure systems support human activities in transportation, communication, water use, and energy supply. The dissertation research focuses on critical transportation infrastructure and renewable energy infrastructure systems. The goal of the research efforts is to improve the sustainability of the infrastructure systems, with an emphasis on economic viability, system reliability and robustness, and environmental impacts. The research efforts in critical transportation infrastructure concern the development of strategic robust resource allocation strategies in an uncertain decision-making environment, considering both uncertain service availability and accessibility. The study explores the performance of different modeling approaches (i.e., deterministic, stochastic programming, and robust optimization) in reflecting various risk preferences. The models are evaluated in a case study of Singapore, and results demonstrate that stochastic modeling methods in general offer more robust allocation strategies than deterministic approaches in achieving high coverage of critical infrastructures under risk. This general modeling framework can be applied to other emergency service applications, such as locating medical emergency services. The research on renewable energy infrastructure systems aims to answer the following key questions: (1) is renewable energy an economically viable solution? (2) what are the energy distribution and infrastructure system requirements to support such energy supply systems in hedging against potential risks? (3) how does the energy system adapt to the dynamics of evolving technology and societal needs in the transition to a renewable-energy-based society? The study of Renewable Energy System Planning with Risk Management incorporates risk management into the strategic planning of supply chains. The physical design and operational management are integrated as a whole in seeking mitigations against the potential risks caused by feedstock seasonality and demand uncertainty. Facility spatiality, time variation of feedstock yields, and demand uncertainty are integrated into a two-stage stochastic programming (SP) framework. In the study of Transitional Energy System Modeling under Uncertainty, a multistage stochastic dynamic program is established to optimize the process of building and operating fuel production facilities during the transition. Dynamics due to evolving technologies and societal changes and uncertainty due to demand fluctuations are the major issues to be addressed.

  10. Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BABA,T.; ISHIGURO,K.; ISHIHARA,Y.

    1999-08-30

    Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment.

  11. Uncertainty Modeling and Evaluation of CMM Task Oriented Measurement Based on SVCMM

    NASA Astrophysics Data System (ADS)

    Li, Hongli; Chen, Xiaohuai; Cheng, Yinbao; Liu, Houde; Wang, Hanbin; Cheng, Zhenying; Wang, Hongtao

    2017-10-01

    Due to the variety of measurement tasks and the complexity of the errors of coordinate measuring machines (CMMs), it is very difficult to reasonably evaluate the uncertainty of CMM measurement results, which has limited the application of CMMs. Task-oriented uncertainty evaluation has become a difficult problem to be solved. Taking dimension measurement as an example, this paper puts forward a practical method for uncertainty modeling and evaluation of CMM task-oriented measurement (the SVCMM method). The method makes full use of the CMM acceptance or reinspection report and the Monte Carlo computer simulation method (MCM). An evaluation example is presented, and its results are evaluated with the traditional method given in the GUM and with the proposed method, respectively. The SVCMM method is verified to be feasible and practical. It can help CMM users conveniently complete measurement uncertainty evaluation through a single measurement cycle.
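
    A sketch of the general idea, assuming the acceptance or reinspection report states a length-measurement error limit in the common ISO 10360 form MPE_E = A + L/K, and that the un-decomposed machine error is treated as a rectangular band propagated by Monte Carlo; the constants and this simple error model are our assumptions, not the paper's:

        import numpy as np
        rng = np.random.default_rng(0)

        A_um, K = 2.0, 350.0          # MPE_E = A + L/K in micrometres, L in mm
        L_mm = 100.0                  # measured dimension
        mpe_um = A_um + L_mm / K      # ~2.29 um at L = 100 mm

        err = rng.uniform(-mpe_um, mpe_um, 200_000)   # rectangular machine error
        u_um = err.std()                              # -> mpe/sqrt(3), ~1.32 um
        print(f"MPE_E = {mpe_um:.2f} um, standard uncertainty u = {u_um:.2f} um")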

  12. Designing cost effective water demand management programs in Australia.

    PubMed

    White, S B; Fane, S A

    2002-01-01

    This paper describes recent experience with integrated resource planning (IRP) and the application of least cost planning (LCP) for the evaluation of demand management strategies in urban water. Two Australian case studies, Sydney and Northern New South Wales (NSW), are used in illustration. LCP can determine the most cost effective means of providing water services or alternatively the cheapest forms of water conservation. LCP contrasts with a traditional evaluation approach that looks only at means of increasing supply. Detailed investigation of water usage, known as end-use analysis, is required for LCP. End-use analysis allows both rigorous demand forecasting and the development and evaluation of conservation strategies. Strategies include education campaigns, increasing water use efficiency and promoting wastewater reuse or rainwater tanks. The optimal mix of conservation strategies and conventional capacity expansion is identified based on levelised unit cost. IRP uses LCP in an iterative process: evaluating and assessing options, investing in selected options, measuring the results, and then re-evaluating options. Key to this process is the design of cost effective demand management programs. IRP, however, includes a range of parameters beyond least economic cost in the planning process and program designs, including uncertainty, benefit partitioning and implementation considerations.

  13. First uncertainty evaluation of the FoCS-2 primary frequency standard

    NASA Astrophysics Data System (ADS)

    Jallageas, A.; Devenoges, L.; Petersen, M.; Morel, J.; Bernier, L. G.; Schenker, D.; Thomann, P.; Südmeyer, T.

    2018-06-01

    We report the uncertainty evaluation of the Swiss continuous primary frequency standard FoCS-2 (Fontaine Continue Suisse). Unlike other primary frequency standards which are working with clouds of cold atoms, this fountain uses a continuous beam of cold caesium atoms bringing a series of metrological advantages and specific techniques for the evaluation of the uncertainty budget. Recent improvements of FoCS-2 have made possible the evaluation of the frequency shifts and of their uncertainties in the order of . When operating in an optimal regime a relative frequency instability of is obtained. The relative standard uncertainty reported in this article, , is strongly dominated by the statistics of the frequency measurements.

  14. Monitoring in the context of structured decision-making and adaptive management

    USGS Publications Warehouse

    Lyons, J.E.; Runge, M.C.; Laskowski, H.P.; Kendall, W.L.

    2008-01-01

    In a natural resource management setting, monitoring is a crucial component of an informed process for making decisions, and monitoring design should be driven by the decision context and associated uncertainties. Monitoring itself can play at least three roles. First, it is important for state-dependent decision-making, as when managers need to know the system state before deciding on the appropriate course of action during the ensuing management cycle. Second, monitoring is critical for evaluating the effectiveness of management actions relative to objectives. Third, in an adaptive management setting, monitoring provides the feedback loop for learning about the system; learning is sought not for its own sake but primarily to better achieve management objectives. In this case, monitoring should be designed to reduce the critical uncertainties in models of the managed system. The United States Geological Survey and United States Fish and Wildlife Service are conducting a large-scale management experiment on 23 National Wildlife Refuges across the Northeast and Midwest Regions. The primary management objective is to provide habitat for migratory waterbirds, particularly during migration, using water-level manipulations in managed wetlands. Key uncertainties are related to the potential trade-offs created by management for a specific waterbird guild (e.g., migratory shorebirds) and the response of waterbirds, plant communities, and invertebrates to specific experimental hydroperiods. We reviewed the monitoring program associated with this study and the ways that specific observations fill more than one of the roles identified above. We used observations from our monitoring to improve state-dependent decisions to control undesired plants, to evaluate management performance relative to shallow-water habitat objectives, and to evaluate potential trade-offs between waterfowl and shorebird habitat management. With limited staff and budgets, management agencies need efficient monitoring programs that are used for decision-making, not comprehensive studies that elucidate all manner of ecological relationships.

  15. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as part of the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defence, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of guidance grounded in theory for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.

  16. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  17. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    PubMed

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
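
    The BMA-CC reliability computation can be sketched as follows: the chance constraint is evaluated by Monte Carlo under each hydrostratigraphic model, and the per-model reliabilities are averaged with the posterior model weights. The models, weights, and response function below are invented placeholders, not the Baton Rouge values:

        import numpy as np
        rng = np.random.default_rng(42)

        weights = np.array([0.5, 0.3, 0.2])   # posterior (BMA) model weights

        def head_gain(q, model, n=100_000):
            # hypothetical uncertain head response to injection rate q, with a
            # different mean sensitivity under each structural model
            slope = rng.normal([0.8, 0.6, 0.4][model], 0.15, n)
            return slope * q

        q, target = 10.0, 5.0   # injection rate; required head gain at the barrier
        rel = np.array([(head_gain(q, m) >= target).mean() for m in range(3)])
        print("per-model reliability:", rel.round(3))
        print("BMA reliability:", round(float(weights @ rel), 3))

    Averaging across structures, rather than trusting a single model, is what keeps the traditional chance-constrained result from overstating design reliability when the geology is uncertain.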

  18. UCODE_2005 and six other computer codes for universal sensitivity analysis, calibration, and uncertainty evaluation constructed using the JUPITER API

    USGS Publications Warehouse

    Poeter, Eileen E.; Hill, Mary C.; Banta, Edward R.; Mehl, Steffen; Christensen, Steen

    2006-01-01

    This report documents the computer codes UCODE_2005 and six post-processors. Together the codes can be used with existing process models to perform sensitivity analysis, data needs assessment, calibration, prediction, and uncertainty analysis. Any process model or set of models can be used; the only requirements are that models have numerical (ASCII or text only) input and output files, that the numbers in these files have sufficient significant digits, that all required models can be run from a single batch file or script, and that simulated values are continuous functions of the parameter values. Process models can include pre-processors and post-processors as well as one or more models related to the processes of interest (physical, chemical, and so on), making UCODE_2005 extremely powerful. An estimated parameter can be a quantity that appears in the input files of the process model(s), or a quantity used in an equation that produces a value that appears in the input files. In the latter situation, the equation is user-defined. UCODE_2005 can compare observations and simulated equivalents. The simulated equivalents can be any simulated value written in the process-model output files or can be calculated from simulated values with user-defined equations. The quantities can be model results, or dependent variables. For example, for ground-water models they can be heads, flows, concentrations, and so on. Prior, or direct, information on estimated parameters also can be considered. Statistics are calculated to quantify the comparison of observations and simulated equivalents, including a weighted least-squares objective function. In addition, data-exchange files are produced that facilitate graphical analysis. UCODE_2005 can be used fruitfully in model calibration through its sensitivity analysis capabilities and its ability to estimate parameter values that result in the best possible fit to the observations. Parameters are estimated using nonlinear regression: a weighted least-squares objective function is minimized with respect to the parameter values using a modified Gauss-Newton method or a double-dogleg technique. Sensitivities needed for the method can be read from files produced by process models that can calculate sensitivities, such as MODFLOW-2000, or can be calculated by UCODE_2005 using a more general, but less accurate, forward- or central-difference perturbation technique. Problems resulting from inaccurate sensitivities and solutions related to the perturbation techniques are discussed in the report. Statistics are calculated and printed for use in (1) diagnosing inadequate data and identifying parameters that probably cannot be estimated; (2) evaluating estimated parameter values; and (3) evaluating how well the model represents the simulated processes. Results from UCODE_2005 and codes RESIDUAL_ANALYSIS and RESIDUAL_ANALYSIS_ADV can be used to evaluate how accurately the model represents the processes it simulates. Results from LINEAR_UNCERTAINTY can be used to quantify the uncertainty of model simulated values if the model is sufficiently linear. Results from MODEL_LINEARITY and MODEL_LINEARITY_ADV can be used to evaluate model linearity and, thereby, the accuracy of the LINEAR_UNCERTAINTY results. UCODE_2005 can also be used to calculate nonlinear confidence and predictions intervals, which quantify the uncertainty of model simulated values when the model is not linear. 
CORFAC_PLUS can be used to produce factors that allow intervals to account for model intrinsic nonlinearity and small-scale variations in system characteristics that are not explicitly accounted for in the model or the observation weighting. The six post-processing programs are independent of UCODE_2005 and can use the results of other programs that produce the required data-exchange files. UCODE_2005 and the other six codes are intended for use on any computer operating system.

  19. Uncertainty Analysis in Space Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2011-01-01

    Space radiation is comprised of high energy and charge (HZE) nuclei, protons, and secondary radiation including neutrons. The uncertainties in estimating the health risks from galactic cosmic rays (GCR) are a major limitation to the length of space missions, the evaluation of potential risk mitigation approaches, and application of the As Low As Reasonably Achievable (ALARA) principle. For long duration space missions, risks may approach radiation exposure limits; therefore, the uncertainties in risk projections become a major safety concern, and methodologies used for ground-based workers are not deemed sufficient. NASA limits astronaut exposures to a 3% risk of exposure induced death (REID) and protects against uncertainties in risk projections using an assessment of 95% confidence intervals in the projection model. We discuss NASA's approach to space radiation uncertainty assessments and applications for the International Space Station (ISS) program and design studies of future missions to Mars and other destinations. Several features of NASA's approach will be discussed. Radiation quality descriptions are based on the properties of radiation tracks rather than LET, with probability distribution functions (PDF) for uncertainties derived from radiobiology experiments at particle accelerators. The application of age- and gender-specific models for individual astronauts is described. Because more than 90% of astronauts are never-smokers, an alternative risk calculation for never-smokers is used and is compared to estimates for an average U.S. population. Because the high energies of GCR limit the benefits of shielding, and because pharmaceutical countermeasures are expected to play only a limited role, uncertainty reduction continues to be the optimal approach to improving radiation safety for space missions.
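    The confidence-interval assessment sketched above amounts to propagating PDFs for the uncertain scaling factors through the risk projection. A minimal Monte Carlo illustration of that pattern follows; the point estimate, the factor list, and every distribution below are invented for illustration and are not NASA's model.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 100_000
        reid_point = 0.02            # illustrative REID point estimate (2%)

        # Illustrative multiplicative uncertainty factors; real assessments use
        # PDFs derived from radiobiology and epidemiology data.
        quality = rng.lognormal(mean=0.0, sigma=0.4, size=N)    # quality factor
        dose_rate = rng.lognormal(mean=0.0, sigma=0.3, size=N)  # dose-rate factor
        physics = rng.normal(loc=1.0, scale=0.1, size=N)        # transport/physics

        reid = reid_point * quality * dose_rate * np.clip(physics, 0.5, None)
        lo, hi = np.percentile(reid, [2.5, 97.5])
        print(f"REID point {reid_point:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
        # Under this scheme a mission satisfies the criterion only if the upper
        # 95% confidence bound stays below the 3% REID limit.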

  20. SU-E-J-107: The Impact of the Tumor Location to Deformable Image Registration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sugawara, Y; Tohoku University School of Medicine, Sendai, Miyagi; Tachibana, H

    2015-06-15

    Purpose: For four-dimensional planning and adaptive radiotherapy, the accuracy of deformable image registration (DIR) is essential. We evaluated the accuracy of an in-house program built on the freely downloadable DIR software library NiftyReg and two commercially available DIR software programs (MIM Maestro and Velocity AI) in lung SBRT patients. In addition, the relationship between tumor location and DIR accuracy was investigated. Methods: Free-form deformation was implemented in the in-house program and MIM; Velocity was based on the B-spline algorithm. The accuracy of the three programs was evaluated by comparing structures on 4DCT image datasets between the peak-inhale and peak-exhale phases. The dice similarity coefficient (DSC) and normalized DSC (NDSC) were measured for the gross tumor volumes from 19 lung SBRT patients. Results: The median DSC values were 0.885, 0.872, and 0.798 for the in-house program, MIM, and Velocity, respectively; Velocity showed a significant difference compared with the other two. The median NDSC values were 1.027, 1.005, and 0.946 for the in-house program, MIM, and Velocity, respectively, indicating that the spatial overlap agreement between the reference and the deformed structure for the in-house program and MIM was comparable, with accuracy within 1 mm uncertainty; for Velocity the discrepancy was larger, within 1-2 mm uncertainty. The in-house program and MIM showed higher NDSC values than the median when the GTV was not attached to the chest wall or diaphragm (p < 0.05). However, no relationship between accuracy and tumor location was found for Velocity. Conclusion: The choice of DIR program affects registration accuracy, and accuracy may be reduced when the tumor is located at or attached to the chest wall or diaphragm.
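    The DSC reported above has a simple definition, DSC = 2|A∩B|/(|A| + |B|) for two structure masks A and B. A brief Python sketch on toy voxel masks follows; the masks and the one-voxel shift are invented, and the NDSC normalization used in the study is not reproduced here.

        import numpy as np

        def dice(a, b):
            """Dice similarity coefficient 2|A∩B| / (|A|+|B|) for boolean masks."""
            a = a.astype(bool)
            b = b.astype(bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

        # Toy 3-D masks: a cubic "GTV" and a deformed copy shifted by one voxel
        ref = np.zeros((20, 20, 20), dtype=bool)
        ref[5:15, 5:15, 5:15] = True
        deformed = np.roll(ref, shift=1, axis=0)
        print(f"DSC = {dice(ref, deformed):.3f}")    # 0.900 for this toy case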

  1. Performance Assessment Program for the Savannah River Site Liquid Waste Facilities - 13610

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenberger, Kent H.

    2013-07-01

    The Liquid Waste facilities at the U.S. Department of Energy's (DOE) Savannah River Site (SRS) are operated by Liquid Waste Operations contractor Savannah River Remediation LLC (SRR). A separate Performance Assessment (PA) is prepared to support disposal operations at the Saltstone Disposal Facility and closure evaluations for the two liquid waste tank farm facilities at SRS, F-Tank Farm and H-Tank Farm. A PA provides the technical basis and results to be used in subsequent documents to demonstrate compliance with the pertinent requirements identified in operations and closure regulatory guidance. The Saltstone Disposal Facility is subject to a State of South Carolina industrial solid waste landfill permit, and the tank farms are subject to a state industrial waste water permit. The three Liquid Waste facilities are also subject to a Federal Facility Agreement approved by the State, DOE, and the Environmental Protection Agency (EPA). Due to this regulatory structure, a PA is a key technical document reviewed by the DOE, the State of South Carolina, and the EPA. As the waste material disposed of in the Saltstone Disposal Facility and the residual material in the closed tank farms are also subject to reclassification prior to closure via a waste determination pursuant to Section 3116 of the Ronald W. Reagan National Defense Authorization Act of Fiscal Year 2005, the U.S. Nuclear Regulatory Commission (NRC) is also a reviewing agency for the PAs. Pursuant to the Act, the NRC also has a continuing role to monitor disposal actions to assess compliance with stated performance objectives. The Liquid Waste PA program at SRS represents a continual process over the life of the disposal and closure operations. When the need for a PA or PA revision is identified, the first step is to develop a conceptual model to best represent the facility conditions. The conceptual model includes physical dimensions of the closed system, both the engineered and natural systems, and modeling input parameters associated with the modeled features, both initial values (at the time of facility closure) and degradation rates/values. During the development of the PA, evaluations are conducted to reflect not only the results associated with the best available information at the time but also to evaluate potential uncertainties and sensitivities associated with the modeled system. While the PA reflects the modeled system results from the best available information, it also identifies areas for future work to reduce overall PA uncertainties moving forward. DOE requires a PA Maintenance Program such that work continues to reduce model uncertainties, thus bolstering confidence in PA results that support regulatory decisions. This maintenance work may include new research and development activities or modeling informed by previous PA results and other new information that becomes available. As new information becomes available, it is evaluated against previous PAs, and appropriate actions are taken to ensure continued confidence in the regulatory decisions. Therefore, the PA program is a continual process that is not just the development of a PA but seeks to incorporate new information to reduce overall model uncertainty and provide continuing confidence in regulatory decisions.

  2. Finding optimal vaccination strategies under parameter uncertainty using stochastic programming.

    PubMed

    Tanner, Matthew W; Sattenspiel, Lisa; Ntaimo, Lewis

    2008-10-01

    We present a stochastic programming framework for finding the optimal vaccination policy for controlling infectious disease epidemics under parameter uncertainty. Stochastic programming is a popular framework for including the effects of parameter uncertainty in a mathematical optimization model. The problem is initially formulated to find the minimum cost vaccination policy under a chance constraint, which requires that the probability that R(*) ...
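    A minimal sketch of a chance-constrained vaccination problem in this spirit, using a sample (scenario) approximation in Python: the R0 distribution, vaccine efficacy, reliability level, and the single coverage decision variable are all invented for illustration and simplify the paper's formulation considerably.

        import numpy as np

        rng = np.random.default_rng(7)
        R0 = rng.lognormal(mean=np.log(1.8), sigma=0.2, size=10_000)  # uncertain R0
        efficacy, alpha = 0.9, 0.95     # assumed vaccine efficacy and reliability

        # Effective reproduction number under coverage v: R* = R0*(1 - efficacy*v).
        # Scan the scalar decision variable for the cheapest (smallest) coverage
        # whose chance constraint P(R* < 1) >= alpha holds across the samples.
        for v in np.linspace(0.0, 1.0, 1001):
            if np.mean(R0 * (1.0 - efficacy * v) < 1.0) >= alpha:
                print(f"minimum coverage with P(R* < 1) >= {alpha}: v = {v:.3f}")
                break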

  3. Technology Assessment Requirements for Programs and Projects

    NASA Technical Reports Server (NTRS)

    Bilbro, James W.

    2006-01-01

    Program/project uncertainty can most simply be defined as the unpredictability of its outcome. As might be expected, the degree of uncertainty depends substantially on program/project type. For high-tech programs/projects, uncertainty all too frequently translates into schedule slips, cost overruns, and occasionally even cancellations or failures. The root cause of such events is often attributed to inadequate definition of requirements. If such were indeed the root cause, then correcting the situation would simply be a matter of requiring better requirements definition; but since history seems frequently to repeat itself, this must not be the case - at least not in total. There are in fact many contributors to schedule slips, cost overruns, project cancellations and failures, among them lack of adequate requirements definition. The case can be made, however, that many of these contributors are related to the degree of uncertainty at the outset of the project, and further, that a dominant factor in the degree of uncertainty is the maturity of the technology required to bring the project to fruition. This presentation discusses the concept of relating degrees of uncertainty to Technology Readiness Levels (TRL) and their associated Advancement Degree of Difficulty (AD2) levels. It also briefly describes a quantifiable process to establish the appropriate TRL for a given technology and to quantify, through the AD2, what is required to move it from its current TRL to the desired TRL in order to reduce risk and maximize the likelihood of successfully infusing the technology.

  4. Uncertainty quantification of surface-water/groundwater exchange estimates in large wetland systems using Python

    NASA Astrophysics Data System (ADS)

    Hughes, J. D.; Metz, P. A.

    2014-12-01

    Most watershed studies include observation-based water budget analyses to develop first-order estimates of significant flow terms. Surface-water/groundwater (SWGW) exchange is typically assumed to be equal to the residual of the sum of inflows and outflows in a watershed. These estimates of SWGW exchange, however, are highly uncertain as a result of the propagation of uncertainty inherent in the calculation or processing of the other terms of the water budget, such as stage-area-volume relations, and uncertainties associated with land-cover based evapotranspiration (ET) rate estimates. Furthermore, the uncertainty of estimated SWGW exchanges can be magnified in large wetland systems that transition from dry to wet during wet periods. Although it is well understood that observation-based estimates of SWGW exchange are uncertain, it is uncommon for the uncertainty of these estimates to be directly quantified. High-level programming languages like Python can greatly reduce the effort required to (1) quantify the uncertainty of estimated SWGW exchange in large wetland systems and (2) evaluate how different approaches for partitioning land-cover data in a watershed may affect the water-budget uncertainty. We have used Python with the NumPy, SciPy.stats, and pyDOE packages to implement an unconstrained Monte Carlo approach with Latin Hypercube sampling to quantify the uncertainty of monthly estimates of SWGW exchange in the Floral City watershed of the Tsala Apopka wetland system in west-central Florida, USA. Possible sources of uncertainty in the water budget analysis include rainfall, ET, canal discharge, and land/bathymetric surface elevations. Each of these input variables was assigned a probability distribution based on observation error or spanning the range of probable values. The Monte Carlo integration process exposes the uncertainties in land-cover based ET rate estimates as the dominant contributor to the uncertainty in SWGW exchange estimates. We will discuss the uncertainty of SWGW exchange estimates using an ET model that partitions the watershed into open water and wetland land-cover types. We will also discuss the uncertainty of SWGW exchange estimates calculated using ET models partitioned into additional land-cover types.
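    The sampling scheme described above can be reproduced in miniature. The sketch below draws a Latin Hypercube design with pyDOE (the package named in the abstract) and propagates invented error distributions for three budget terms to the SWGW residual; all magnitudes and distributions are illustrative, not the Floral City values.

        import numpy as np
        from scipy.stats import norm, uniform
        from pyDOE import lhs   # Latin Hypercube sampling

        N = 5000
        design = lhs(3, samples=N)   # unit-hypercube design, one column per term

        # Transform columns to assumed monthly error distributions (mm, invented):
        rain = norm(loc=120.0, scale=10.0).ppf(design[:, 0])    # gauge error
        et = uniform(loc=80.0, scale=40.0).ppf(design[:, 1])    # land-cover ET range
        canal = norm(loc=15.0, scale=3.0).ppf(design[:, 2])     # rating error

        storage_change = 20.0   # treated as known here for brevity
        # SWGW exchange as the water-budget residual; its spread is the
        # propagated uncertainty of the other terms.
        swgw = storage_change - (rain - et - canal)
        print(np.percentile(swgw, [2.5, 50.0, 97.5]))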

  5. Methods for handling uncertainty within pharmaceutical funding decisions

    NASA Astrophysics Data System (ADS)

    Stevenson, Matt; Tappenden, Paul; Squires, Hazel

    2014-01-01

    This article provides a position statement regarding decision making under uncertainty within the economic evaluation of pharmaceuticals, with a particular focus upon the National Institute for Health and Clinical Excellence context within England and Wales. This area is of importance as funding agencies have a finite budget from which to purchase a selection of competing health care interventions. The objective function generally used is that of maximising societal health, with an explicit acknowledgement that there will be opportunity costs associated with purchasing a particular intervention. Three components of uncertainty are discussed within a pharmaceutical funding perspective: methodological uncertainty, parameter uncertainty and structural uncertainty, alongside a discussion of challenges that are particularly pertinent to health economic evaluation. The discipline has focused primarily on handling methodological and parameter uncertainty, and a clear reference case has been developed for consistency across evaluations. However, uncertainties still remain. Less attention has been given to methods for handling structural uncertainty. The lack of adequate methods to explicitly incorporate this aspect of model development may result in the true uncertainty surrounding health care investment decisions being underestimated. Research in this area is ongoing, as reviewed here.

  6. Best Practices of Uncertainty Estimation for the National Solar Radiation Database (NSRDB 1998-2015): Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron M; Sengupta, Manajit

    It is essential to apply a traceable and standard approach to determine the uncertainty of solar resource data. Solar resource data are used for all phases of solar energy conversion projects, from the conceptual phase to routine solar power plant operation, and to determine performance guarantees of solar energy conversion systems. These guarantees are based on the available solar resource derived from a measurement station or a modeled data set such as the National Solar Radiation Database (NSRDB). Therefore, quantifying the uncertainty of these data sets provides confidence to financiers, developers, and site operators of solar energy conversion systems and ultimately reduces deployment costs. In this study, we implemented the Guide to the Expression of Uncertainty in Measurement (GUM) to quantify the overall uncertainty of the NSRDB data. First, we quantify measurement uncertainty; then we determine each uncertainty statistic of the NSRDB data and combine them using the root-sum-of-squares method. The statistics were derived by comparing the NSRDB data to seven measurement stations from the National Oceanic and Atmospheric Administration's Surface Radiation Budget Network, the National Renewable Energy Laboratory's Solar Radiation Research Laboratory, and the Atmospheric Radiation Measurement program's Southern Great Plains Central Facility in Billings, Oklahoma. The evaluation was conducted for hourly values, daily totals, monthly mean daily totals, and annual mean monthly mean daily totals. Varying the time averages helps capture the temporal uncertainty of the specific modeled solar resource data required for each phase of a solar energy project; some phases require higher temporal resolution than others. Overall, by including the uncertainty of ground-station measurements of solar radiation, bias, and root mean square error, the NSRDB data demonstrated an expanded uncertainty of 17%-29% on an hourly basis and approximately 5%-8% on an annual basis.
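    The root-sum-of-squares combination at the heart of the GUM procedure fits in a few lines. The sketch below is a minimal illustration; the three component values are invented, not NSRDB statistics.

        import math

        def combined_uncertainty(components, k=2.0):
            """Root-sum-of-squares combination of independent standard
            uncertainties (GUM), expanded with coverage factor k (~95%)."""
            u_c = math.sqrt(sum(u * u for u in components))
            return u_c, k * u_c

        # Illustrative percent standard uncertainties for an hourly irradiance
        # estimate: ground-measurement uncertainty, model bias, random error.
        u_c, U = combined_uncertainty([2.5, 4.0, 7.0])
        print(f"combined {u_c:.1f}%, expanded (k=2) {U:.1f}%")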

  7. Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design

    NASA Technical Reports Server (NTRS)

    Kuguoglu, Latife; Ludwiczak, Damian

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, we evaluated the structural design of the JIMO Heat Rejection Subsystem (HRS). An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel, letting the sensitivity information identify the important parameters. The probabilistic analysis methods illustrated here support this objective. A probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed, in which the panel's structural performance was assessed in the presence of uncertainties in the loading, fabrication process variables, and material properties. The stress and displacement contours of the deterministic structural analysis at mean probability were computed and the results presented. This was followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel structural performance. Based on uncertainties in material properties, structural geometry, and loading, the results of the displacement and stress analysis were used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses - such as maximum displacement, maximum tensile and compressive stresses of the facesheet in the x and y directions, and maximum von Mises stresses of the tube - to the loading and design variables was determined under the boundary condition in which all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel were identified.

  8. [A correlational study on uncertainty, mastery and appraisal of uncertainty in hospitalized children's mothers].

    PubMed

    Yoo, Kyung Hee

    2007-06-01

    This study was conducted to investigate the correlations among uncertainty, mastery, and appraisal of uncertainty in mothers of hospitalized children. Self-report questionnaires were used to measure the variables: uncertainty, mastery, and appraisal of uncertainty. In the data analysis, the SPSSWIN 12.0 program was utilized for descriptive statistics, Pearson's correlation coefficients, and regression analysis. Reliability of the instruments was Cronbach's alpha = .84-.94. Mastery correlated negatively with uncertainty (r = -.444, p = .000) and with danger appraisal of uncertainty (r = -.514, p = .000). In the regression on danger appraisal of uncertainty, uncertainty and mastery were significant predictors, explaining 39.9% of the variance. Mastery was a significant mediating factor between uncertainty and danger appraisal of uncertainty in mothers of hospitalized children. Therefore, nursing interventions that improve mastery should be developed for these mothers.

  9. Limits, discovery and cut optimization for a Poisson process with uncertainty in background and signal efficiency: TRolke 2.0

    NASA Astrophysics Data System (ADS)

    Lundberg, J.; Conrad, J.; Rolke, W.; Lopez, A.

    2010-03-01

    A C++ class was written for the calculation of frequentist confidence intervals using the profile likelihood method. Seven combinations of Poissonian, Gaussian, and Binomial uncertainties are implemented. The package provides routines for the calculation of upper and lower limits, sensitivity, and related properties. It also supports hypothesis tests which take uncertainties into account. It can be used in compiled C++ code, in Python, or interactively via the ROOT analysis framework.
    Program summary
    Program title: TRolke version 2.0
    Catalogue identifier: AEFT_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFT_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: MIT license
    No. of lines in distributed program, including test data, etc.: 3431
    No. of bytes in distributed program, including test data, etc.: 21 789
    Distribution format: tar.gz
    Programming language: ISO C++
    Computer: Unix, GNU/Linux, Mac
    Operating system: Linux 2.6 (Scientific Linux 4 and 5, Ubuntu 8.10), Darwin 9.0 (Mac-OS X 10.5.8)
    RAM: ~20 MB
    Classification: 14.13
    External routines: ROOT (http://root.cern.ch/drupal/)
    Nature of problem: Calculate a frequentist confidence interval on the parameter of a Poisson process with statistical or systematic uncertainties in signal efficiency or background.
    Solution method: Profile likelihood method, analytical
    Running time: <10 seconds per extracted limit
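    A minimal Python sketch of the profile likelihood construction TRolke implements, for one of its model combinations (Poisson signal plus background, with a Gaussian uncertainty on the background estimate): the counts, uncertainties, and confidence level are invented, and the real class handles parameter boundaries and the other model combinations.

        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.stats import norm, poisson

        n_obs, b_meas, sigma_b = 10, 3.0, 1.0   # observed counts, background estimate

        def nll(s, b):
            # -ln L: Poisson(n; s+b) times Gaussian measurement of the background
            return -(poisson.logpmf(n_obs, s + b) + norm.logpdf(b_meas, b, sigma_b))

        def profile_nll(s):
            # profile out the nuisance parameter b at fixed signal s
            res = minimize_scalar(lambda b: nll(s, b), bounds=(1e-9, 30.0),
                                  method="bounded")
            return res.fun

        s_grid = np.linspace(0.0, 25.0, 501)
        pnll = np.array([profile_nll(s) for s in s_grid])
        t = 2.0 * (pnll - pnll.min())      # profile likelihood ratio statistic
        inside = s_grid[t < 2.706]         # chi^2(1) cut for a 90% CL interval
        print(f"90% CL interval for s: [{inside.min():.2f}, {inside.max():.2f}]")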

  10. Optimization under variability and uncertainty: a case study for NOx emissions control for a gasification system.

    PubMed

    Chen, Jianjun; Frey, H Christopher

    2004-12-15

    Methods for optimization of process technologies considering the distinction between variability and uncertainty are developed and applied to case studies of NOx control for Integrated Gasification Combined Cycle systems. Existing methods of stochastic optimization (SO) and stochastic programming (SP) are demonstrated. A comparison of SO and SP results provides the value of collecting additional information to reduce uncertainty. For example, an expected annual benefit of 240,000 dollars is estimated if uncertainty can be reduced before a final design is chosen. SO and SP are typically applied to uncertainty. However, when applied to variability, the benefit of dynamic process control is obtained. For example, an annual savings of 1 million dollars could be achieved if the system is adjusted to changes in process conditions. When variability and uncertainty are treated distinctively, a coupled stochastic optimization and programming method and a two-dimensional stochastic programming method are demonstrated via a case study. For the case study, the mean annual benefit of dynamic process control is estimated to be 700,000 dollars, with a 95% confidence range of 500,000 dollars to 940,000 dollars. These methods are expected to be of greatest utility for problems involving a large commitment of resources, for which small differences in designs can produce large cost savings.
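    The SO-versus-SP comparison that yields the value of information is, in textbook form, the expected value of perfect information (EVPI): the gap between hedging one design against all scenarios and tailoring the design to each revealed scenario. A toy Python sketch of that pattern follows, with an invented cost function and scenario set rather than the paper's IGCC model.

        import numpy as np

        # Uncertain process condition theta, three equally likely scenarios
        thetas = np.array([0.8, 1.0, 1.3])
        cost = lambda x, th: 50.0 * x + 200.0 * th / x   # capital + operating (toy)
        xs = np.linspace(0.5, 4.0, 500)                  # candidate design sizes

        # Stochastic programming: one design hedged against all scenarios
        sp_cost = min(np.mean([cost(x, th) for th in thetas]) for x in xs)
        # Perfect information: choose the design after theta is revealed
        pi_cost = np.mean([min(cost(x, th) for x in xs) for th in thetas])
        print(f"EVPI = {sp_cost - pi_cost:.2f} (benefit of reducing uncertainty)")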

  11. Evaluation of the cost effectiveness of the 1983 stream-gaging program in Kansas

    USGS Publications Warehouse

    Medina, K.D.; Geiger, C.O.

    1984-01-01

    The results of an evaluation of the cost effectiveness of the 1983 stream-gaging program in Kansas are documented. Data uses and funding sources were identified for the 140 complete-record streamflow-gaging stations operated in Kansas during 1983 with a budget of $793,780. As a result of the evaluation of the needs and uses of data from the stream-gaging program, it was found that all 140 gaging stations were needed to meet these data requirements. The average standard error of estimation of streamflow records was 20.8 percent, assuming the 1983 budget and operating schedule of 6-week-interval visitations, based on 85 of the 140 stations. It was shown that this overall level of accuracy could be improved to 18.9 percent by altering the 1983 schedule of station visitations. A minimum budget of $760,000, with a corresponding average error of estimation of 24.9 percent, is required to operate the 1983 program. None of the stations investigated were suitable for the application of alternative methods for simulating discharge records. Improved instrumentation can have a very positive impact on streamflow uncertainties by reducing lost record. (USGS)

  12. Inexact nonlinear improved fuzzy chance-constrained programming model for irrigation water management under uncertainty

    NASA Astrophysics Data System (ADS)

    Zhang, Chenglong; Zhang, Fan; Guo, Shanshan; Liu, Xiao; Guo, Ping

    2018-01-01

    An inexact nonlinear mλ-measure fuzzy chance-constrained programming (INMFCCP) model is developed for irrigation water allocation under uncertainty. Techniques of inexact quadratic programming (IQP), the mλ-measure, and fuzzy chance-constrained programming (FCCP) are integrated into a general optimization framework. The INMFCCP model can deal not only with nonlinearities in the objective function, but also with uncertainties presented as discrete intervals in the objective function, variables, and left-hand-side constraints, and with fuzziness in the right-hand-side constraints. Moreover, this model improves upon conventional fuzzy chance-constrained programming by introducing a linear combination of possibility measure and necessity measure with varying preference parameters. To demonstrate its applicability, the model is applied to a case study in the middle reaches of the Heihe River Basin, northwest China. An interval regression analysis method is used to obtain interval crop water production functions for the whole growth period under uncertainty, so that more flexible solutions can be generated for optimal irrigation water allocation. The variation of results can be examined by specifying different confidence levels and preference parameters, and the model can reflect interrelationships among system benefits, preference parameters, confidence levels, and the corresponding risk levels. Comparison between interval crop water production functions and deterministic ones based on the developed INMFCCP model indicates that the former is capable of reflecting more complexities and uncertainties in practical application. These results can provide a more reliable scientific basis for supporting irrigation water management in arid areas.

  13. The NIST Simple Guide for Evaluating and Expressing Measurement Uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio

    2016-11-01

    NIST has recently published guidance on the evaluation and expression of the uncertainty of NIST measurement results [1, 2], supplementing but not replacing B. N. Taylor and C. E. Kuyatt's (1994) Guidelines for Evaluating and Expressing the Uncertainty of NIST Measurement Results (NIST Technical Note 1297) [3], which tracks closely the Guide to the expression of uncertainty in measurement (GUM) [4], originally published in 1995 by the Joint Committee for Guides in Metrology of the International Bureau of Weights and Measures (BIPM). The scope of this Simple Guide, however, is much broader than the scope of both NIST Technical Note 1297 and the GUM, because it attempts to address several of the uncertainty evaluation challenges that have arisen at NIST since the 1990s, for example in molecular biology, greenhouse gas and climate science measurements, and forensic science. The Simple Guide also expands the scope of those two other guidance documents by recognizing observation equations (that is, statistical models) as bona fide measurement models. These models are indispensable for reducing data from interlaboratory studies, for combining measurement results for the same measurand obtained by different methods, and for characterizing the uncertainty of calibration and analysis functions used in the measurement of force, temperature, or composition of gas mixtures. This presentation reviews the salient aspects of the Simple Guide, illustrates the use of models and methods for uncertainty evaluation not contemplated in the GUM, and demonstrates the NIST Uncertainty Machine [5] and the NIST Consensus Builder, web-based applications accessible worldwide that facilitate evaluations of measurement uncertainty and the characterization of consensus values in interlaboratory studies.

  14. Evaluating data worth for ground-water management under uncertainty

    USGS Publications Warehouse

    Wagner, B.J.

    1999-01-01

    A decision framework is presented for assessing the value of ground-water sampling within the context of ground-water management under uncertainty. The framework couples two optimization models - a chance-constrained ground-water management model and an integer-programming sampling network design model - to identify optimal pumping and sampling strategies. The methodology consists of four steps: (1) the optimal ground-water management strategy for the present level of model uncertainty is determined using the chance-constrained management model; (2) for a specified data collection budget, the monitoring network design model identifies, prior to data collection, the sampling strategy that will minimize model uncertainty; (3) the optimal ground-water management strategy is recalculated on the basis of the projected model uncertainty after sampling; and (4) the worth of the monitoring strategy is assessed by comparing the value of the sample information - i.e., the projected reduction in management costs - with the cost of data collection. Steps 2-4 are repeated for a series of data collection budgets, producing a suite of management/monitoring alternatives, from which the best alternative can be selected. A hypothetical example demonstrates the methodology's ability to identify the ground-water sampling strategy with the greatest net economic benefit for ground-water management.
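    The four-step loop lends itself to a compact sketch. The Python toy below stands in for the methodology only: both functions are invented placeholders for the chance-constrained management model (steps 1 and 3) and the network-design model (step 2), and step 4 nets the projected savings against the data cost.

        import numpy as np

        def management_cost(sigma):            # steps 1 and 3: optimal cost rises
            return 100.0 + 400.0 * sigma       # with parameter uncertainty sigma

        def projected_sigma(sigma0, budget):   # step 2: best network for the budget
            return sigma0 / np.sqrt(1.0 + 0.5 * budget)

        sigma0 = 1.0
        for budget in [0.0, 2.0, 5.0, 10.0, 20.0]:
            savings = management_cost(sigma0) - management_cost(
                projected_sigma(sigma0, budget))
            print(f"budget {budget:5.1f}: net benefit {savings - budget:8.2f}")

        # The budget with the largest net benefit identifies the monitoring
        # alternative worth adopting (step 4).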

  15. Performance and Reliability Optimization for Aerospace Systems subject to Uncertainty and Degradation

    NASA Technical Reports Server (NTRS)

    Miller, David W.; Uebelhart, Scott A.; Blaurock, Carl

    2004-01-01

    This report summarizes work performed by the Space Systems Laboratory (SSL) for NASA Langley Research Center in the field of performance optimization for systems subject to uncertainty. The objective of the research is to develop design methods and tools for the aerospace vehicle design process that take into account lifecycle uncertainties. It recognizes that uncertainty between the predictions of integrated models and data collected from the system in its operational environment is unavoidable. Given the presence of uncertainty, the goal of this work is to develop means of identifying critical sources of uncertainty and to combine these with the analytical tools used in integrated modeling. In this manner, system uncertainty analysis becomes part of the design process and can motivate redesign. The specific program objectives were: (1) to incorporate uncertainty modeling, propagation, and analysis into the integrated (controls, structures, payloads, disturbances, etc.) design process to derive the error bars associated with performance predictions; and (2) to apply modern optimization tools to guide the expenditure of funds in a way that most cost-effectively improves the lifecycle productivity of the system by enhancing subsystem reliability and redundancy. The results from the second program objective are reported separately; this report describes the work and results for the first objective: uncertainty modeling, propagation, and synthesis with integrated modeling.

  16. The law (and politics) of safe injection facilities in the United States.

    PubMed

    Beletsky, Leo; Davis, Corey S; Anderson, Evan; Burris, Scott

    2008-02-01

    Safe injection facilities (SIFs) have shown promise in reducing harms and social costs associated with injection drug use. Favorable evaluations elsewhere have raised the issue of their implementation in the United States. Recognizing that laws shape health interventions targeting drug users, we analyzed the legal environment for publicly authorized SIFs in the United States. Although states and some municipalities have the power to authorize SIFs under state law, federal authorities could still interfere with these facilities under the Controlled Substances Act. A state- or locally-authorized SIF could proceed free of legal uncertainty only if federal authorities explicitly authorized it or decided not to interfere. Given legal uncertainty, and the similar experience with syringe exchange programs, we recommend a process of sustained health research, strategic advocacy, and political deliberation.

  17. Dispersion analysis for baseline reference mission 1. [flight simulation and trajectory analysis for space shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Kuhn, A. E.

    1975-01-01

    A dispersion analysis considering 3 sigma uncertainties (or perturbations) in platform, vehicle, and environmental parameters was performed for the baseline reference mission (BRM) 1 of the space shuttle orbiter. The dispersion analysis is based on the nominal trajectory for the BRM 1. State vector and performance dispersions (or variations) which result from the indicated 3 sigma uncertainties were studied. The dispersions were determined at major mission events and fixed times from lift-off (time slices) and the results will be used to evaluate the capability of the vehicle to perform the mission within a 3 sigma level of confidence and to determine flight performance reserves. A computer program is given that was used for dynamic flight simulations of the space shuttle orbiter.

  18. Optical depth measurements by shadow-band radiometers and their uncertainties.

    PubMed

    Alexandrov, Mikhail D; Kiedron, Peter; Michalsky, Joseph J; Hodges, Gary; Flynn, Connor J; Lacis, Andrew A

    2007-11-20

    Shadow-band radiometers in general, and especially the Multi-Filter Rotating Shadow-band Radiometer (MFRSR), are widely used for atmospheric optical depth measurements. The major programs running MFRSR networks in the United States include the Department of Energy Atmospheric Radiation Measurement (ARM) Program, U.S. Department of Agriculture UV-B Monitoring and Research Program, National Oceanic and Atmospheric Administration Surface Radiation (SURFRAD) Network, and NASA Solar Irradiance Research Network (SIRN). We discuss a number of technical issues specific to shadow-band radiometers and their impact on the optical depth measurements. These problems include instrument tilt and misalignment, as well as some data processing artifacts. Techniques for data evaluation and automatic detection of some of these problems are described.

  19. Engineering design of a high-temperature superconductor current lead

    NASA Astrophysics Data System (ADS)

    Niemann, R. C.; Cha, Y. S.; Hull, J. R.; Daugherty, M. A.; Buckles, W. E.

    As part of the US Department of Energy's Superconductivity Pilot Center Program, Argonne National Laboratory and Superconductivity, Inc., are developing high-temperature superconductor (HTS) current leads suitable for application to superconducting magnetic energy storage systems. The principal objective of the development program is to design, construct, and evaluate the performance of HTS current leads suitable for near-term applications. Supporting objectives are to (1) develop performance criteria; (2) develop a detailed design; (3) analyze performance; (4) gain manufacturing experience in the areas of materials and components procurement, fabrication and assembly, quality assurance, and cost; (5) measure performance of critical components and the overall assembly; (6) identify design uncertainties and develop a program for their study; and (7) develop application-acceptance criteria.

  20. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    NASA Astrophysics Data System (ADS)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.

  1. Space Trajectory Error Analysis Program (STEAP) for halo orbit missions. Volume 1: Analytic and user's manual

    NASA Technical Reports Server (NTRS)

    Byrnes, D. V.; Carney, P. C.; Underwood, J. W.; Vogt, E. D.

    1974-01-01

    Development, test, conversion, and documentation of computer software for the mission analysis of missions to halo orbits about libration points in the Earth-Sun system is reported. The software, consisting of two programs called NOMNAL and ERRAN, is part of the Space Trajectories Error Analysis Programs (STEAP). The program NOMNAL targets a transfer trajectory from Earth on a given launch date to a specified halo orbit on a required arrival date. Either impulsive or finite-thrust insertion maneuvers into halo orbit are permitted by the program. The transfer trajectory is consistent with a realistic launch profile input by the user. The second program, ERRAN, conducts error analyses of the targeted transfer trajectory. Measurements including range, Doppler, star-planet angles, and apparent planet diameter are processed in a Kalman-Schmidt filter to determine the trajectory knowledge uncertainty. Execution errors at injection, midcourse correction, and orbit insertion maneuvers are analyzed along with the navigation uncertainty to determine trajectory control uncertainties and fuel-sizing requirements. The program is also capable of generalized covariance analyses.

  2. Health promotion in schools: a multi-method evaluation of an Australian School Youth Health Nurse Program.

    PubMed

    Banfield, Michelle; McGorm, Kelly; Sargent, Ginny

    2015-01-01

    Health promotion provides a key opportunity to empower young people to make informed choices regarding key health-related behaviours such as tobacco and alcohol use, sexual practices, dietary choices and physical activity. This paper describes the evaluation of a pilot School Youth Health Nurse (SYHN) Program, which aims to integrate a Registered Nurse into school communities to deliver health promotion through group education and individual sessions. The evaluation was guided by the RE-AIM (reach, effectiveness, adoption, implementation, maintenance) framework. The objectives were to explore: 1) whether the Program was accessible to the high school students; 2) the impacts of the Program on key stakeholders; 3) which factors affected adoption of the Program; 4) whether implementation was consistent with the Program intent; and 5) the long-term sustainability of the Program. Research included retrospective analysis of Program records, administration of a survey of student experiences and interviews with 38 stakeholders. This evaluation provided evidence that the SYHN Program is reaching students in need, is effective, has been adopted successfully in schools, is being implemented as intended and could be maintained with sustained funding. The nurses deliver an accessible and acceptable primary health care service, focused on health promotion, prevention and early intervention. After some initial uncertainty about the scope and nature of the role, the nurses are a respected source of health information in the schools, consulted on curriculum development and contributing to whole-of-school health activities. Findings demonstrate that the SYHN model is feasible and acceptable to the students and schools involved in the pilot. The Program provides health promotion and accessible primary health care in the school setting, consistent with the Health Promoting Schools framework.

  3. University of Arizona High Energy Physics Program at the Cosmic Frontier 2014-2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abate, Alex; Cheu, Elliott

    This is the final technical report from the University of Arizona High Energy Physics program at the Cosmic Frontier covering the period 2014-2016. The work aims to advance the understanding of dark energy using the Large Synoptic Survey Telescope (LSST). Progress on the engineering design of the power supplies for the LSST camera is discussed. A variety of contributions to photometric redshift measurement uncertainties were studied. The effect of the intergalactic medium on the photometric redshift of very distant galaxies was evaluated. Computer code was developed realizing the full chain of calculations needed to accurately and efficiently run large-scale simulations.

  4. Life cycle cost optimization of biofuel supply chains under uncertainties based on interval linear programming.

    PubMed

    Ren, Jingzheng; Dong, Liang; Sun, Lu; Goodsite, Michael Evan; Tan, Shiyu; Dong, Lichun

    2015-01-01

    The aim of this work was to develop a model for optimizing the life cycle cost of a biofuel supply chain under uncertainties. Multiple agriculture zones, multiple transportation modes for the transport of grain and biofuel, multiple biofuel plants, and multiple market centers were considered in this model, and the prices of the resources, the yield of grain, and the market demands were regarded as interval numbers instead of constants. An interval linear programming model was developed, and a method for solving interval linear programs was presented. An illustrative case was studied using the proposed model, and the results showed that the proposed model is feasible for designing a biofuel supply chain under uncertainties. Copyright © 2015 Elsevier Ltd. All rights reserved.
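    The interval treatment can be illustrated with the standard two-sub-model device for interval linear programs: solve the LP once under the most favorable bounds and once under the least favorable bounds, and report the objective as an interval. A toy sketch with scipy follows (coefficients invented, far smaller than the paper's supply-chain model).

        import numpy as np
        from scipy.optimize import linprog

        # Minimize c.x with interval costs c in [c_lo, c_hi], subject to meeting
        # an interval demand x1 + x2 >= d (d in [d_lo, d_hi]), x >= 0.
        c_lo, c_hi = np.array([2.0, 3.0]), np.array([3.0, 4.5])   # $/t (toy)
        A = np.array([[-1.0, -1.0]])           # -(x1 + x2) <= -d encodes demand
        d_lo, d_hi = 80.0, 100.0

        best = linprog(c_lo, A_ub=A, b_ub=[-d_lo], bounds=[(0, None)] * 2)
        worst = linprog(c_hi, A_ub=A, b_ub=[-d_hi], bounds=[(0, None)] * 2)
        print(f"life cycle cost interval: [{best.fun:.1f}, {worst.fun:.1f}]")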

  5. Evaluation of Spatial Uncertainties In Modeling of Cadastral Systems

    NASA Astrophysics Data System (ADS)

    Fathi, Morteza; Teymurian, Farideh

    2013-04-01

    Cadastre plays an essential role in sustainable development, especially in developing countries like Iran. A well-developed cadastre results in a transparent estate tax system, transparent estate data, fewer court actions, and effective management of estates, natural resources, and the environment. A multipurpose cadastre, through the gathering of other related data, plays a vital role in civil, economic, and social programs and projects. Iran has been implementing a cadastre for many years, but success in this program depends on correct geometric and descriptive data for estates. Since there are various sources of data with different accuracy and precision in Iran, difficulties and uncertainties exist in modeling the geometric part of the cadastre, such as inconsistencies between the data in deeds and the cadastral map, which cause problems in the execution of the cadastre and can result in the loss of national and natural resources and of citizens' rights. At present there is no uniform and effective technical method for resolving such conflicts. This article describes various aspects of such conflicts in the geometric part of the cadastre and suggests a solution based on GIS modeling tools.

  6. Forecasting eruption size: what we know, what we don't know

    NASA Astrophysics Data System (ADS)

    Papale, Paolo

    2017-04-01

    Any eruption forecast includes an evaluation of the expected size of the forthcoming eruption, usually expressed as the probability associated to given size classes. Such evaluation is mostly based on the previous volcanic history at the specific volcano, or it is referred to a broader class of volcanoes constituting "analogues" of the one under specific consideration. In any case, use of knowledge from past eruptions implies considering the completeness of the reference catalogue, and most importantly, the existence of systematic biases in the catalogue, that may affect probability estimates and translate into biased volcanic hazard forecasts. An analysis of existing catalogues, with major reference to the catalogue from the Smithsonian Global Volcanism Program, suggests that systematic biases largely dominate at global, regional and local scale: volcanic histories reconstructed at individual volcanoes, often used as a reference for volcanic hazard forecasts, are the result of systematic loss of information with time and poor sample representativeness. That situation strictly requires the use of techniques to complete existing catalogues, as well as careful consideration of the uncertainties deriving from inadequate knowledge and model-dependent data elaboration. A reconstructed global eruption size distribution, obtained by merging information from different existing catalogues, shows a mode in the VEI 1-2 range, <0.1% incidence of eruptions with VEI 7 or larger, and substantial uncertainties associated with individual VEI frequencies. Even larger uncertainties are expected to derive from application to individual volcanoes or classes of analogue volcanoes, suggesting large to very large uncertainties associated to volcanic hazard forecasts virtually at any individual volcano worldwide.

  7. AN OVERVIEW OF THE UNCERTAINTY ANALYSIS, SENSITIVITY ANALYSIS, AND PARAMETER ESTIMATION (UA/SA/PE) API AND HOW TO IMPLEMENT IT

    EPA Science Inventory

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) (also known as Calibration, Optimization and Sensitivity and Uncertainty (CUSO)) was developed in a joint effort between several members of both ...

  8. A generalized fuzzy linear programming approach for environmental management problem under uncertainty.

    PubMed

    Fan, Yurui; Huang, Guohe; Veawab, Amornvadee

    2012-01-01

    In this study, a generalized fuzzy linear programming (GFLP) method was developed to deal with uncertainties expressed as fuzzy sets that exist in the constraints and objective function. A stepwise interactive algorithm (SIA) was advanced to solve the GFLP model and generate solutions expressed as fuzzy sets. To demonstrate its application, the developed GFLP method was applied to a regional sulfur dioxide (SO2) control planning model to identify effective SO2 mitigation policies with a minimized system performance cost under uncertainty. The results obtained represent the amounts of SO2 allocated to different control measures from different sources. Compared with the conventional interval-parameter linear programming (ILP) approach, the solutions obtained through GFLP were expressed as fuzzy sets, which can provide intervals for the decision variables and objective function, as well as the related possibilities. Therefore, decision makers can make a tradeoff between model stability and plausibility based on the solutions obtained through GFLP, and then identify desired policies for SO2-emission control under uncertainty.

  9. An inexact multistage fuzzy-stochastic programming for regional electric power system management constrained by environmental quality.

    PubMed

    Fu, Zhenghui; Wang, Han; Lu, Wentao; Guo, Huaicheng; Li, Wei

    2017-12-01

    Electric power systems involve different fields and disciplines, spanning the economic, energy, and environmental systems, and uncertainty is inherent in such a compound system. Therefore, an inexact multistage fuzzy-stochastic programming (IMFSP) model was developed for regional electric power system management constrained by environmental quality. A model combining interval-parameter programming, multistage stochastic programming, and fuzzy probability distributions was built to reflect the uncertain information and dynamic variation in the case study, and scenarios under different credibility degrees were considered. For all scenarios under consideration, corrective actions were allowed to be taken dynamically in accordance with the pre-regulated policies and the uncertainties in reality. The results suggest that the methodology is applicable for handling the uncertainty of regional electric power management systems and can help decision makers establish an effective development plan.

  10. Space Shuttle Orbiter flight heating rate measurement sensitivity to thermal protection system uncertainties

    NASA Technical Reports Server (NTRS)

    Bradley, P. F.; Throckmorton, D. A.

    1981-01-01

    A study was completed to determine the sensitivity of computed convective heating rates to uncertainties in the thermal protection system thermal model. The parameters considered were: density, thermal conductivity, and specific heat of both the reusable surface insulation and its coating; coating thickness and emittance; and temperature measurement uncertainty. The assessment used a modified version of a computer program that calculates heating rates from temperature time histories. The original version of the program solves the direct one-dimensional heating problem; the modified version solves the inverse problem and was used in thermocouple data reduction for Shuttle flight data. Both nominal and altered thermal models were used to determine how accurately the thermal protection system's material thermal properties must be known. For many thermal properties, the sensitivity (the inaccuracy introduced into the calculated convective heating rate by an altered property) was very low.
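    As a much-simplified stand-in for the inverse calculation described above, the sketch below uses a thin-skin energy balance (stored energy plus reradiation) instead of the full one-dimensional conduction model, and shows the sensitivity pattern: recover the heating rate from a synthetic temperature history, then repeat with a perturbed specific heat. All property values and the temperature history are invented for illustration.

        import numpy as np

        SB = 5.670e-8    # Stefan-Boltzmann constant, W/m^2-K^4
        rho, c, L, eps = 150.0, 1200.0, 0.05, 0.85   # illustrative TPS properties

        def heating_rate(T, t, c_val):
            """Thin-skin inverse energy balance: convective heating recovered
            from a measured temperature history (storage + reradiation)."""
            dTdt = np.gradient(T, t)
            return rho * c_val * L * dTdt + eps * SB * T**4

        t = np.linspace(0.0, 100.0, 101)                # s
        T = 300.0 + 600.0 * (1.0 - np.exp(-t / 40.0))   # synthetic history, K

        q_nom = heating_rate(T, t, c)
        q_pert = heating_rate(T, t, 1.10 * c)           # +10% specific-heat error
        err = 100.0 * np.max(np.abs(q_pert - q_nom) / q_nom)
        print(f"max heating-rate error from a 10% c uncertainty: {err:.1f}%")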

  11. Communicating spatial uncertainty to non-experts using R

    NASA Astrophysics Data System (ADS)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically, and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey of a group of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables; both allow for displaying maps with information about the ensemble mean, variance/standard deviation, and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class as well as its associated probability. The interactive methods included a graphical user interface, which, in addition to displaying the previously mentioned variables, also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the plotting functions evaluated in the survey were collated into the R package. The static visualisations were implemented via calls to the 'ggplot2' package, giving the user control over the content, legend, colours, axes, and titles. The interactive methods were implemented using the 'shiny' package, allowing users to activate the visualisation of statistical descriptions of uncertainty through interaction with a plotted map of means. This research brings uncertainty visualisation to a broader audience through the development of tools for visualising uncertainty using open source software.

  12. Modeling Cervical Cancer Prevention in Developed Countries

    PubMed Central

    Kim, Jane J.; Brisson, Marc; Edmunds, W. John; Goldie, Sue J.

    2009-01-01

    Cytology-based screening has reduced cervical cancer mortality in countries able to implement, sustain and financially support organized programs that achieve broad coverage. These ongoing secondary prevention efforts considerably complicate the question of whether vaccination against Human Papillomavirus (HPV) types 16 and 18 should be introduced. Policy questions focus primarily on the target ages of vaccination, appropriate ages for a temporary “catch-up” program, possible revisions in screening policies to optimize synergies with vaccination, including the increased use of HPV DNA testing, and the inclusion of boys in the vaccination program. Decision-analytic models are increasingly being developed to simulate disease burden and interventions in different settings in order to evaluate the benefits and cost-effectiveness of primary and secondary interventions for informed decision-making. This article is a focused review of existing mathematical models that have been used to evaluate HPV vaccination in the context of developed countries with existing screening programs. Despite variations in model assumptions and uncertainty in existing data, pre-adolescent vaccination of girls is consistently found to be attractive in the context of current screening practices, provided there is complete and lifelong vaccine protection and widespread vaccination coverage. Questions related to catch-up vaccination programs, potential benefits of other non-cervical cancer outcomes, and inclusion of boys are subject to far more uncertainty, and analyses of these questions have reached conflicting conclusions. Most analyses find that some catch-up vaccination is warranted but becomes increasingly unattractive as the catch-up age is extended, and that vaccination of boys is unlikely to be cost-effective if reasonable levels of coverage are achieved in girls or if coverage among girls can be improved. The objective of the review is to highlight points of consensus and qualitative themes, to discuss areas of divergent findings, and to provide insight into critical decisions related to cervical cancer prevention. PMID:18847560

  13. Economic evaluation of a weight control program with e-mail and telephone counseling among overweight employees: a randomized controlled trial

    PubMed Central

    2012-01-01

    Background Distance lifestyle counseling for weight control is a promising public health intervention in the work setting. Information about the cost-effectiveness of such interventions is lacking but necessary to make informed implementation decisions. The purpose of this study was to perform an economic evaluation, from a societal perspective and with a two-year time horizon, of a six-month lifestyle counseling program aimed at weight reduction in an overweight working population. Methods A randomized controlled trial was conducted comparing a program with two modes of intervention delivery against self-help. A total of 1386 employees from seven companies participated (67% male, mean age 43 (SD 8.6) years, mean BMI 29.6 (SD 3.5) kg/m2). All groups received self-directed lifestyle brochures. The two intervention groups additionally received a workbook-based program with phone counseling (phone; n=462) or a web-based program with e-mail counseling (internet; n=464). Body weight was measured at baseline and 24 months after baseline. Quality of life (EuroQol-5D) was assessed at baseline and at 6, 12, 18 and 24 months after baseline. Resource use was measured with six-monthly diaries and valued with Dutch standard costs. Missing data were multiply imputed. Uncertainty around differences in costs and incremental cost-effectiveness ratios was estimated by applying non-parametric bootstrapping techniques and graphically plotting the results in cost-effectiveness planes and cost-effectiveness acceptability curves. Results At two years the incremental cost-effectiveness ratio was €1009/kg weight loss in the phone group and €16/kg weight loss in the internet group. The cost-utility analysis resulted in €245,243 per quality-adjusted life-year (QALY) and €1337/QALY, respectively. The results from a complete-case analysis were slightly more favorable. However, there was considerable uncertainty around all outcomes. Conclusions Neither intervention mode was proven to be cost-effective compared to self-help. Trial registration ISRCTN04265725 PMID:22967224
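
    As a hedged illustration of the bootstrap procedure described above (simulated data, not the trial's), the following sketch resamples participants to put an interval around an incremental cost-effectiveness ratio:

    ```python
    # Nonparametric bootstrap around an ICER; all data here are simulated.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical per-participant costs (EUR) and effects (kg weight loss).
    cost_int, eff_int = rng.normal(400, 150, 460), rng.normal(1.5, 2.0, 460)
    cost_ctl, eff_ctl = rng.normal(150, 100, 460), rng.normal(1.2, 2.0, 460)

    def icer(ci, ei, cc, ec):
        return (ci.mean() - cc.mean()) / (ei.mean() - ec.mean())

    boot = []
    for _ in range(2000):
        i = rng.integers(0, len(cost_int), len(cost_int))  # resample intervention arm
        j = rng.integers(0, len(cost_ctl), len(cost_ctl))  # resample control arm
        boot.append(icer(cost_int[i], eff_int[i], cost_ctl[j], eff_ctl[j]))

    boot = np.array(boot)
    print(f"ICER: {icer(cost_int, eff_int, cost_ctl, eff_ctl):.0f} EUR/kg")
    print(f"95% bootstrap interval: {np.percentile(boot, [2.5, 97.5])}")
    # Plotting the bootstrapped (delta-cost, delta-effect) pairs gives the
    # cost-effectiveness plane; note percentile intervals for a ratio become
    # unstable when the effect difference can cross zero.
    ```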

  14. Comparison of linear and nonlinear programming approaches for "worst case dose" and "minmax" robust optimization of intensity-modulated proton therapy dose distributions.

    PubMed

    Zaghian, Maryam; Cao, Wenhua; Liu, Wei; Kardar, Laleh; Randeniya, Sharmalee; Mohan, Radhe; Lim, Gino

    2017-03-01

    Robust optimization of intensity-modulated proton therapy (IMPT) takes uncertainties into account during spot weight optimization and leads to dose distributions that are resilient to uncertainties. Previous studies demonstrated benefits of linear programming (LP) for IMPT in terms of delivery efficiency by considerably reducing the number of spots required for the same quality of plans. However, a reduction in the number of spots may lead to loss of robustness. The purpose of this study was to evaluate and compare the plan quality and robustness of two robust optimization approaches using LP and nonlinear programming (NLP) models. The so-called "worst case dose" and "minmax" robust optimization approaches and the conventional planning target volume (PTV)-based optimization approach were applied to design IMPT plans for five patients: two with prostate cancer, one with skull-base cancer, and two with head and neck cancer. For each approach, both LP and NLP models were used. Thus, for each case, six sets of IMPT plans were generated and assessed: LP-PTV-based, NLP-PTV-based, LP-worst case dose, NLP-worst case dose, LP-minmax, and NLP-minmax. The four robust optimization methods behaved differently from patient to patient, and no method emerged as superior to the others in terms of nominal plan quality and robustness against uncertainties. The plans generated using LP-based robust optimization were more robust regarding patient setup and range uncertainties than were those generated using NLP-based robust optimization for the prostate cancer patients. However, the robustness of plans generated using NLP-based methods was superior for the skull-base and head and neck cancer patients. Overall, LP-based methods were suitable for the less challenging cancer cases, in which all uncertainty scenarios could satisfy tight dose constraints, while NLP-based methods performed better in more difficult cases, in which tight dose limits were hard to meet under most uncertainty scenarios. For robust optimization, the worst case dose approach was less sensitive to uncertainties than was the minmax approach for the prostate and skull-base cancer patients, whereas the minmax approach was superior for the head and neck cancer patients. The robustness of the IMPT plans was remarkably better after robust optimization than after PTV-based optimization, and the NLP-PTV-based optimization outperformed the LP-PTV-based optimization regarding robustness of clinical target volume coverage. In addition, plans generated using LP-based methods had notably fewer scanning spots than did those generated using NLP-based methods. © 2017 The Authors. Journal of Applied Clinical Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
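
    For readers unfamiliar with the two formulations, they can be sketched as follows (the notation is assumed here, not taken from the paper): minmax optimizes the objective under the single worst uncertainty scenario, while the worst-case dose approach first composes a voxelwise worst-case dose distribution and then optimizes that composite.

    ```latex
    % Minmax: optimize the objective F under the worst scenario s,
    % for spot-weight vector w >= 0 and scenarios s = 1,...,S.
    \min_{w \ge 0} \; \max_{s \in \{1,\dots,S\}} \; F\bigl(d(w;s)\bigr)

    % Worst-case dose: build a voxelwise worst-case distribution first
    % (minimum over scenarios in the target, maximum in normal tissue),
    % then optimize it; d_i is the dose to voxel i.
    d_i^{\mathrm{wc}}(w) =
      \begin{cases}
        \min_{s} \, d_i(w;s), & i \in \text{target} \\
        \max_{s} \, d_i(w;s), & i \in \text{normal tissue}
      \end{cases}
    \qquad
    \min_{w \ge 0} \; F\bigl(d^{\mathrm{wc}}(w)\bigr)
    ```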

  15. A systematic uncertainty analysis of an evaluative fate and exposure model.

    PubMed

    Hertwich, E G; McKone, T E; Pease, W S

    2000-08-01

    Multimedia fate and exposure models are widely used to regulate the release of toxic chemicals, to set cleanup standards for contaminated sites, and to evaluate emissions in life-cycle assessment. CalTOX, one of these models, is used to calculate the potential dose, an outcome that is combined with the toxicity of the chemical to determine the Human Toxicity Potential (HTP), used to aggregate and compare emissions. The comprehensive assessment of the uncertainty in the potential dose calculation in this article provides the information necessary to evaluate the reliability of decisions based on the HTP. A framework for uncertainty analysis in multimedia risk assessment is proposed and evaluated for four types of uncertainty. Parameter uncertainty is assessed through Monte Carlo analysis. The variability in landscape parameters is assessed through a comparison of potential dose calculations for different regions in the United States. Decision-rule uncertainty is explored through a comparison of the HTP values under open and closed system boundaries. Model uncertainty is evaluated through two case studies, one using alternative formulations for calculating the plant concentration and the other testing the steady-state assumption for wet deposition. This investigation shows that the steady-state assumption for the removal of chemicals from the atmosphere is not appropriate and results in an underestimate of the potential dose for 25% of the 336 chemicals evaluated.

  16. Destructive analysis capabilities for plutonium and uranium characterization at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tandon, Lav; Kuhn, Kevin J; Drake, Lawrence R

    Los Alamos National Laboratory's (LANL) Actinide Analytical Chemistry (AAC) group has been in existence since the Manhattan Project. It maintains a complete set of analytical capabilities for performing complete characterization (elemental assay, isotopic composition, and metallic and nonmetallic trace impurities) of uranium and plutonium samples in different forms. For a majority of customers there are strict quality assurance (QA) and quality control (QC) objectives, requiring high accuracy and precision with well-defined uncertainties associated with the analytical results. Los Alamos participates in various international and national programs, such as the Plutonium Metal Exchange Program, New Brunswick Laboratory's (NBL's) Safeguards Measurement Evaluation Program (SME), and several other inter-laboratory round-robin exercises, to monitor and evaluate the data quality generated by AAC. These programs also provide independent verification of analytical measurement capabilities and allow any technical problems with analytical measurements to be identified and corrected. This presentation will focus on key analytical capabilities for destructive analysis in AAC and also on comparative data between LANL and peer groups for Pu assay and isotopic analysis.

  17. Using social constructionist thinking in training social workers living and working under threat of political violence.

    PubMed

    Shamai, Michal

    2003-10-01

    This article describes and analyzes an intervention program with social workers living and working in a situation of uncertainty created by political violence such as war and terrorism. The author used a social constructionist perspective as the theoretical framework, emphasizing the effect of the social and political context in constructing the experience and recognizing the personal and professional knowledge acquired in daily experience. Qualitative methods were used to evaluate the process and outcome. The narrative-holistic analysis focused on reconstructing meaning and adapting it to the new situation, the main thrust of the program. Four main themes emerged from the thematic analysis: (1) loss as a result of political violence; (2) the meaning of strength and weakness in situations of political violence; (3) preparation for terrorist attacks; and (4) the definition of a safe place. The outcome evaluation describes what this kind of training program meant to the participants. The specific context of the training program is discussed, as well as possibilities for using it in other contexts.

  18. Building model analysis applications with the Joint Universal Parameter IdenTification and Evaluation of Reliability (JUPITER) API

    USGS Publications Warehouse

    Banta, E.R.; Hill, M.C.; Poeter, E.; Doherty, J.E.; Babendreier, J.

    2008-01-01

    The open-source, public domain JUPITER (Joint Universal Parameter IdenTification and Evaluation of Reliability) API (Application Programming Interface) provides conventions and Fortran-90 modules to develop applications (computer programs) for analyzing process models. The input and output conventions allow application users to access various applications and the analysis methods they embody with a minimum of time and effort. Process models simulate, for example, physical, chemical, and (or) biological systems of interest using phenomenological, theoretical, or heuristic approaches. The types of model analyses supported by the JUPITER API include, but are not limited to, sensitivity analysis, data needs assessment, calibration, uncertainty analysis, model discrimination, and optimization. The advantages provided by the JUPITER API for users and programmers allow for rapid programming and testing of new ideas. Application-specific coding can be in languages other than the Fortran-90 of the API. This article briefly describes the capabilities and utility of the JUPITER API, lists existing applications, and uses UCODE_2005 as an example.

  19. [Evaluation of possibility of using new financial instruments for supporting biomedical projects].

    PubMed

    Starodubov, V I; Kurakova, N G; Eremchenko, O A; Tsvetkova, L A; Zinov, V G

    2014-01-01

    An analysis was performed of the criteria used to select projects from Russian medical research centers for funding by the Russian Science Foundation and the federal program "Research and Innovations." A high degree of uncertainty was noted in concepts such as "priority direction," "applied" and "exploratory" research, and "industrial partner" as applied to biomedical research. An analysis of the "Medicine and health care" section of the "Forecast of Scientific-Technological Development of the Russian Federation until 2030" was also completed.

  20. Uncertainty in stormwater drainage adaptation: what matters and how much is too much?

    NASA Astrophysics Data System (ADS)

    Stack, L. J.; Simpson, M. H.; Moore, T.; Gulliver, J. S.; Roseen, R.; Eberhart, L.; Smith, J. B.; Gruber, J.; Yetka, L.; Wood, R.; Lawson, C.

    2014-12-01

    Published research continues to report that long-term, local-scale precipitation forecasts are too uncertain to support local-scale adaptation. Numerous studies quantify the range of uncertainty in downscaled model output; compare this with uncertainty from other sources such as hydrological modeling; and propose circumventing uncertainty via "soft" or "low regret" actions, or adaptive management. Yet non-structural adaptations alone are likely insufficient. Structural adaptation requires quantified engineering design specifications. However, the literature does not define a tolerable level of uncertainty. Without such a benchmark, how can we determine whether the climate-change-cognizant design specifications that we are capable of, for example the climate change factors increasingly utilized in European practice, are viable? The presentation will explore this question in the context of reporting results and observations from an ongoing ten-year program assessing local-scale stormwater drainage system vulnerabilities, required capacities, and adaptation options and costs. This program has studied stormwater systems of varying complexity in a variety of regions, topographies, and levels of urbanization in northern New England and the upper Midwestern United States. These studies demonstrate the feasibility of local-scale design specifications, and provide tangible information on risk to enable valid cost/benefit decisions. The research program has found that stormwater planners and engineers have routinely accepted, in the normal course of professional practice, a level of uncertainty in hydrological modeling comparable to that in long-term precipitation projections. Moreover, the ability to quantify required capacity and related construction costs for specific climate change scenarios, the insensitivity of capacity and costs to uncertainty, and the percentage of pipes and culverts that never require upsizing all serve to limit the impact of uncertainty inherent in climate change projections.

  1. US Efforts in Support of Examinations at Fukushima Daiichi – 2016 Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amway, P.; Andrews, N.; Bixby, Willis

    Although it is clear that the accident signatures from each unit at the Fukushima Daiichi Nuclear Power Station (NPS) [Daiichi] differ, much is not known about the end-state of core materials within these units. Some of this uncertainty can be attributed to a lack of information related to cooling system operation and cooling water injection. There is also uncertainty in our understanding of phenomena affecting: a) in-vessel core damage progression during severe accidents in boiling water reactors (BWRs), and b) accident progression after vessel failure (ex-vessel progression) for BWRs and Pressurized Water Reactors (PWRs). These uncertainties arise due to limited full-scale prototypic data. Similar to what occurred after the accident at Three Mile Island Unit 2, these Daiichi units offer the international community a means to reduce such uncertainties by obtaining prototypic data from multiple full-scale BWR severe accidents. Information obtained from Daiichi is required to inform decontamination and decommissioning (D&D) activities, improving the ability of the Tokyo Electric Power Company Holdings (TEPCO) to characterize potential hazards and to ensure the safety of workers involved with cleanup activities. This document reports recent results from the US Forensics Effort to use information obtained by TEPCO to enhance the safety of existing and future nuclear power plant designs. This Forensics Effort, which is sponsored by the Reactor Safety Technologies Pathway of the Department of Energy Office of Nuclear Energy Light Water Reactor (LWR) Sustainability Program, consists of a group of US experts in LWR safety and plant operations who have identified examination needs and are evaluating TEPCO information from Daiichi that addresses these needs. Examples presented in this report demonstrate that significant safety insights are being obtained in the areas of component performance, fission product release and transport, debris end-state location, and combustible gas generation and transport. In addition to reducing uncertainties related to severe accident progression modeling, these insights are being used to update guidance for severe accident prevention, mitigation, and emergency planning. Furthermore, reduced uncertainties in modeling the events at Daiichi will improve the realism of reactor safety evaluations and inform future D&D activities by improving the capability for characterizing potential hazards to workers involved with cleanup activities.

  2. Cost effectiveness of the stream-gaging program in North Dakota

    USGS Publications Warehouse

    Ryan, Gerald L.

    1989-01-01

    This report documents the results of a cost-effectiveness study of the stream-gaging program in North Dakota. It is part of a nationwide evaluation of the stream-gaging program of the U.S. Geological Survey. One phase of evaluating cost effectiveness is to identify less costly alternative methods of simulating streamflow records. Statistical or hydrologic flow-routing methods were used as alternative methods to simulate streamflow records for 21 combinations of gaging stations from the 94-gaging-station network. The accuracy of the alternative methods was sufficient to consider discontinuing only one gaging station. Operation of the gaging-station network was evaluated by using the associated uncertainty in streamflow records. The evaluation was limited to the nonwinter operation of 29 gaging stations in eastern North Dakota. The current (1987) travel routes and measurement frequencies require a budget of about $248,000 and result in an average equivalent Gaussian spread in streamflow records of 16.5 percent. Changes in routes and measurement frequencies could optimally reduce the average equivalent Gaussian spread to 14.7 percent. Budgets evaluated ranged from $235,000 to $400,000. A $235,000 budget would increase the optimal average equivalent Gaussian spread from 14.7 to 20.4 percent, and a $400,000 budget could decrease it to 5.8 percent.

  3. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu; Jablonowski, Christopher; Lake, Larry

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
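
    A toy sketch of the comparison (all numbers invented, not the paper's examples) might look like the following, where Monte Carlo evaluation of candidate designs yields the full profit distribution in addition to the optimum:

    ```python
    # Toy one-decision gas-field problem: choose plant capacity x before the
    # uncertain well deliverability q is realized. All numbers are invented.
    import numpy as np

    rng = np.random.default_rng(1)
    scenarios = rng.lognormal(mean=3.0, sigma=0.3, size=1000)  # deliverability q

    def profit(x, q, price=2.0, capex=1.5):
        # Revenue is capped by both plant capacity x and deliverability q.
        return price * np.minimum(x, q) - capex * x

    designs = np.linspace(5.0, 40.0, 200)

    # Monte Carlo optimization: evaluate every candidate design over the sampled
    # scenarios, keeping the full profit distribution rather than just the optimum.
    expected = np.array([profit(x, scenarios).mean() for x in designs])
    x_best = designs[expected.argmax()]
    dist = profit(x_best, scenarios)

    print(f"best capacity ~ {x_best:.1f}")
    print(f"expected profit {dist.mean():.1f}, 5th-95th pct {np.percentile(dist, [5, 95])}")
    # A stochastic program would instead encode the scenarios in a single
    # scenario-weighted optimization problem and solve it directly, which is
    # typically faster but returns only the optimal design.
    ```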

  4. Novel health economic evaluation of a vaccination strategy to prevent HPV-related diseases: the BEST study.

    PubMed

    Favato, Giampiero; Baio, Gianluca; Capone, Alessandro; Marcellusi, Andrea; Costa, Silvano; Garganese, Giorgia; Picardo, Mauro; Drummond, Mike; Jonsson, Bengt; Scambia, Giovanni; Zweifel, Peter; Mennini, Francesco S

    2012-12-01

    The development of human papillomavirus (HPV)-related diseases is not perfectly understood, and the uncertainties associated with commonly utilized probabilistic models must be considered. This study assessed the cost-effectiveness of a quadrivalent-based multicohort HPV vaccination strategy within a Bayesian framework. A full Bayesian multicohort Markov model was used, in which all unknown quantities were associated with suitable probability distributions reflecting the state of currently available knowledge. These distributions were informed by observed data or expert opinion. The model cycle lasted 1 year, whereas the follow-up time horizon was 90 years. Precancerous cervical lesions, cervical cancers, and anogenital warts were considered as outcomes. The base case scenario (2 cohorts of girls aged 12 and 15 y) and other multicohort vaccination strategies (additional cohorts aged 18 and 25 y) were cost-effective, with a discounted cost per quality-adjusted life-year gained of €12,013, €13,232, and €15,890 for vaccination programs based on 2, 3, and 4 cohorts, respectively. With multicohort vaccination strategies, the reduction in the number of HPV-related events occurred earlier (range, 3.8-6.4 y) than with a single cohort. The analysis of the expected value of information showed that the results of the model were subject to limited uncertainty (cost per patient = €12.6). This methodological approach is designed to incorporate the uncertainty associated with HPV vaccination. Modeling the cost-effectiveness of a multicohort vaccination program with Bayesian statistics confirmed the value for money of quadrivalent-based HPV vaccination. The expected value of information gave the most appropriate and feasible representation of the true value of this program.
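
    As a hedged sketch of the model class (states, transition probabilities, and utilities invented here, not the study's calibrated values), a Markov cohort model of this kind reduces to repeated matrix-vector products:

    ```python
    # Minimal Markov cohort sketch of the kind used in HPV cost-effectiveness
    # models; states and one-year transition probabilities are invented.
    import numpy as np

    states = ["well", "precancer", "cancer", "dead"]
    # Rows: from-state; columns: to-state (annual transition probabilities).
    P = np.array([
        [0.980, 0.015, 0.000, 0.005],
        [0.300, 0.650, 0.045, 0.005],
        [0.000, 0.000, 0.900, 0.100],
        [0.000, 0.000, 0.000, 1.000],
    ])
    assert np.allclose(P.sum(axis=1), 1.0)

    cohort = np.array([1.0, 0.0, 0.0, 0.0])      # everyone starts well
    qaly_weight = np.array([1.0, 0.9, 0.6, 0.0])  # assumed utilities per state
    discount, total_qalys = 0.03, 0.0

    for year in range(90):  # the study's 90-year horizon
        total_qalys += (cohort @ qaly_weight) / (1 + discount) ** year
        cohort = cohort @ P

    print(f"discounted QALYs per person: {total_qalys:.2f}")
    # A Bayesian version draws P and qaly_weight from probability distributions
    # and repeats the run, yielding a distribution over cost and QALY outcomes.
    ```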

  5. MODEL UNCERTAINTY ANALYSIS, FIELD DATA COLLECTION AND ANALYSIS OF CONTAMINATED VAPOR INTRUSION INTO BUILDINGS

    EPA Science Inventory

    To address uncertainty associated with the evaluation of vapor intrusion problems, we are working on a three-part strategy that includes: evaluation of uncertainty in model-based assessments; collection of field data; and assessment of sites using EPA and state protocols.

  6. Uncertainty Evaluation of Residential Central Air-conditioning Test System

    NASA Astrophysics Data System (ADS)

    Li, Haoxue

    2018-04-01

    According to national standards, property tests of air-conditioning units are required. However, test results can be influenced by the precision of the apparatus or by measurement errors. Therefore, an uncertainty evaluation of the property tests should be conducted. In this paper, uncertainties are calculated for the property tests of a Xinfei 13.6 kW residential central air-conditioning unit. The evaluation result shows that the property tests are credible.

  7. Assessing the Responses of Streamflow to Pollution Release in South Carolina

    NASA Astrophysics Data System (ADS)

    Maze, G.; Chovancak, N. A.; Samadi, S. Z.

    2017-12-01

    The purpose of this investigation was to examine the effects of various stream flows on the transport of a pollutant downstream and to evaluate the uncertainty associated with using a single stream flow value in the model when the true flow is unknown. The area used for this study was Horse Creek in South Carolina, where a chlorine spill occurred in the past as a result of a train derailment in Graniteville, SC. In that event, chlorine gas was released into the environment, where it killed plants, contaminated groundwater, and caused evacuation of the city. Tracking the movement and concentrations at various points downstream in the river system is crucial to understanding how a single accidental pollutant release can affect the surrounding areas. Because of the lack of real-time data, this emergency response model uses historical monthly averages; however, these monthly averages do not reflect how widely the flow can vary within a month. The assumption of historical monthly average flow data may therefore not be accurate, and this investigation aims to quantify the uncertainty associated with using a single stream flow value when the true stream flow may vary greatly. For the purpose of this investigation, the Graniteville event was used as a case study to evaluate the emergency response model. The investigation was conducted by adjusting the STREAM II V7 program developed by Savannah River National Laboratory (SRNL) to model a confluence of the Horse Creek and Savannah River systems. This adjusted program was used to track the progress of the chlorine release and examine how it was transported downstream. The concentrations and travel times to various points downstream of the release were obtained and can be used not only to analyze this particular release in Graniteville but also, with further adjustment, as a technical tool for emergency responders in future accidents. Further, the program was run with monthly maximum, minimum, and average advective flows, and an uncertainty analysis was conducted to examine the error associated with the input data. These results underscore the profound influence that streamflow magnitudes (maximum, minimum, and average) have on shaping downstream water quality.
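
    A back-of-envelope sketch shows why the assumed flow matters so much; the geometry, flows, and release mass below are invented (not Horse Creek data), and dispersion and decay are ignored:

    ```python
    # Pure-advection sketch: arrival time and slug concentration at a downstream
    # receptor under three assumed flows. All inputs are invented for illustration.
    reach_km = 30.0      # distance to the downstream receptor
    area_m2 = 120.0      # assumed mean channel cross-section
    mass_kg = 5000.0     # assumed pollutant mass, released over one hour

    for label, q in [("monthly min", 8.0), ("monthly avg", 25.0), ("monthly max", 80.0)]:
        velocity = q / area_m2                        # advective velocity, m/s
        travel_h = reach_km * 1000 / velocity / 3600  # arrival time, hours
        conc = mass_kg / (q * 3600) * 1000            # fully mixed slug, g/m^3
        print(f"{label:11s} Q={q:5.1f} m3/s  arrival {travel_h:6.1f} h  ~{conc:6.1f} g/m3")
    ```

    Even in this crude form, the spread between the minimum- and maximum-flow cases spans roughly an order of magnitude in both arrival time and concentration, which is the uncertainty the investigation set out to quantify.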

  8. An integrated bi-level optimization model for air quality management of Beijing's energy system under uncertainty.

    PubMed

    Jin, S W; Li, Y P; Nie, S

    2018-05-15

    In this study, an interval chance-constrained bi-level programming (ICBP) method is developed for air quality management of a municipal energy system under uncertainty. ICBP can deal with uncertainties presented as interval values and probability distributions, as well as examine the risk of violating constraints. Besides, a leader-follower decision strategy is incorporated into the optimization process, in which two decision makers with different goals and preferences are involved. To solve the proposed model, a bi-level interactive algorithm based on satisfactory degree is introduced into the decision-making process. Then, an ICBP-based energy and environmental systems (ICBP-EES) model is formulated for Beijing, in which the air quality index (AQI) is used for evaluating the integrated air quality of multiple pollutants. Result analysis can help different stakeholders adjust their tolerances to achieve overall satisfaction of EES planning for the study city. Results reveal that natural gas is the main source for electricity generation and heating, which could lead to a potential increase in imported energy for Beijing in the future. Results also disclose that PM10 is the major contributor to AQI. These findings can help decision makers identify desired alternatives for EES planning and provide useful information for regional air quality management under uncertainty. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. What Risk Assessments of Genetically Modified Organisms Can Learn from Institutional Analyses of Public Health Risks

    PubMed Central

    Rajan, S. Ravi; Letourneau, Deborah K.

    2012-01-01

    The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large. PMID:23193357

  10. What risk assessments of genetically modified organisms can learn from institutional analyses of public health risks.

    PubMed

    Rajan, S Ravi; Letourneau, Deborah K

    2012-01-01

    The risks of genetically modified organisms (GMOs) are evaluated traditionally by combining hazard identification and exposure estimates to provide decision support for regulatory agencies. We question the utility of the classical risk paradigm and discuss its evolution in GMO risk assessment. First, we consider the problem of uncertainty, by comparing risk assessment for environmental toxins in the public health domain with genetically modified organisms in the environment; we use the specific comparison of an insecticide to a transgenic, insecticidal food crop. Next, we examine normal accident theory (NAT) as a heuristic to consider runaway effects of GMOs, such as negative community level consequences of gene flow from transgenic, insecticidal crops. These examples illustrate how risk assessments are made more complex and contentious by both their inherent uncertainty and the inevitability of failure beyond expectation in complex systems. We emphasize the value of conducting decision-support research, embracing uncertainty, increasing transparency, and building interdisciplinary institutions that can address the complex interactions between ecosystems and society. In particular, we argue against black boxing risk analysis, and for a program to educate policy makers about uncertainty and complexity, so that eventually, decision making is not the burden that falls upon scientists but is assumed by the public at large.

  11. Conservation in the face of climate change: The roles of alternative models, monitoring, and adaptation in confronting and reducing uncertainty

    USGS Publications Warehouse

    Conroy, M.J.; Runge, M.C.; Nichols, J.D.; Stodola, K.W.; Cooper, R.J.

    2011-01-01

    The broad physical and biological principles behind climate change and its potential large-scale ecological impacts on biota are fairly well understood, although the likely responses of biotic communities at fine spatio-temporal scales are not, limiting the ability of conservation programs to respond effectively to climate change outside the range of human experience. Much of the climate debate has focused on attempts to resolve key uncertainties in a hypothesis-testing framework. However, conservation decisions cannot await resolution of these scientific issues and instead must proceed in the face of uncertainty. We suggest that conservation should proceed in an adaptive management framework, in which decisions are guided by predictions under multiple, plausible hypotheses about climate impacts. Under this plan, monitoring is used to evaluate the response of the system to climate drivers, and management actions (perhaps experimental) are used to confront testable predictions with data, in turn providing feedback for future decision making. We illustrate these principles with the problem of mitigating the effects of climate change on terrestrial bird communities in the southern Appalachian Mountains, USA. © 2010 Elsevier Ltd.

  12. Final Report---Optimization Under Nonconvexity and Uncertainty: Algorithms and Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Linderoth

    2011-11-06

    The goal of this work was to develop new algorithmic techniques for solving large-scale numerical optimization problems, focusing on problem classes that have proven to be among the most challenging for practitioners: those involving uncertainty and those involving nonconvexity. This research advanced the state of the art in solving mixed-integer linear programs containing symmetry, mixed-integer nonlinear programs, and stochastic optimization problems. The focus of the work done in the continuation was on mixed-integer nonlinear programs (MINLPs) and mixed-integer linear programs (MILPs), especially those containing a great deal of symmetry.

  13. Uncertainty in BMP evaluation and optimization for watershed management

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially in making watershed management decisions and developing strategies for water quality improvement. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models to optimize the selection and placement of BMPs under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One limitation of currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g., measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed-land-use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used the Soil and Water Assessment Tool (SWAT) to simulate the hydrology and water quality of a mixed-land-use watershed located in the Midwestern USA. The SWAT model was also used to represent the various BMPs needed to improve water quality in the watershed. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use, and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT-simulated crop yields. Considerable uncertainties in net cost and water quality improvements resulted from uncertainties in land use, climate change, and model parameter values.

  14. Evaluation of measurement uncertainty of glucose in clinical chemistry.

    PubMed

    Berçik Inal, B; Koldas, M; Inal, H; Coskun, C; Gümüs, A; Döventas, Y

    2007-04-01

    The definition of uncertainty of measurement used in the International Vocabulary of Basic and General Terms in Metrology (VIM) is a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand. Uncertainty of measurement comprises many components. In addition, a measurement uncertainty should accompany every reported value from an accredited institution; this value indicates the reliability of the measurement. The GUM, published by NIST, contains guidance on uncertainty evaluation. Eurachem/CITAC Guide CG4 was published by the Eurachem/CITAC Working Group in 2000. Both offer a mathematical model by which uncertainty can be calculated. There are two types of uncertainty evaluation in measurement: type A is the evaluation of uncertainty through statistical analysis, and type B is the evaluation of uncertainty through other means, for example, a certified reference material. The Eurachem Guide uses four types of distribution functions: (1) a rectangular distribution, when a certificate gives limits without specifying a level of confidence (u(x) = a/√3); (2) a triangular distribution, when values are concentrated near the same point (u(x) = a/√6); (3) a normal distribution, in which an uncertainty is given in the form of a standard deviation s, a relative standard deviation s/√n, or a coefficient of variation (CV%) without specifying the distribution (a = certificate value, u = standard uncertainty); and (4) a confidence interval.
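
    A small sketch of the type-B conversions listed above (with invented example values) makes the arithmetic explicit:

    ```python
    # Type-B standard uncertainties from certificate half-width a, combined
    # with a type-A component; the numerical values here are invented.
    import math

    def u_rectangular(a):  # limits given without a confidence level
        return a / math.sqrt(3)

    def u_triangular(a):   # values concentrated near the midpoint
        return a / math.sqrt(6)

    # Example: a calibrator certified as 5.00 +/- 0.12 mmol/L (rectangular),
    # combined with a type-A repeatability of 0.08 mmol/L.
    u_cert = u_rectangular(0.12)
    u_rep = 0.08
    u_combined = math.sqrt(u_cert**2 + u_rep**2)  # assumes independent components
    print(f"u_cert={u_cert:.3f}, combined u={u_combined:.3f}, "
          f"expanded U (k=2) = {2 * u_combined:.3f} mmol/L")
    ```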

  15. A modified F-test for evaluating model performance by including both experimental and simulation uncertainties

    USDA-ARS?s Scientific Manuscript database

    Experimental and simulation uncertainties have not been included in many of the statistics used in assessing agricultural model performance. The objectives of this study were to develop an F-test that can be used to evaluate model performance considering experimental and simulation uncertainties, an...

  16. Satellite Power Systems (SPS) space transportation cost analysis and evaluation

    NASA Technical Reports Server (NTRS)

    1980-01-01

    A picture of Space Power Systems space transportation costs at the present time is given with respect to accuracy as stated, reasonableness of the methods used, assumptions made, and uncertainty associated with the estimates. The approach used consists of examining space transportation costs from several perspectives to perform a variety of sensitivity analyses or reviews and examine the findings in terms of internal consistency and external comparison with analogous systems. These approaches are summarized as a theoretical and historical review including a review of stated and unstated assumptions used to derive the costs, and a performance or technical review. These reviews cover the overall transportation program as well as the individual vehicles proposed. The review of overall cost assumptions is the principal means used for estimating the cost uncertainty derived. The cost estimates used as the best current estimate are included.

  17. R&D for computational cognitive and social models : foundations for model evaluation through verification and validation (final LDRD report).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slepoy, Alexander; Mitchell, Scott A.; Backus, George A.

    2008-09-01

    Sandia National Laboratories is investing in projects that aim to develop computational modeling and simulation applications that explore human cognitive and social phenomena. While some of these modeling and simulation projects are explicitly research oriented, others are intended to support or provide insight for people involved in high consequence decision-making. This raises the issue of how to evaluate computational modeling and simulation applications in both research and applied settings where human behavior is the focus of the model: when is a simulation 'good enough' for the goals its designers want to achieve? In this report, we discuss two years' worth of review and assessment of the ASC program's approach to computational model verification and validation, uncertainty quantification, and decision making. We present a framework that extends the principles of the ASC approach into the area of computational social and cognitive modeling and simulation. In doing so, we argue that the potential for evaluation is a function of how the modeling and simulation software will be used in a particular setting. In making this argument, we move from strict, engineering and physics oriented approaches to V&V to a broader project of model evaluation, which asserts that the systematic, rigorous, and transparent accumulation of evidence about a model's performance under conditions of uncertainty is a reasonable and necessary goal for model evaluation, regardless of discipline. How to achieve the accumulation of evidence in areas outside physics and engineering is a significant research challenge, but one that requires addressing as modeling and simulation tools move out of research laboratories and into the hands of decision makers. This report provides an assessment of our thinking on ASC Verification and Validation, and argues for further extending V&V research in the physical and engineering sciences toward a broader program of model evaluation in situations of high consequence decision-making.

  18. Uncertainty Analysis of Consequence Management (CM) Data Products.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Brian D.; Eckert-Gallup, Aubrey Celia; Cochran, Lainy Dromgoole

    The goal of this project is to develop and execute methods for characterizing uncertainty in data products that are developed and distributed by the DOE Consequence Management (CM) Program. A global approach to this problem is necessary because multiple sources of error and uncertainty from across the CM skill sets contribute to the ultimate production of CM data products. This report presents the methods used to develop a probabilistic framework to characterize this uncertainty and provides results for an uncertainty analysis for a study scenario analyzed using this framework.

  19. Inexact fuzzy-stochastic mixed-integer programming approach for long-term planning of waste management--Part A: methodology.

    PubMed

    Guo, P; Huang, G H

    2009-01-01

    In this study, an inexact fuzzy chance-constrained two-stage mixed-integer linear programming (IFCTIP) approach is proposed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing inexact two-stage programming and mixed-integer linear programming techniques by incorporating multiple uncertainties, expressed as intervals and dual probability distributions, within a general optimization framework. The developed method can provide an effective linkage between the predefined environmental policies and the associated economic implications. Four special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it provides a linkage to predefined policies that have to be respected when a modeling effort is undertaken; secondly, it is useful for tackling uncertainties presented as intervals, probabilities, fuzzy sets, and their combinations; thirdly, it facilitates dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period, multi-level, and multi-option context; fourthly, penalties are exercised with recourse against any infeasibility, which permits in-depth analyses of various policy scenarios that are associated with different levels of economic consequences when the promised solid waste-generation rates are violated. In a companion paper, the developed method is applied to a real case for the long-term planning of waste management in the City of Regina, Canada.

  20. Routine internal- and external-quality control data in clinical laboratories for estimating measurement and diagnostic uncertainty using GUM principles.

    PubMed

    Magnusson, Bertil; Ossowicki, Haakan; Rienitz, Olaf; Theodorsson, Elvar

    2012-05-01

    Healthcare laboratories are increasingly joining into larger laboratory organizations encompassing several physical laboratories. This caters for important new opportunities for re-defining the concept of a 'laboratory' to encompass all laboratories and measurement methods measuring the same measurand for a population of patients. In order to make measurement results comparable, bias should be minimized or eliminated and measurement uncertainty properly evaluated for all methods used for a particular patient population. The measurement as well as the diagnostic uncertainty can be evaluated from internal and external quality control results using GUM principles. In this paper the uncertainty evaluations are described in detail using only two main components, within-laboratory reproducibility and uncertainty of the bias component, according to a Nordtest guideline. The evaluation is exemplified for the determination of creatinine in serum for a conglomerate of laboratories, expressed both in absolute units (μmol/L) and in relative terms (%). An expanded measurement uncertainty of 12 μmol/L was estimated for concentrations of creatinine below 120 μmol/L, and of 10% for concentrations above 120 μmol/L. The diagnostic uncertainty encompasses both measurement uncertainty and biological variation, and can be estimated for a single value and for a difference. The diagnostic uncertainty for the difference between two samples from the same patient was determined to be 14 μmol/L for concentrations of creatinine below 100 μmol/L and 14% for concentrations above 100 μmol/L.
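
    A hedged sketch of the two-component evaluation (invented QC numbers, not the paper's creatinine data) is:

    ```python
    # Nordtest-style combined uncertainty from routine QC data:
    # u_c = sqrt(u_Rw^2 + u_bias^2). All numbers below are invented.
    import math

    u_rw = 2.1  # within-laboratory reproducibility from internal QC, umol/L

    # Bias component from external QA (proficiency testing): RMS of observed
    # biases combined with the uncertainty of the assigned values.
    biases = [1.5, -0.8, 2.0]    # laboratory minus assigned value, umol/L
    u_assigned = 0.9             # mean uncertainty of the assigned values
    rms_bias = math.sqrt(sum(b * b for b in biases) / len(biases))
    u_bias = math.sqrt(rms_bias**2 + u_assigned**2)

    u_c = math.sqrt(u_rw**2 + u_bias**2)  # combined standard uncertainty
    print(f"u(Rw)={u_rw}, u(bias)={u_bias:.2f}, expanded U (k=2) = {2 * u_c:.1f} umol/L")
    ```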

  1. Nuclear Data Activities in Support of the DOE Nuclear Criticality Safety Program

    NASA Astrophysics Data System (ADS)

    Westfall, R. M.; McKnight, R. D.

    2005-05-01

    The DOE Nuclear Criticality Safety Program (NCSP) provides the technical infrastructure maintenance for those technologies applied in the evaluation and performance of safe fissionable-material operations in the DOE complex. These technologies include an Analytical Methods element for neutron transport as well as the development of sensitivity/uncertainty methods, the performance of Critical Experiments, evaluation and qualification of experiments as Benchmarks, and a comprehensive Nuclear Data program coordinated by the NCSP Nuclear Data Advisory Group (NDAG). The NDAG gathers and evaluates differential and integral nuclear data, identifies deficiencies, and recommends priorities on meeting DOE criticality safety needs to the NCSP Criticality Safety Support Group (CSSG). Then the NDAG identifies the required resources and unique capabilities for meeting these needs, not only for performing measurements but also for data evaluation with nuclear model codes as well as for data processing for criticality safety applications. The NDAG coordinates effort with the leadership of the National Nuclear Data Center, the Cross Section Evaluation Working Group (CSEWG), and the Working Party on International Evaluation Cooperation (WPEC) of the OECD/NEA Nuclear Science Committee. The overall objective is to expedite the issuance of new data and methods to the DOE criticality safety user. This paper describes these activities in detail, with examples based upon special studies being performed in support of criticality safety for a variety of DOE operations.

  2. Enhancing the Characterization of Epistemic Uncertainties in PM2.5 Risk Analyses.

    PubMed

    Smith, Anne E; Gans, Will

    2015-03-01

    The Environmental Benefits Mapping and Analysis Program (BenMAP) is a software tool developed by the U.S. Environmental Protection Agency (EPA) that is widely used inside and outside of EPA to produce quantitative estimates of public health risks from fine particulate matter (PM2.5). This article discusses the purpose and appropriate role of a risk analysis tool to support risk management deliberations, and evaluates the functions of BenMAP in this context. It highlights the importance in quantitative risk analyses of characterization of epistemic uncertainty, or outright lack of knowledge, about the true risk relationships being quantified. This article describes and quantitatively illustrates sensitivities of PM2.5 risk estimates to several key forms of epistemic uncertainty that pervade those calculations: the risk coefficient, shape of the risk function, and the relative toxicity of individual PM2.5 constituents. It also summarizes findings from a review of U.S.-based epidemiological evidence regarding the PM2.5 risk coefficient for mortality from long-term exposure. That review shows that the set of risk coefficients embedded in BenMAP substantially understates the range in the literature. We conclude that BenMAP would more usefully fulfill its role as a risk analysis support tool if its functions were extended to better enable and prompt its users to characterize the epistemic uncertainties in their risk calculations. This requires expanded automatic sensitivity analysis functions and more recognition of the full range of uncertainty in risk coefficients. © 2014 Society for Risk Analysis.
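
    Health impact functions of the kind BenMAP evaluates are commonly log-linear; the sketch below (invented inputs, not BenMAP output) shows how re-running the calculation across alternative risk coefficients exposes the epistemic spread:

    ```python
    # Log-linear health impact function of the form commonly used for PM2.5
    # mortality; beta values and population inputs are invented for illustration.
    import math

    def attributable_deaths(beta, delta_c, baseline_rate, population):
        # delta_y = y0 * (1 - exp(-beta * delta_c)) * population
        return baseline_rate * (1 - math.exp(-beta * delta_c)) * population

    delta_c = 2.0          # ug/m3 reduction in annual mean PM2.5
    baseline_rate = 0.008  # annual all-cause deaths per person (assumed)
    population = 1_000_000

    # Epistemic uncertainty: sweep alternative risk coefficients rather than
    # relying on a single embedded value.
    for label, beta in [("low", 0.002), ("mid", 0.006), ("high", 0.015)]:
        deaths = attributable_deaths(beta, delta_c, baseline_rate, population)
        print(f"beta {label}: {deaths:,.0f} deaths avoided")
    ```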

  3. CPLOAS_2 User Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sallaberry, Cedric Jean-Marie; Helton, Jon C.

    2015-05-01

    Weak link (WL)/strong link (SL) systems are important parts of the overall operational design of high-consequence systems. In such designs, the SL system is very robust and is intended to permit operation of the entire system under, and only under, intended conditions. In contrast, the WL system is intended to fail in a predictable and irreversible manner under accident conditions and render the entire system inoperable before an accidental operation of the SL system. The likelihood that the WL system will fail to deactivate the entire system before the SL system fails (i.e., degrades into a configuration that could allow an accidental operation of the entire system) is referred to as the probability of loss of assured safety (PLOAS). This report describes the Fortran 90 program CPLOAS_2, which implements the following representations for PLOAS for situations in which both link physical properties and link failure properties are time-dependent: (i) failure of all SLs before failure of any WL, (ii) failure of any SL before failure of any WL, (iii) failure of all SLs before failure of all WLs, and (iv) failure of any SL before failure of all WLs. The effects of aleatory uncertainty and epistemic uncertainty in the definition and numerical evaluation of PLOAS can be included in the calculations performed by CPLOAS_2. Keywords: Aleatory uncertainty, CPLOAS_2, Epistemic uncertainty, Probability of loss of assured safety, Strong link, Uncertainty analysis, Weak link
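
    One of the four PLOAS definitions can be illustrated with a simple Monte Carlo sketch (invented failure-time distributions stand in for the time-dependent link models; this is not CPLOAS_2 code):

    ```python
    # Monte Carlo sketch of one PLOAS definition handled by CPLOAS_2:
    # "failure of any SL before failure of all WLs".
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000

    # Hypothetical failure times (e.g., minutes into an accident transient).
    wl_times = rng.normal(loc=[10.0, 12.0], scale=1.5, size=(n, 2))  # two weak links
    sl_times = rng.normal(loc=[16.0, 18.0], scale=2.0, size=(n, 2))  # two strong links

    # Loss of assured safety: some SL fails before every WL has failed.
    ploas = np.mean(sl_times.min(axis=1) < wl_times.max(axis=1))
    print(f"estimated PLOAS: {ploas:.4f}")
    # Epistemic uncertainty would be layered on by repeating this aleatory
    # calculation for alternative distribution parameters.
    ```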

  4. Effects of a case management program on patients with oral precancerous lesions: a randomized controlled trial.

    PubMed

    Lin, Hsiu-Ying; Chen, Shu-Ching; Peng, Hsi-Ling; Chen, Mu-Kuan

    2016-01-01

    The aim of this study was to identify the effects of a case management program on knowledge about oral cancer, preventive behavior for oral cancer, and level of uncertainty in patients with oral precancerous lesions. A randomized controlled trial was conducted with two groups, using a pre- and posttest design. The experimental group received a case management program and telephone follow-up sessions; the control group received routine care. Patients were assessed at three time points: first visit to the otolaryngology clinic for biopsy examination (T0), and then at 2 weeks (T1) and 4 weeks (T2) after the biopsy examination. Patients in both groups had significantly higher levels of knowledge about oral cancer and preventive behavior for oral cancer, and a lower level of uncertainty, at T2 compared to T0. At T2, participants in the experimental group had significantly greater knowledge about oral cancer, more preventive behavior for oral cancer, and less uncertainty compared to those in the control group. The case management program with telephone counseling effectively improved knowledge about oral cancer, preventive behavior for oral cancer, and uncertainty levels in patients with oral precancerous lesions in the four weeks after a biopsy examination. The case management program can be applied with positive results to patients receiving different types of cancer screening, including colorectal, breast, and cervical screening.

  5. Clouds and more: ARM climate modeling best estimate data: A new data product for climate studies

    DOE PAGES

    Xie, Shaocheng; McCoy, Renata B.; Klein, Stephen A.; ...

    2010-01-01

    The U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Program (www.arm.gov) was created in 1989 to address scientific uncertainties related to global climate change, with a focus on the crucial role of clouds and their influence on the transfer of radiation in the atmosphere. Here, a central activity is the acquisition of detailed observations of clouds and radiation, as well as related atmospheric variables, for climate model evaluation and improvement.

  6. Uncertainty Assessments of 2D and Axisymmetric Hypersonic Shock Wave - Turbulent Boundary Layer Interaction Simulations at Compression Corners

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Berry, Scott A.; VanNorman, John W.

    2011-01-01

    This paper is one of a series of five papers in a special session organized by the NASA Fundamental Aeronautics Program that addresses uncertainty assessments for CFD simulations in hypersonic flow. Simulations of a shock emanating from a compression corner and interacting with a fully developed turbulent boundary layer are evaluated herein. Mission-relevant conditions at Mach 7 and Mach 14 are defined for a pre-compression ramp of a scramjet-powered vehicle. Three compression angles are defined, the smallest to avoid separation losses and the largest to force a separated flow engaging more complicated flow physics. The Baldwin-Lomax and Cebeci-Smith algebraic models, the one-equation Spalart-Allmaras model with the Catris-Aupoix compressibility modification, and two-equation models including Menter SST, Wilcox k-omega 98, and Wilcox k-omega 06 are evaluated. Each model is fully defined herein to preclude any ambiguity regarding model implementation. Comparisons are made to existing experimental data and Van Driest theory to provide a preliminary assessment of model form uncertainty. A set of coarse-grained uncertainty metrics is defined to capture essential differences among turbulence models. Except for the inability of algebraic models to converge for some separated flows, there is no clearly superior model as judged by these metrics. A preliminary metric for the numerical component of uncertainty in shock-turbulent-boundary-layer interactions at compression corners sufficiently steep to cause separation is defined as 55%. This value is a median of differences with experimental data averaged for peak pressure and heating and for extent of separation captured in new, grid-converged solutions presented here. This value is consistent with existing results in a literature review of hypersonic shock-turbulent-boundary-layer interactions by Roy and Blottner and with more recent computations of MacLean.

  7. SU-E-J-92: Validating Dose Uncertainty Estimates Produced by AUTODIRECT, An Automated Program to Evaluate Deformable Image Registration Accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, H; Chen, J; Pouliot, J

    2015-06-15

    Purpose: Deformable image registration (DIR) is a powerful tool with the potential to deformably map dose from one computed-tomography (CT) image to another. Errors in the DIR, however, will produce errors in the transferred dose distribution. We have proposed a software tool, called AUTODIRECT (automated DIR evaluation of confidence tool), which predicts voxel-specific dose mapping errors on a patient-by-patient basis. This work validates the effectiveness of AUTODIRECT to predict dose mapping errors with virtual and physical phantom datasets. Methods: AUTODIRECT requires 4 inputs: moving and fixed CT images and two noise scans of a water phantom (for noise characterization). Then, AUTODIRECT uses algorithms to generate test deformations and applies them to the moving and fixed images (along with processing) to digitally create sets of test images, with known ground-truth deformations that are similar to the actual one. The clinical DIR algorithm is then applied to these test image sets (currently 4). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student’s t distribution. This work compares these uncertainty estimates to the actual errors made by the Velocity Deformable Multi Pass algorithm on 11 virtual and 1 physical phantom datasets. Results: For 11 of the 12 tests, the predicted dose error distributions from AUTODIRECT are well matched to the actual error distributions, within 1–6% for the 10 virtual phantoms and 9% for the physical phantom. For one of the cases, though, the predictions underestimated the errors in the tail of the distribution. Conclusion: Overall, the AUTODIRECT algorithm performed well on the 12 phantom cases for Velocity and was shown to generate accurate estimates of dose warping uncertainty. AUTODIRECT is able to automatically generate patient-, organ-, and voxel-specific DIR uncertainty estimates. This ability would be useful for patient-specific DIR quality assurance.
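
    The per-voxel uncertainty construction can be sketched as follows; the error values are invented and the interval form is our reading of the abstract, not the tool's documented algorithm:

    ```python
    # Student's-t uncertainty estimate at one voxel from a handful of test
    # registrations with known ground truth; numbers are invented.
    import numpy as np
    from scipy import stats

    # Dose-mapping errors (Gy) observed at one voxel across the 4 test deformations.
    errors = np.array([0.12, -0.05, 0.20, 0.08])
    n = errors.size
    mean, sd = errors.mean(), errors.std(ddof=1)

    # 95% uncertainty interval from a Student's t distribution with n-1 dof.
    t_crit = stats.t.ppf(0.975, df=n - 1)
    half_width = t_crit * sd * np.sqrt(1 + 1 / n)  # prediction-style interval
    print(f"voxel dose error: {mean:.3f} Gy +/- {half_width:.3f} Gy")
    ```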

  8. Traffic engineering and regenerator placement in GMPLS networks with restoration

    NASA Astrophysics Data System (ADS)

    Yetginer, Emre; Karasan, Ezhan

    2002-07-01

    In this paper we study regenerator placement and traffic engineering of restorable paths in Generalized Multiprotocol Label Switching (GMPLS) networks. Regenerators are necessary in optical networks due to transmission impairments. We study a network architecture with regenerators at selected nodes and propose two heuristic algorithms for the regenerator placement problem. The performance of these algorithms, in terms of the required number of regenerators and computational complexity, is evaluated. In this network architecture with sparse regeneration, offline computation of working and restoration paths is studied with bandwidth reservation and path rerouting as the restoration scheme. We study two approaches for selecting working and restoration paths from a set of candidate paths and formulate each method as an Integer Linear Programming (ILP) problem. A traffic uncertainty model is developed in order to compare these methods based on their robustness with respect to changing traffic patterns. The traffic engineering methods are compared based on the number of additional demands due to traffic uncertainty that can be carried. The regenerator placement algorithms are also evaluated from a traffic engineering point of view.
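
    To make the ILP idea concrete, the following sketch poses a toy version of the path-selection problem in PuLP: one working and one link-disjoint restoration path per demand, minimizing reserved bandwidth. All names and data are invented, and the paper's actual model additionally handles link capacities, regenerators, and many demands.

```python
# Illustrative sketch only, not the paper's exact ILP formulation.
import pulp

demands = {"d1": ["p1", "p2", "p3"]}                 # candidate paths per demand
cost = {"p1": 3, "p2": 4, "p3": 5}                   # bandwidth cost of each path
links = {"p1": {"a-b", "b-c"}, "p2": {"a-d", "d-c"}, "p3": {"a-e", "e-c"}}

prob = pulp.LpProblem("path_selection", pulp.LpMinimize)
w = pulp.LpVariable.dicts("work", cost, cat="Binary")   # working-path choice
r = pulp.LpVariable.dicts("rest", cost, cat="Binary")   # restoration-path choice
prob += pulp.lpSum(cost[p] * (w[p] + r[p]) for p in cost)
for d, paths in demands.items():
    prob += pulp.lpSum(w[p] for p in paths) == 1        # exactly one working path
    prob += pulp.lpSum(r[p] for p in paths) == 1        # exactly one restoration path
    for p in paths:
        prob += w[p] + r[p] <= 1                        # not the same path twice
        for q in paths:
            if p != q and links[p] & links[q]:          # shared link -> not disjoint
                prob += w[p] + r[q] <= 1
prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = {p: (pulp.value(w[p]), pulp.value(r[p])) for p in cost}
```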

  9. Space shuttle launch vehicle aerodynamic uncertainties: Lessons learned

    NASA Technical Reports Server (NTRS)

    Hamilton, J. T.

    1983-01-01

    The chronological development and evolution of an uncertainties model which defines the complex interdependency and interaction of the individual Space Shuttle element and component uncertainties for the launch vehicle are presented. Emphasis is placed on user requirements which dictated certain concessions, simplifications, and assumptions in the analytical model. The use of the uncertainty model in the vehicle design process and flight planning support is discussed. The terminology and justification associated with tolerances as opposed to variations are also presented. Comparisons of and conclusions drawn from flight minus predicted data and uncertainties are given. Lessons learned from the Space Shuttle program concerning aerodynamic uncertainties are examined.

  10. Benefit-Cost Analysis of Undergraduate Education Programs: An Example Analysis of the Freshman Research Initiative.

    PubMed

    Walcott, Rebecca L; Corso, Phaedra S; Rodenbusch, Stacia E; Dolan, Erin L

    2018-01-01

    Institutions and administrators regularly have to make difficult choices about how best to invest resources to serve students. Yet economic evaluation, or the systematic analysis of the relationship between costs and outcomes of a program or policy, is relatively uncommon in higher education. This type of evaluation can be an important tool for decision makers considering questions of resource allocation. Our purpose with this essay is to describe methods for conducting one type of economic evaluation, a benefit-cost analysis (BCA), using an example of an existing undergraduate education program, the Freshman Research Initiative (FRI) at the University of Texas at Austin. Our aim is twofold: to demonstrate how to apply BCA methodologies to evaluate an education program and to conduct an economic evaluation of FRI in particular. We explain the steps of BCA, including assessment of costs and benefits, estimation of the benefit-cost ratio, and analysis of uncertainty. We conclude that the university's investment in FRI generates a positive return for students in the form of increased future earning potential. © 2018 R. L. Walcott et al. CBE—Life Sciences Education © 2018 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
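
    A stylized version of the computation, under assumed inputs: discounted lifetime earnings gains per participant are compared with program cost, and Monte Carlo sampling stands in for the uncertainty-analysis step. The dollar figures are illustrative, not FRI's actual estimates.

```python
# Hypothetical benefit-cost ratio with Monte Carlo uncertainty; all numbers invented.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
cost = rng.normal(14_000, 2_000, n)            # program cost per student ($)
annual_gain = rng.normal(1_200, 400, n)        # yearly earnings premium ($)
years, rate = 30, 0.03
pv_factor = (1 - (1 + rate) ** -years) / rate  # present value of a 30-year annuity
benefit = annual_gain * pv_factor              # discounted lifetime benefit
bcr = benefit / cost                           # benefit-cost ratio per draw
print(f"median BCR = {np.median(bcr):.2f}, "
      f"90% interval = ({np.percentile(bcr, 5):.2f}, {np.percentile(bcr, 95):.2f})")
```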

  11. Numerical Roll Reversal Predictor Corrector Aerocapture and Precision Landing Guidance Algorithms for the Mars Surveyor Program 2001 Missions

    NASA Technical Reports Server (NTRS)

    Powell, Richard W.

    1998-01-01

    This paper describes the development and evaluation of a numerical roll reversal predictor-corrector guidance algorithm for the atmospheric flight portion of the Mars Surveyor Program 2001 Orbiter and Lander missions. The Lander mission utilizes direct entry and has a demanding requirement to deploy its parachute within 10 km of the target deployment point. The Orbiter mission utilizes aerocapture to achieve a precise captured orbit with a single atmospheric pass. Detailed descriptions of these predictor-corrector algorithms are given. Also, results of three and six degree-of-freedom Monte Carlo simulations which include navigation, aerodynamics, mass properties and atmospheric density uncertainties are presented.

  12. Use of PRA in Shuttle Decision Making Process

    NASA Technical Reports Server (NTRS)

    Boyer, Roger L.; Hamlin, Teri L.

    2010-01-01

    How do you use PRA to support an operating program? This presentation will explore how Shuttle Program management has used the Shuttle PRA in its decision-making process. It will reveal how the PRA has evolved from a tool used to evaluate Shuttle upgrades, such as the Electric Auxiliary Power Unit (EAPU), to a tool that supports Flight Readiness Reviews (FRR) and real-time flight decisions. Specific examples of Shuttle Program decisions that have used the Shuttle PRA as input will be provided, including how it was used in the Hubble Space Telescope (HST) manifest decision. It will discuss the importance of providing management with a clear presentation of the analysis, applicable assumptions and limitations, along with estimates of the uncertainty. This presentation will show how the use of PRA by the Shuttle Program has evolved over time and how it has been used in the decision-making process, providing specific examples.

  13. Value assignment and uncertainty evaluation for single-element reference solutions

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Bodnar, Olha; Butler, Therese A.; Molloy, John L.; Winchester, Michael R.

    2018-06-01

    A Bayesian statistical procedure is proposed for value assignment and uncertainty evaluation for the mass fraction of the elemental analytes in single-element solutions distributed as NIST standard reference materials. The principal novelty that we describe is the use of information about relative differences observed historically between the measured values obtained via gravimetry and via high-performance inductively coupled plasma optical emission spectrometry, to quantify the uncertainty component attributable to between-method differences. This information is encapsulated in a prior probability distribution for the between-method uncertainty component, and it is then used, together with the information provided by current measurement data, to produce a probability distribution for the value of the measurand from which an estimate and evaluation of uncertainty are extracted using established statistical procedures.
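
    The flavor of the procedure can be sketched in a deliberately simplified Monte Carlo form: a prior on the between-method component sigma_b (informed, in the paper, by historical gravimetry vs. ICP-OES differences) inflates each method's variance before the two values are pooled. All numbers below are invented, and the paper's full Bayesian model is more careful.

```python
# Simplified Monte Carlo rendering of the idea; not NIST's actual model or data.
import numpy as np

rng = np.random.default_rng(1)
grav, u_grav = 10.002, 0.004     # mg/g, gravimetric value and its uncertainty
icp,  u_icp  = 9.996, 0.006     # mg/g, ICP-OES value and its uncertainty
n = 100_000
# prior on the between-method component, informed by historical differences
sigma_b = rng.lognormal(mean=np.log(0.003), sigma=0.5, size=n)
v1 = u_grav**2 + sigma_b**2      # inflated variances per prior draw
v2 = u_icp**2 + sigma_b**2
w1, w2 = 1.0 / v1, 1.0 / v2
mu = (w1 * grav + w2 * icp) / (w1 + w2)        # pooled value per draw
draw = rng.normal(mu, np.sqrt(1.0 / (w1 + w2)))  # add pooled dispersion
print(f"assigned value = {draw.mean():.4f} mg/g, u = {draw.std(ddof=1):.4f} mg/g")
```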

  14. Standards of Evidence for Conducting and Reporting Economic Evaluations in Prevention Science.

    PubMed

    Crowley, D Max; Dodge, Kenneth A; Barnett, W Steven; Corso, Phaedra; Duffy, Sarah; Graham, Phillip; Greenberg, Mark; Haskins, Ron; Hill, Laura; Jones, Damon E; Karoly, Lynn A; Kuklinski, Margaret R; Plotnick, Robert

    2018-04-01

    Over a decade ago, the Society for Prevention Research endorsed the first standards of evidence for research in preventive interventions. The growing recognition of the need to use limited resources to make sound investments in prevention led the Board of Directors to charge a new task force to set standards for research in analysis of the economic impact of preventive interventions. This article reports the findings of this group's deliberations, proposes standards for economic analyses, and identifies opportunities for future prevention science. Through examples, policymakers' need and use of economic analysis are described. Standards are proposed for framing economic analysis, estimating costs of prevention programs, estimating benefits of prevention programs, implementing summary metrics, handling uncertainty in estimates, and reporting findings. Topics for research in economic analysis are identified. The SPR Board of Directors endorses the "Standards of Evidence for Conducting and Reporting Economic Evaluations in Prevention Science."

  15. Surprise and opportunity for learning in Grand Canyon: the Glen Canyon Dam Adaptive Management Program

    USGS Publications Warehouse

    Melis, Theodore S.; Walters, Carl; Korman, Josh

    2015-01-01

    With a focus on resources of the Colorado River ecosystem below Glen Canyon Dam, the Glen Canyon Dam Adaptive Management Program has included a variety of experimental policy tests, ranging from manipulation of water releases from the dam to removal of non-native fish within Grand Canyon National Park. None of these field-scale experiments has yet produced unambiguous results in terms of management prescriptions. But there has been adaptive learning, mostly from unanticipated or surprising resource responses relative to predictions from ecosystem modeling. Surprise learning opportunities may often be viewed with dismay by some stakeholders who might not be clear about the purpose of science and modeling in adaptive management. However, the experimental results from the Glen Canyon Dam program actually represent scientific successes in terms of revealing new opportunities for developing better river management policies. A new long-term experimental management planning process for Glen Canyon Dam operations, started in 2011 by the U.S. Department of the Interior, provides an opportunity to refocus management objectives, identify and evaluate key uncertainties about the influence of dam releases, and refine monitoring for learning over the next several decades. Adaptive learning since 1995 is critical input to this long-term planning effort. Embracing uncertainty and surprise outcomes revealed by monitoring and ecosystem modeling will likely continue the advancement of resource objectives below the dam, and may also promote efficient learning in other complex programs.

  16. Study of Uncertainties of Predicting Space Shuttle Thermal Environment. [impact of heating rate prediction errors on weight of thermal protection system

    NASA Technical Reports Server (NTRS)

    Fehrman, A. L.; Masek, R. V.

    1972-01-01

    Quantitative estimates of the uncertainty in predicting aerodynamic heating rates for a fully reusable space shuttle system are developed and the impact of these uncertainties on Thermal Protection System (TPS) weight are discussed. The study approach consisted of statistical evaluations of the scatter of heating data on shuttle configurations about state-of-the-art heating prediction methods to define the uncertainty in these heating predictions. The uncertainties were then applied as heating rate increments to the nominal predicted heating rate to define the uncertainty in TPS weight. Separate evaluations were made for the booster and orbiter, for trajectories which included boost through reentry and touchdown. For purposes of analysis, the vehicle configuration is divided into areas in which a given prediction method is expected to apply, and separate uncertainty factors and corresponding uncertainty in TPS weight derived for each area.

  17. USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, James R.

    The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of the University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers, respectively). Herein, this final report contains the complete workshop program, including the presentation agenda, the presentation abstracts, and the list of posters.

  18. Uncertainty Quantification Techniques of SCALE/TSUNAMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don

    2011-01-01

    The Standardized Computer Analysis for Licensing Evaluation (SCALE) code system developed at Oak Ridge National Laboratory (ORNL) includes the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI). The TSUNAMI code suite can quantify the predicted change in system responses, such as k{sub eff}, reactivity differences, or ratios of fluxes or reaction rates, due to changes in the energy-dependent, nuclide-reaction-specific cross-section data. Where uncertainties in the neutron cross-section data are available, the sensitivity of the system to the cross-section data can be applied to propagate the uncertainties in the cross-section data to an uncertainty in the system response. Uncertainty quantification is useful for identifying potential sources of computational biases and highlighting parameters important to code validation. Traditional validation techniques often examine one or more average physical parameters to characterize a system and identify applicable benchmark experiments. With TSUNAMI, however, correlation coefficients are developed by propagating the uncertainties in neutron cross-section data to uncertainties in the computed responses for experiments and safety applications through sensitivity coefficients. The bias in the experiments, as a function of their correlation coefficient with the intended application, is extrapolated to predict the bias and bias uncertainty in the application through trending analysis or generalized linear least squares techniques, often referred to as 'data adjustment.' Even with advanced tools to identify benchmark experiments, analysts occasionally find that the application models include some feature or material for which adequately similar benchmark experiments do not exist to support validation. For example, a criticality safety analyst may want to take credit for the presence of fission products in spent nuclear fuel. In such cases, analysts sometimes rely on 'expert judgment' to select an additional administrative margin to account for the gap in the validation data or to conclude that the impact on the calculated bias and bias uncertainty is negligible. As a result of advances in computer programs and the evolution of cross-section covariance data, analysts can use the sensitivity and uncertainty analysis tools in the TSUNAMI codes to estimate the potential impact on the application-specific bias and bias uncertainty resulting from nuclides not represented in available benchmark experiments. This paper presents the application of methods described in a companion paper.
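
    The propagation step this kind of tool automates can be illustrated in a few lines with the familiar "sandwich rule": the relative variance of a response is S^T C S, with S the sensitivity coefficients and C the relative cross-section covariance matrix. The three-parameter data below are invented for illustration.

```python
# Toy sandwich-rule propagation; sensitivities and covariances are made up.
import numpy as np

S = np.array([0.35, -0.12, 0.08])        # sensitivities to 3 nuclide-reaction pairs
C = np.array([[4.0e-4, 1.0e-4, 0.0],
              [1.0e-4, 9.0e-4, 0.0],
              [0.0,    0.0,    2.5e-4]])  # relative covariance of cross sections
rel_var = S @ C @ S                       # relative variance of the response
print(f"nuclear-data uncertainty in the response: {100 * np.sqrt(rel_var):.3f} %")
```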

  19. A model-averaging method for assessing groundwater conceptual model uncertainty.

    PubMed

    Ye, Ming; Pohlmann, Karl F; Chapman, Jenny B; Pohll, Greg M; Reeves, Donald M

    2010-01-01

    This study evaluates alternative groundwater models with different recharge and geologic components at the northern Yucca Flat area of the Death Valley Regional Flow System (DVRFS), USA. Recharge over the DVRFS has been estimated using five methods, and five geological interpretations are available at the northern Yucca Flat area. Combining the recharge and geological components together with additional modeling components that represent other hydrogeological conditions yields a total of 25 groundwater flow models. As all the models are plausible given available data and information, evaluating model uncertainty becomes inevitable. On the other hand, hydraulic parameters (e.g., hydraulic conductivity) are uncertain in each model, giving rise to parametric uncertainty. Propagation of the uncertainty in the models and model parameters through groundwater modeling causes predictive uncertainty in model predictions (e.g., hydraulic head and flow). Parametric uncertainty within each model is assessed using Monte Carlo simulation, and model uncertainty is evaluated using the model averaging method. Two model-averaging techniques (on the basis of information criteria and GLUE) are discussed. This study shows that contribution of model uncertainty to predictive uncertainty is significantly larger than that of parametric uncertainty. For the recharge and geological components, uncertainty in the geological interpretations has more significant effect on model predictions than uncertainty in the recharge estimates. In addition, weighted residuals vary more for the different geological models than for different recharge models. Most of the calibrated observations are not important for discriminating between the alternative models, because their weighted residuals vary only slightly from one model to another.
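
    The decomposition that supports this kind of conclusion can be sketched directly: under model averaging, total predictive variance splits into a weighted within-model (parametric) term and a between-model spread. The weights and predictions below are assumed values, not those of the DVRFS study.

```python
# Model-averaging variance decomposition with invented weights and predictions.
import numpy as np

w  = np.array([0.4, 0.35, 0.25])     # model weights (information criteria or GLUE)
mu = np.array([102.0, 98.5, 105.0])  # each model's predicted head (m)
s2 = np.array([4.0, 6.0, 3.0])       # within-model (Monte Carlo) variances

mean = np.sum(w * mu)
within = np.sum(w * s2)                   # parametric uncertainty
between = np.sum(w * (mu - mean) ** 2)    # model (conceptual) uncertainty
# here between > within, the pattern the study reports for its site
print(f"mean={mean:.2f}  within={within:.2f}  between={between:.2f}  "
      f"total={within + between:.2f}")
```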

  20. SU-E-T-625: Robustness Evaluation and Robust Optimization of IMPT Plans Based on Per-Voxel Standard Deviation of Dose Distributions.

    PubMed

    Liu, W; Mohan, R

    2012-06-01

    Proton dose distributions, IMPT in particular, are highly sensitive to setup and range uncertainties. We report a novel method, based on the per-voxel standard deviation (SD) of dose distributions, to evaluate the robustness of proton plans and to robustly optimize IMPT plans to render them less sensitive to uncertainties. For each optimization iteration, nine dose distributions are computed: the nominal one, and one each for ± setup uncertainties along the x, y and z axes and for ± range uncertainty. The SD of dose in each voxel is used to create an SD-volume histogram (SVH) for each structure. The SVH may be considered a quantitative representation of the robustness of the dose distribution. For optimization, the desired robustness may be specified in terms of an SD-volume (SV) constraint on the CTV and incorporated as a term in the objective function. Results of optimization with and without this constraint were compared in terms of plan optimality and robustness using the so-called 'worst case' dose distributions, which are obtained by assigning the lowest among the nine doses to each voxel in the clinical target volume (CTV) and the highest to normal tissue voxels outside the CTV. The SVH curve and the area under it for each structure were used as quantitative measures of robustness. The penalty parameter of the SV constraint may be varied to control the tradeoff between robustness and plan optimality. We applied these methods to one case each of H&N and lung. In both cases, we found that imposing the SV constraint improved plan robustness, but at the cost of normal tissue sparing. SVH-based optimization and evaluation is an effective tool for robustness evaluation and robust optimization of IMPT plans. Studies need to be conducted to test the methods for larger cohorts of patients and for other sites. This research is supported by National Cancer Institute (NCI) grant P01CA021239, the University Cancer Foundation via the Institutional Research Grant program at the University of Texas MD Anderson Cancer Center, and MD Anderson's cancer center support grant CA016672. © 2012 American Association of Physicists in Medicine.
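
    A minimal sketch of the SVH construction described above, on toy arrays: nine perturbed dose distributions yield a per-voxel SD, which is sorted within a structure mask to form the SVH, and the area under the curve gives a scalar robustness measure. Shapes and dose values are hypothetical.

```python
# Toy SVH computation; dose grids and the CTV mask are invented.
import numpy as np

doses = np.random.default_rng(2).normal(60, 2, size=(9, 50, 50, 30))  # Gy
ctv = np.zeros((50, 50, 30), dtype=bool)
ctv[20:30, 20:30, 10:20] = True                 # structure mask

sd = doses.std(axis=0, ddof=1)                  # per-voxel SD over 9 scenarios
sd_ctv = np.sort(sd[ctv])
volume_frac = 1.0 - np.arange(sd_ctv.size) / sd_ctv.size  # SVH: V(SD >= s)
area = np.trapz(volume_frac, sd_ctv)            # scalar robustness measure
```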

  1. Sum and mean. Standard programs for activation analysis.

    PubMed

    Lindstrom, R M

    1994-01-01

    Two computer programs in use for over a decade in the Nuclear Methods Group at NIST illustrate the utility of standard software: programs widely available and widely used, in which (ideally) well-tested public algorithms produce results that are well understood, and thereby capable of comparison, within the community of users. Sum interactively computes the position, net area, and uncertainty of the area of spectral peaks, and can give better results than automatic peak search programs when peaks are very small, very large, or unusually shaped. Mean combines unequal measurements of a single quantity, tests for consistency, and obtains the weighted mean and six measures of its uncertainty.
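
    The statistical core of a "mean"-style program is easy to state: an inverse-variance weighted mean, an internal uncertainty, a chi-square consistency test, and a scatter-based external uncertainty. The sketch below is a generic rendering showing two of the uncertainty measures, not the NIST code itself.

```python
# Weighted mean of unequal measurements with a consistency test; generic sketch.
import numpy as np
from scipy import stats

x = np.array([10.1, 10.4, 9.9])       # measurements of one quantity
u = np.array([0.2, 0.3, 0.15])        # their standard uncertainties
w = 1.0 / u**2
mean = np.sum(w * x) / np.sum(w)                 # weighted mean
u_int = np.sqrt(1.0 / np.sum(w))                 # internal uncertainty
chi2 = np.sum(w * (x - mean) ** 2)               # consistency statistic
p = stats.chi2.sf(chi2, df=x.size - 1)           # small p -> inconsistent set
u_ext = u_int * np.sqrt(chi2 / (x.size - 1))     # external (scatter-based)
print(f"mean={mean:.3f}  u_int={u_int:.3f}  u_ext={u_ext:.3f}  p={p:.2f}")
```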

  2. Folic Acid Food Fortification—Its History, Effect, Concerns, and Future Directions

    PubMed Central

    Crider, Krista S.; Bailey, Lynn B.; Berry, Robert J.

    2011-01-01

    Periconceptional intake of folic acid is known to reduce a woman’s risk of having an infant affected by a neural tube birth defect (NTD). National programs to mandate fortification of food with folic acid have reduced the prevalence of NTDs worldwide. Uncertainty surrounding possible unintended consequences has led to concerns about higher folic acid intake and food fortification programs. This uncertainty emphasizes the need to continually monitor fortification programs for accurate measures of their effect and the ability to address concerns as they arise. This review highlights the history, effect, concerns, and future directions of folic acid food fortification programs. PMID:22254102

  3. Toward quantifying the effectiveness of water trading under uncertainty.

    PubMed

    Luo, B; Huang, G H; Zou, Y; Yin, Y Y

    2007-04-01

    This paper presents a methodology for quantifying the effectiveness of water-trading under uncertainty, by developing an optimization model based on the interval-parameter two-stage stochastic program (TSP) technique. In the study, the effectiveness of a water-trading program is measured by the water volume that can be released through trading from a statistical point of view. The methodology can also deal with recourse water allocation problems generated by randomness in water availability and, at the same time, tackle uncertainties expressed as intervals in the trading system. The developed methodology was tested with a hypothetical water-trading program in an agricultural system in the Swift Current Creek watershed, Canada. Study results indicate that the methodology can effectively measure the effectiveness of a trading program through estimating the water volume being released through trading in a long-term view. A sensitivity analysis was also conducted to analyze the effects of different trading costs on the trading program. It shows that the trading efforts would become ineffective when the trading costs are too high. The case study also demonstrates that the trading program is more effective in a dry season when total water availability is in shortage.
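
    A stripped-down two-stage stochastic program in the spirit of the TSP model (without the interval parameters): first-stage allocation targets are fixed before availability is known, and scenario-wise recourse represents water obtained through trading when availability falls short. Structure and numbers are purely illustrative.

```python
# Deterministic equivalent of a toy two-stage stochastic program; data invented.
import numpy as np
from scipy.optimize import linprog

benefit = np.array([60.0, 45.0])          # $ per unit allocated to 2 users
penalty = 90.0                             # $ per unit of recourse (trading)
avail = np.array([180.0, 140.0, 100.0])    # water availability scenarios
prob = np.array([0.3, 0.5, 0.2])           # scenario probabilities

# variables: x1, x2 (targets), y1..y3 (recourse); minimize -benefit.x + E[penalty*y]
c = np.concatenate([-benefit, penalty * prob])
A_ub = np.zeros((3, 5))
A_ub[:, :2] = 1.0                          # total allocation in every scenario
A_ub[np.arange(3), 2 + np.arange(3)] = -1.0  # x1 + x2 - y_s <= avail_s
res = linprog(c, A_ub=A_ub, b_ub=avail,
              bounds=[(0, 120), (0, 120)] + [(0, None)] * 3)
x_opt, y_opt = res.x[:2], res.x[2:]        # targets and per-scenario trading
```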

  4. BOOK REVIEW: Evaluating the Measurement Uncertainty: Fundamentals and practical guidance

    NASA Astrophysics Data System (ADS)

    Lira, Ignacio

    2003-08-01

    Evaluating the Measurement Uncertainty is a book written for anyone who makes and reports measurements. It attempts to fill the gaps in the ISO Guide to the Expression of Uncertainty in Measurement, or the GUM, and does a pretty thorough job. The GUM was written with the intent of being applicable by all metrologists, from the shop floor to the National Metrology Institute laboratory; however, the GUM has often been criticized for its lack of user-friendliness because it is primarily filled with statements, but with little explanation. Evaluating the Measurement Uncertainty gives lots of explanations. It is well written and makes use of many good figures and numerical examples. Also important, this book is written by a metrologist from a National Metrology Institute, and therefore up-to-date ISO rules, style conventions and definitions are correctly used and supported throughout. The author sticks very closely to the GUM in topical theme and with frequent reference, so readers who have not read GUM cover-to-cover may feel as if they are missing something. The first chapter consists of a reprinted lecture by T J Quinn, Director of the Bureau International des Poids et Mesures (BIPM), on the role of metrology in today's world. It is an interesting and informative essay that clearly outlines the importance of metrology in our modern society, and why accurate measurement capability, and by definition uncertainty evaluation, should be so important. Particularly interesting is the section on the need for accuracy rather than simply reproducibility. Evaluating the Measurement Uncertainty then begins at the beginning, with basic concepts and definitions. The third chapter carefully introduces the concept of standard uncertainty and includes many derivations and discussion of probability density functions. The author also touches on Monte Carlo methods, calibration correction quantities, acceptance intervals or guardbanding, and many other interesting cases. The book goes on to treat evaluation of expanded uncertainty, joint treatment of several measurands, least-squares adjustment, curve fitting and more. Chapter 6 is devoted to Bayesian inference. Perhaps one can say that Evaluating the Measurement Uncertainty caters to a wider reader-base than the GUM; however, a mathematical or statistical background is still advantageous. Also, this is not a book with a library of worked overall uncertainty evaluations for various measurements; the feel of the book is rather theoretical. The novice will still have some work to do—but this is a good place to start. I think this book is a fitting companion to the GUM because the text complements the GUM, from fundamental principles to more sophisticated measurement situations, and moreover includes intelligent discussion regarding intent and interpretation. Evaluating the Measurement Uncertainty is detailed, and I think most metrologists will really enjoy the detail and care put into this book. Jennifer Decker

  5. Sources of uncertainty in estimating stream solute export from headwater catchments at three sites

    Treesearch

    Ruth D. Yanai; Naoko Tokuchi; John L. Campbell; Mark B. Green; Eiji Matsuzaki; Stephanie N. Laseter; Cindi L. Brown; Amey S. Bailey; Pilar Lyons; Carrie R. Levine; Donald C. Buso; Gene E. Likens; Jennifer D. Knoepp; Keitaro Fukushima

    2015-01-01

    Uncertainty in the estimation of hydrologic export of solutes has never been fully evaluated at the scale of a small-watershed ecosystem. We used data from the Gomadansan Experimental Forest, Japan, Hubbard Brook Experimental Forest, USA, and Coweeta Hydrologic Laboratory, USA, to evaluate many sources of uncertainty, including the precision and accuracy of...

  6. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    PubMed

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements and discusses multi-laboratory data reduction while accounting for inevitable correlations between the laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.

  7. Effect of Streamflow Forecast Uncertainty on Real-Time Reservoir Operation

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Cai, X.; Yang, D.

    2010-12-01

    Various hydrological forecast products have been applied to real-time reservoir operation, including deterministic streamflow forecast (DSF), DSF-based probabilistic streamflow forecast (DPSF), and ensemble streamflow forecast (ESF), which represent forecast uncertainty in the form of deterministic forecast error, deterministic forecast error-based uncertainty distribution, and ensemble forecast errors, respectively. Compared to previous studies that treat these forecast products as ad hoc inputs for reservoir operation models, this paper attempts to model the uncertainties involved in the various forecast products and explores their effect on real-time reservoir operation decisions. In hydrology, there are various indices reflecting the magnitude of streamflow forecast uncertainty; meanwhile, few models illustrate the forecast uncertainty evolution process. This research introduces the Martingale Model of Forecast Evolution (MMFE) from supply chain management and justifies its assumptions for quantifying the evolution of uncertainty in streamflow forecasts as time progresses. Based on MMFE, this research simulates the evolution of forecast uncertainty in DSF, DPSF, and ESF, and applies reservoir operation models (dynamic programming, DP; stochastic dynamic programming, SDP; and standard operation policy, SOP) to assess the effect of different forms of forecast uncertainty on real-time reservoir operation. Through a hypothetical single-objective real-time reservoir operation model, the results illustrate that forecast uncertainty exerts significant effects. Reservoir operation efficiency, as measured by a utility function, decreases as the forecast uncertainty increases. Meanwhile, these effects also depend on the type of forecast product being used. In general, the utility of reservoir operation with ESF is nearly as high as the utility obtained with a perfect forecast; the utilities of DSF and DPSF are similar to each other but not as efficient as ESF. Moreover, streamflow variability and reservoir capacity can change the magnitude of the effects of forecast uncertainty, but not the relative merit of DSF, DPSF, and ESF. [Figure: schematic diagram of the increase in forecast uncertainty with forecast lead time and the dynamic updating property of real-time streamflow forecasts.]
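
    A minimal simulation of the MMFE assumption set: the forecast of a single future inflow is refined by independent zero-mean updates as lead time shrinks, so forecast-error variance decreases monotonically toward the realization. The update variances here are assumed for illustration.

```python
# Toy Martingale Model of Forecast Evolution; update magnitudes are invented.
import numpy as np

rng = np.random.default_rng(3)
n_paths, lead = 5000, 6
update_sd = np.array([8.0, 6.0, 5.0, 4.0, 3.0, 2.0])  # update SD per stage
f0 = 100.0                                  # initial (climatological) forecast

updates = rng.normal(0.0, update_sd, size=(n_paths, lead))
forecasts = f0 + np.cumsum(updates, axis=1)  # f_t = f_{t-1} + e_t (martingale)
realization = forecasts[:, -1]               # final forecast equals the inflow
for t in range(lead):
    err = realization - forecasts[:, t]
    print(f"lead {lead - t}: forecast-error SD = {err.std(ddof=1):.2f}")
```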

  8. Evaluating uncertainty in environmental life-cycle assessment. A case study comparing two insulation options for a Dutch one-family dwelling.

    PubMed

    Huijbregts, Mark A J; Gilijamse, Wim; Ragas, Ad M J; Reijnders, Lucas

    2003-06-01

    The evaluation of uncertainty is relatively new in environmental life-cycle assessment (LCA). It provides useful information to assess the reliability of LCA-based decisions and to guide future research toward reducing uncertainty. Most uncertainty studies in LCA quantify only one type of uncertainty, i.e., uncertainty due to input data (parameter uncertainty). However, LCA outcomes can also be uncertain due to normative choices (scenario uncertainty) and the mathematical models involved (model uncertainty). The present paper outlines a new methodology that quantifies parameter, scenario, and model uncertainty simultaneously in environmental life-cycle assessment. The procedure is illustrated in a case study that compares two insulation options for a Dutch one-family dwelling. Parameter uncertainty was quantified by means of Monte Carlo simulation. Scenario and model uncertainty were quantified by resampling different decision scenarios and model formulations, respectively. Although scenario and model uncertainty were not quantified comprehensively, the results indicate that both types of uncertainty influence the case study outcomes. This stresses the importance of quantifying parameter, scenario, and model uncertainty simultaneously. The two insulation options studied were found to have significantly different impact scores for global warming, stratospheric ozone depletion, and eutrophication. The thickest insulation option has the lowest impact on global warming and eutrophication, and the highest impact on stratospheric ozone depletion.

  9. The US-DOE ARM/ASR Effort in Quantifying Uncertainty in Ground-Based Cloud Property Retrievals (Invited)

    NASA Astrophysics Data System (ADS)

    Xie, S.; Protat, A.; Zhao, C.

    2013-12-01

    One primary goal of the US Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) program is to obtain and retrieve cloud microphysical properties from detailed cloud observations using ground-based active and passive remote sensors. However, there is large uncertainty in the retrieved cloud property products. Studies have shown that the uncertainty could arise from instrument limitations, measurement errors, sampling errors, retrieval algorithm deficiencies in assumptions, as well as inconsistent input data and constraints used by different algorithms. To quantify the uncertainty in cloud retrievals, a scientific focus group, Quantification of Uncertainties In Cloud Retrievals (QUICR), was recently created by the DOE Atmospheric System Research (ASR) program. This talk will provide an overview of the recent research activities conducted within QUICR and discuss its current collaborations with the European cloud retrieval community and future plans. The goal of QUICR is to develop a methodology for characterizing and quantifying uncertainties in current and future ARM cloud retrievals. The work at LLNL was performed under the auspices of the U.S. Department of Energy (DOE), Office of Science, Office of Biological and Environmental Research by Lawrence Livermore National Laboratory under contract No. DE-AC52-07NA27344. LLNL-ABS-641258.

  10. Uncertainty in age-specific harvest estimates and consequences for white-tailed deer management

    USGS Publications Warehouse

    Collier, B.A.; Krementz, D.G.

    2007-01-01

    Age structure proportions (proportion of harvested individuals within each age class) are commonly used as support for regulatory restrictions and as input for deer population models. Such use requires critical evaluation when harvest regulations force hunters to selectively harvest specific age classes, due to the impact on the underlying population age structure. We used a stochastic population simulation model to evaluate the impact of using harvest proportions to evaluate changes in population age structure under a selective harvest management program at two scales. Using harvest proportions to parameterize the age-specific harvest segment of the model for the local scale showed that predictions of post-harvest age structure did not vary depending on whether selective harvest criteria were in use or not. At the county scale, yearling frequency in the post-harvest population increased, but model predictions indicated that the post-harvest population size of 2.5-year-old males would decline below levels found before implementation of the antler restriction, reducing the number of individuals recruited into older age classes. Across the range of age-specific harvest rates modeled, our simulation predicted that underestimation of age-specific harvest rates has considerable influence on predictions of post-harvest population age structure. We found that the consequence of uncertainty in harvest rates corresponds to uncertainty in predictions of residual population structure, and this correspondence is proportional to scale. Our simulations also indicate that, regardless of the use of harvest proportions or harvest rates, at either the local or county scale the modeled selective harvest criteria (SHC) had a high probability (>0.60 and >0.75, respectively) of eliminating recruitment into >2.5-year-old age classes. Although frequently used to increase population age structure, our modeling indicated that selective harvest criteria can decrease or eliminate the number of white-tailed deer recruited into older age classes. Thus, we suggest that using harvest proportions for management planning and evaluation should be viewed with caution. In addition, we recommend that managers focus more attention on estimation of age-specific harvest rates, and on modeling approaches that combine harvest rates with information from harvested individuals, to further increase their ability to effectively manage deer populations under selective harvest programs. © 2006 Elsevier B.V. All rights reserved.

  11. Uncertainty Evaluation of Measurements with Pyranometers and Pyrheliometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konings, Jorgen; Habte, Aron

    2016-01-03

    Evaluating the performance of photovoltaic (PV) cells, modules, arrays, and systems relies on accurate measurement of the available solar radiation resources. Solar radiation resources are measured using radiometers such as pyranometers (global horizontal irradiance) and pyrheliometers (direct normal irradiance). The accuracy of solar radiation data measured by radiometers depends not only on the specification of the instrument but also on a) the calibration procedure, b) the measurement conditions and maintenance, and c) the environmental conditions. Therefore, statements about the overall measurement uncertainty can only be made on an individual basis, taking all relevant factors into account. This paper provides guidelines and recommended procedures for estimating the uncertainty in measurements by radiometers using the Guide to the Expression of Uncertainty in Measurement (GUM) method. Special attention is paid to the concept of data availability and its link to uncertainty evaluation.
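
    A GUM-style uncertainty budget, at its simplest, combines input uncertainties through sensitivity coefficients. The sketch below does this for a hypothetical pyranometer measurement modeled as irradiance E = V/R (thermopile voltage over calibration responsivity); the values and the two-term budget are illustrative, and a full budget would include more components (zero offsets, directional response, temperature, data availability).

```python
# Minimal GUM combined uncertainty for E = V / R; all values are hypothetical.
import numpy as np

V, u_V = 8.50e-3, 2.0e-5        # thermopile voltage (V) and its uncertainty
R, u_R = 8.6e-6, 1.1e-7         # responsivity, V/(W m^-2), from calibration
E = V / R                       # measured irradiance (W/m^2)
c_V, c_R = 1.0 / R, -V / R**2   # sensitivity coefficients dE/dV, dE/dR
u_c = np.sqrt((c_V * u_V) ** 2 + (c_R * u_R) ** 2)  # combined standard uncertainty
U = 2.0 * u_c                   # expanded uncertainty, coverage factor k = 2
print(f"E = {E:.1f} W/m^2, U(k=2) = {U:.1f} W/m^2")
```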

  12. Challenges in modeling the X-29 flight test performance

    NASA Technical Reports Server (NTRS)

    Hicks, John W.; Kania, Jan; Pearce, Robert; Mills, Glen

    1987-01-01

    Presented are methods, instrumentation, and difficulties associated with drag measurement of the X-29A aircraft. The initial performance objective of the X-29A program emphasized drag polar shapes rather than absolute drag levels. Priorities during the flight envelope expansion restricted the evaluation of aircraft performance. Changes in aircraft configuration, uncertainties in angle-of-attack calibration, and limitations in instrumentation complicated the analysis. Limited engine instrumentation with uncertainties in overall in-flight thrust accuracy made it difficult to obtain reliable values of coefficient of parasite drag. The aircraft was incapable of tracking the automatic camber control trim schedule for optimum wing flaperon deflection during typical dynamic performance maneuvers; this has also complicated the drag polar shape modeling. The X-29A was far enough off the schedule that the developed trim drag correction procedure has proven inadequate. However, good drag polar shapes have been developed throughout the flight envelope. Preliminary flight results have compared well with wind tunnel predictions. A more comprehensive analysis must be done to complete performance models. The detailed flight performance program with a calibrated engine will benefit from the experience gained during this preliminary performance phase.

  13. Challenges in modeling the X-29A flight test performance

    NASA Technical Reports Server (NTRS)

    Hicks, John W.; Kania, Jan; Pearce, Robert; Mills, Glen

    1987-01-01

    The paper presents the methods, instrumentation, and difficulties associated with drag measurement of the X-29A aircraft. The initial performance objective of the X-29A program emphasized drag polar shapes rather than absolute drag levels. Priorities during the flight envelope expansion restricted the evaluation of aircraft performance. Changes in aircraft configuration, uncertainties in angle-of-attack calibration, and limitations in instrumentation complicated the analysis. Limited engine instrumentation with uncertainties in overall in-flight thrust accuracy made it difficult to obtain reliable values of coefficient of parasite drag. The aircraft was incapable of tracking the automatic camber control trim schedule for optimum wing flaperon deflection during typical dynamic performance maneuvers; this has also complicated the drag polar shape modeling. The X-29A was far enough off the schedule that the developed trim drag correction procedure has proven inadequate. Despite these obstacles, good drag polar shapes have been developed throughout the flight envelope. Preliminary flight results have compared well with wind tunnel predictions. A more comprehensive analysis must be done to complete the performance models. The detailed flight performance program with a calibrated engine will benefit from the experience gained during this preliminary performance phase.

  14. Improving healthcare empowerment through breast cancer patient navigation: a mixed methods evaluation in a safety-net setting.

    PubMed

    Gabitova, Guzyal; Burke, Nancy J

    2014-09-19

    Breast cancer mortality rates in the U.S. remain relatively high, particularly among ethnic minorities and low-income populations. Unequal access to quality care, lower follow up rates, and poor treatment adherence contribute to rising disparities among these groups. Healthcare empowerment (HCE) is theorized to improve patient outcomes through collaboration with providers and improving understanding of and compliance with treatment. Patient navigation is a health care organizational intervention that essentially improves healthcare empowerment by providing informational, emotional, and psychosocial support. Patient navigators address barriers to care through multilingual coordination of treatment and incorporation of access to community services, support, and education into the continuum of cancer care. Utilizing survey and qualitative methods, we evaluated the patient navigation program in a Northern California safety-net hospital Breast Clinic by assessing its impact on patients' experiences with cancer care and providers' perspectives on the program. We conducted qualitative interviews with 16 patients and 4 service providers, conducted approximately 66 hours of clinic observations, and received feedback through the self-administered survey from 66 patients. The role of the patient navigator at the Breast Clinic included providing administrative assistance, psychosocial support, improved knowledge, better understanding of treatment process, and ensuring better communication between patients and providers. As such, patient navigators facilitated improved collaboration between patients and providers and understanding of interdisciplinary care processes. The survey results suggested that the majority of patients across all ethnic backgrounds and age groups were highly satisfied with the program and had a positive perception of their navigator. Interviews with patients and providers highlighted the roles of a navigator in ensuring continuity of care, improving treatment completion rates, and reducing providers' workload and waiting time. Uncertainty about the navigator's role among the patients was a weakness of the program. Patient navigation in the Breast Clinic had a positive impact on patients' experiences with care and healthcare empowerment. Clarifying uncertainties about the navigators' role would aid successful outcomes.

  15. Accuracy requirements and uncertainties in radiotherapy: a report of the International Atomic Energy Agency.

    PubMed

    van der Merwe, Debbie; Van Dyk, Jacob; Healy, Brendan; Zubizarreta, Eduardo; Izewska, Joanna; Mijnheer, Ben; Meghzifene, Ahmed

    2017-01-01

    Radiotherapy technology continues to advance and the expectation of improved outcomes requires greater accuracy in various radiotherapy steps. Different factors affect the overall accuracy of dose delivery. Institutional comprehensive quality assurance (QA) programs should ensure that uncertainties are maintained at acceptable levels. The International Atomic Energy Agency has recently developed a report summarizing the accuracy achievable and the suggested action levels, for each step in the radiotherapy process. Overview of the report: The report seeks to promote awareness and encourage quantification of uncertainties in order to promote safer and more effective patient treatments. The radiotherapy process and the radiobiological and clinical frameworks that define the need for accuracy are depicted. Factors that influence uncertainty are described for a range of techniques, technologies and systems. Methodologies for determining and combining uncertainties are presented, and strategies for reducing uncertainties through QA programs are suggested. The role of quality audits in providing international benchmarking of achievable accuracy and realistic action levels is also discussed. The report concludes with nine general recommendations: (1) Radiotherapy should be applied as accurately as reasonably achievable, technical and biological factors being taken into account. (2) For consistency in prescribing, reporting and recording, recommendations of the International Commission on Radiation Units and Measurements should be implemented. (3) Each institution should determine uncertainties for their treatment procedures. Sample data are tabulated for typical clinical scenarios with estimates of the levels of accuracy that are practically achievable and suggested action levels. (4) Independent dosimetry audits should be performed regularly. (5) Comprehensive quality assurance programs should be in place. (6) Professional staff should be appropriately educated and adequate staffing levels should be maintained. (7) For reporting purposes, uncertainties should be presented. (8) Manufacturers should provide training on all equipment. (9) Research should aid in improving the accuracy of radiotherapy. Some example research projects are suggested.

  16. A Novel Biobjective Risk-Based Model for Stochastic Air Traffic Network Flow Optimization Problem.

    PubMed

    Cai, Kaiquan; Jia, Yaoguang; Zhu, Yanbo; Xiao, Mingming

    2015-01-01

    Network-wide air traffic flow management (ATFM) is an effective way to alleviate demand-capacity imbalances globally and thereby reduce airspace congestion and flight delays. Conventional ATFM models assume that the capacities of airports or airspace sectors are all predetermined. However, capacity uncertainties due to the dynamics of convective weather may make deterministic ATFM measures impractical. This paper investigates the stochastic air traffic network flow optimization (SATNFO) problem, which is formulated as a weighted biobjective 0-1 integer programming model. In order to evaluate the effect of capacity uncertainties on ATFM, the operational risk is modeled via probabilistic risk assessment and introduced as an extra objective in the SATNFO problem. Computational experiments using real-world air traffic network data associated with simulated weather data show that the presented model has far fewer constraints than a stochastic model with nonanticipative constraints, which means our proposed model reduces the computational complexity.

  17. Optimizing Integrated Terminal Airspace Operations Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Bosson, Christabelle; Xue, Min; Zelinski, Shannon

    2014-01-01

    In the terminal airspace, integrated departures and arrivals have the potential to increase operations efficiency. Recent research has developed genetic-algorithm-based schedulers for integrated arrival and departure operations under uncertainty. This paper presents an alternate method using a machine job-shop scheduling formulation to model the integrated airspace operations. A multistage stochastic programming approach is chosen to formulate the problem, and candidate solutions are obtained by solving sample average approximation problems with finite sample size. Because approximate solutions are computed, the proposed algorithm incorporates the computation of statistical bounds to estimate the optimality of the candidate solutions. A proof-of-concept study is conducted on a baseline implementation of a simple problem considering a fleet mix of 14 aircraft evolving in a model of the Los Angeles terminal airspace. A more thorough statistical analysis is also performed to evaluate the impact of the number of scenarios considered in the sampled problem. To handle the extensive sampling computations, a multithreading technique is introduced.

  18. A Hierarchical Multi-Model Approach for Uncertainty Segregation, Prioritization and Comparative Evaluation of Competing Modeling Propositions

    NASA Astrophysics Data System (ADS)

    Tsai, F. T.; Elshall, A. S.; Hanor, J. S.

    2012-12-01

    Subsurface modeling is challenging because of many possible competing propositions for each uncertain model component. How can we judge that we are selecting the correct proposition for an uncertain model component out of numerous competing propositions? How can we bridge the gap between synthetic mental principles, such as mathematical expressions, on one hand, and empirical observations, such as observation data, on the other hand, when uncertainty exists on both sides? In this study, we introduce hierarchical Bayesian model averaging (HBMA) as a multi-model (multi-proposition) framework to represent our current state of knowledge and decision for hydrogeological structure modeling. The HBMA framework allows for segregating and prioritizing different sources of uncertainty, and for comparative evaluation of competing propositions for each source of uncertainty. We applied the HBMA to a study of the hydrostratigraphy and uncertainty propagation of the Southern Hills aquifer system in the Baton Rouge area, Louisiana. We used geophysical data for hydrogeological structure construction through an indicator hydrostratigraphy method and used lithologic data from drillers' logs for model structure calibration. However, due to uncertainty in model data, structure and parameters, multiple possible hydrostratigraphic models were produced and calibrated. The study considered four sources of uncertainty. To evaluate mathematical structure uncertainty, the study considered three different variogram models and two geological stationarity assumptions. With respect to geological structure uncertainty, the study considered two geological structures with respect to the Denham Springs-Scotlandville fault. With respect to data uncertainty, the study considered two calibration data sets. These four sources of uncertainty with their corresponding competing modeling propositions resulted in 24 calibrated models. The results showed that by segregating different sources of uncertainty, the HBMA analysis provided insights into uncertainty priorities and propagation. In addition, it assisted in evaluating the relative importance of competing modeling propositions for each uncertain model component. By being able to dissect the uncertain model components and provide a weighted representation of the competing propositions for each uncertain model component based on background knowledge, the HBMA functions as an epistemic framework for advancing knowledge about the system under study.

  19. Advanced Small Modular Reactor Economics Model Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, Thomas J.

    2014-10-01

    The US Department of Energy Office of Nuclear Energy’s Advanced Small Modular Reactor (SMR) research and development activities focus on four key areas: developing assessment methods for evaluating advanced SMR technologies and characteristics; developing and testing materials, fuels, and fabrication techniques; resolving key regulatory issues identified by the US Nuclear Regulatory Commission and industry; and developing advanced instrumentation and controls and human-machine interfaces. This report focuses on the development of assessment methods to evaluate advanced SMR technologies and characteristics. Specifically, this report describes the expansion and application of the economic modeling effort at Oak Ridge National Laboratory. Analysis of the current modeling methods shows that one of the primary concerns for the modeling effort is the handling of uncertainty in cost estimates. Monte Carlo–based methods are commonly used to handle uncertainty, especially when implemented by a stand-alone script within a program such as Python or MATLAB. However, a script-based model requires each potential user to have access to a compiler and an executable capable of handling the script. Making the model accessible to multiple independent analysts is best accomplished by implementing the model in a common computing tool such as Microsoft Excel. Excel is readily available and accessible to most system analysts, but it is not designed for straightforward implementation of a Monte Carlo–based method. Using a Monte Carlo algorithm requires in-spreadsheet scripting and statistical analyses or the use of add-ons such as Crystal Ball. An alternative method uses propagation-of-error calculations in the existing Excel-based system to estimate system cost uncertainty. This method has the advantage of using Microsoft Excel as is, but it requires the use of simplifying assumptions. These assumptions do not necessarily bring the analytical results into question. In fact, the analysis shows that the propagation-of-error method introduces essentially negligible error, especially when compared to the uncertainty associated with some of the estimates themselves. The results of these uncertainty analyses generally quantify and identify the sources of uncertainty in the overall cost estimation. The obvious generalization, that capital cost uncertainty is the main driver, can be shown to be accurate for the current state of reactor cost analysis. However, the detailed analysis on a component-by-component basis helps to demonstrate which components would benefit most from research and development to decrease the uncertainty, as well as which components would benefit from research and development to decrease the absolute cost.
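
    The tradeoff described above can be seen in miniature: propagation of error treats the uncertainty of a summed cost as the root-sum-square of independent component uncertainties, which a plain spreadsheet formula can compute, while Monte Carlo resamples the components. For a linear total-cost model the two agree closely, consistent with the report's observation that the simplification introduces negligible error. The component costs below are invented.

```python
# Propagation of error vs. Monte Carlo for a summed cost; numbers are invented.
import numpy as np

cost = np.array([3000.0, 800.0, 450.0])   # e.g., capital, O&M, fuel components
u    = np.array([900.0, 120.0, 60.0])     # their standard uncertainties

u_rss = np.sqrt(np.sum(u ** 2))           # propagation of error (root-sum-square)
rng = np.random.default_rng(4)
samples = rng.normal(cost, u, size=(100_000, 3)).sum(axis=1)  # Monte Carlo
print(f"total = {cost.sum():.0f}, u(RSS) = {u_rss:.0f}, "
      f"u(MC) = {samples.std(ddof=1):.0f}")  # the two agree for linear sums
```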

  20. A linear programming approach to characterizing norm bounded uncertainty from experimental data

    NASA Technical Reports Server (NTRS)

    Scheid, R. E.; Bayard, D. S.; Yam, Y.

    1991-01-01

    The linear programming spectral overbounding and factorization (LPSOF) algorithm, an algorithm for finding a minimum phase transfer function of specified order whose magnitude tightly overbounds a specified nonparametric function of frequency, is introduced. This method has direct application to transforming nonparametric uncertainty bounds (available from system identification experiments) into parametric representations required for modern robust control design software (i.e., a minimum-phase transfer function multiplied by a norm-bounded perturbation).
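
    The LP at the heart of this kind of spectral overbounding can be sketched as follows (a simplified reading of the approach, not the authors' exact algorithm): choose cosine coefficients so a trigonometric polynomial overbounds the squared nonparametric magnitude data on a frequency grid while minimizing mean power; spectral factorization of the result, omitted here, would then yield the minimum-phase transfer function.

```python
# Simplified overbounding LP; the magnitude data and order are synthetic.
import numpy as np
from scipy.optimize import linprog

w = np.linspace(0, np.pi, 200)
mag2 = (1.0 + 0.5 * np.cos(w)) ** 2       # nonparametric magnitude bound, squared
order = 4

# trig polynomial P(w) = a0 + 2*sum_k a_k cos(k w); columns are basis functions
B = np.column_stack([np.ones_like(w)] +
                    [2 * np.cos(k * w) for k in range(1, order + 1)])
c = np.zeros(order + 1)
c[0] = 1.0                                 # minimize a0 (mean-square gain)
res = linprog(c, A_ub=-B, b_ub=-mag2,      # enforce P(w_i) >= |W(w_i)|^2
              bounds=[(None, None)] * (order + 1))
a = res.x                                  # tight overbounding coefficients
assert np.all(B @ a >= mag2 - 1e-8)
```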

  1. Validation of Radiometric Standards for the Laboratory Calibration of Reflected-Solar Earth Observing Satellite Instruments

    NASA Technical Reports Server (NTRS)

    Butler, James J.; Johnson, B. Carol; Rice, Joseph P.; Brown, Steven W.; Barnes, Robert A.

    2007-01-01

    Historically, the traceability of the laboratory calibration of Earth-observing satellite instruments to a primary radiometric reference scale (SI units) is the responsibility of each instrument builder. For the NASA Earth Observing System (EOS), a program has been developed using laboratory transfer radiometers, each with its own traceability to the primary radiance scale of a national metrology laboratory, to independently validate the radiances assigned to the laboratory sources of the instrument builders. The EOS Project Science Office also developed a validation program for the measurement of onboard diffuse reflecting plaques, which are also used as radiometric standards for Earth-observing satellite instruments. Summarized results of these validation campaigns, with an emphasis on the current state-of-the-art uncertainties in laboratory radiometric standards, will be presented. Future mission uncertainty requirements, and possible enhancements to the EOS validation program to ensure that those uncertainties can be met, will be presented.

  2. Prediction of seismic collapse risk of steel moment frame mid-rise structures by meta-heuristic algorithms

    NASA Astrophysics Data System (ADS)

    Jough, Fooad Karimi Ghaleh; Şensoy, Serhan

    2016-12-01

    Different performance levels may be obtained for sidesway collapse evaluation of steel moment frames depending on the evaluation procedure used to handle uncertainties. In this article, the process of representing modelling uncertainties, record-to-record (RTR) variations and cognitive uncertainties for moment-resisting steel frames of various heights is discussed in detail. RTR uncertainty is treated by incremental dynamic analysis (IDA), modelling uncertainties are considered through backbone curves and hysteresis loops of components, and cognitive uncertainty is represented at three levels of material quality. IDA is used to evaluate RTR uncertainty based on strong ground motion records selected by the k-means algorithm, which is favoured over Monte Carlo selection due to its time-saving appeal. Analytical equations of the response surface method are obtained from IDA results by the Cuckoo algorithm, which predicts the mean and standard deviation of the collapse fragility curve. The Takagi-Sugeno-Kang model is used to represent material quality based on the response surface coefficients. Finally, collapse fragility curves with the various sources of uncertainty mentioned are derived through a large number of material quality values and meta variables inferred by the Takagi-Sugeno-Kang fuzzy model based on response surface method coefficients. It is concluded that a better risk management strategy in countries where material quality control is weak is to account for cognitive uncertainties in fragility curves and the mean annual frequency.

  3. Estimating the uncertainty in thermochemical calculations for oxygen-hydrogen combustors

    NASA Astrophysics Data System (ADS)

    Sims, Joseph David

    The thermochemistry program CEA2 was combined with the statistical thermodynamics program PAC99 in a Monte Carlo simulation to determine the uncertainty in several CEA2 output variables due to uncertainty in thermodynamic reference values for the reactant and combustion species. In all, six typical performance parameters were examined, along with the required intermediate calculations (five gas properties and eight stoichiometric coefficients), for three hydrogen-oxygen combustors: a main combustor, an oxidizer preburner and a fuel preburner. The three combustors were analyzed in two different modes: design mode, where, for the first time, the uncertainty in thermodynamic reference values (taken from the literature) was considered, while the inputs to CEA2 were specified and so had no uncertainty; and data reduction mode, where the inputs to CEA2 did have uncertainty. The inputs to CEA2 were contrived experimental measurements intended to represent a typical combustor testing facility. In design mode, uncertainties in the performance parameters were on the order of 0.1% for the main combustor, on the order of 0.05% for the oxidizer preburner and on the order of 0.01% for the fuel preburner. Thermodynamic reference values for H2O were the dominant sources of uncertainty, as was the assigned enthalpy for liquid oxygen. In data reduction mode, uncertainties in performance parameters increased significantly because the uncertainties in experimental measurements exceed those in the thermodynamic reference values. Main combustor and fuel preburner theoretical performance values had uncertainties of about 0.5%, while the oxidizer preburner had nearly 2%. Associated experimentally-determined performance values for all three combustors were 3% to 4%. The dominant sources of uncertainty in this mode were the propellant flowrates. These results apply only to hydrogen-oxygen combustors and should not be generalized to every propellant combination. Species for a hydrogen-oxygen system are relatively simple, thereby resulting in low thermodynamic reference value uncertainties. Hydrocarbon combustors, solid rocket motors and hybrid rocket motors have combustion gases containing complex molecules that will likely have thermodynamic reference values with large uncertainties. Thus, every chemical system should be analyzed in a manner similar to that shown in this work.
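
    The design-mode procedure reduces to a Monte Carlo loop over perturbed reference values; the sketch below uses a contrived stand-in function in place of a CEA2 run, and the enthalpy and uncertainty numbers are illustrative only:

        import numpy as np

        rng = np.random.default_rng(1)
        dhf_h2o = -241_826.0   # J/mol, reference enthalpy of formation of H2O(g)
        u_dhf = 42.0           # assumed standard uncertainty of that reference value

        def performance(dhf):
            # stand-in for a CEA2 evaluation of a performance parameter
            return 2300.0 * (1.0 + 1.0e-7 * (dhf - dhf_h2o))

        samples = performance(rng.normal(dhf_h2o, u_dhf, size=100_000))
        rel_u = 100.0 * samples.std() / samples.mean()
        print(f"mean = {samples.mean():.1f}, relative uncertainty = {rel_u:.4f} %")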

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knudsen, J.K.; Smith, C.L.

    The steps involved in incorporating parameter uncertainty into the Nuclear Regulatory Commission (NRC) accident sequence precursor (ASP) models are covered in this paper. Three different uncertainty distributions (i.e., lognormal, beta, gamma) were evaluated to determine the most appropriate distribution. From the evaluation, it was determined that the lognormal distribution will be used for the ASP models' uncertainty parameters. Selection of the uncertainty parameters for the basic events is also discussed. This paper covers the process of determining uncertainty parameters for the supercomponent basic events (i.e., basic events that are comprised of more than one component, each of which can have more than one failure mode) that are utilized in the ASP models. Once this is completed, the ASP model is ready to be utilized to propagate parameter uncertainty for event assessments.
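
    One common PRA convention (assumed here for illustration; the paper does not spell out its parameterization) characterizes each basic event by a mean failure probability and an error factor EF, from which the lognormal parameters follow directly:

        import math

        mean, ef = 1.0e-3, 10.0                  # contrived basic-event inputs
        sigma = math.log(ef) / 1.645             # EF = exp(z_0.95 * sigma)
        mu = math.log(mean) - 0.5 * sigma ** 2   # lognormal mean = exp(mu + sigma^2/2)

        median = math.exp(mu)
        p95 = math.exp(mu + 1.645 * sigma)
        print(f"median = {median:.2e}, 95th percentile = {p95:.2e}")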

  5. Advanced probabilistic methods for quantifying the effects of various uncertainties in structural response

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.

    1988-01-01

    The effects of actual variations, also called uncertainties, in geometry and material properties on the structural response of a space shuttle main engine turbopump blade are evaluated. A normal distribution was assumed to represent the uncertainties statistically. Uncertainties were assumed to be totally random, partially correlated, and fully correlated. The magnitudes of these uncertainties were represented in terms of mean and variance. Blade responses, recorded in terms of displacements, natural frequencies, and maximum stress, were evaluated and plotted in the form of probabilistic distributions under combined uncertainties. These distributions provide an estimate of the range of magnitudes of the response and the probability of occurrence of a given response. Most importantly, these distributions provide the information needed to estimate quantitatively the risk in a structural design.
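
    A minimal sketch of the sampling step, assuming a two-variable stand-in (one geometric, one material parameter) and a contrived response function; partial correlation is injected with a Cholesky factor:

        import numpy as np

        rng = np.random.default_rng(0)
        mean = np.array([1.0, 200.0])      # contrived means: thickness, modulus
        sd = np.array([0.02, 6.0])         # contrived standard deviations
        rho = 0.5                          # partially correlated case
        cov = np.array([[sd[0] ** 2, rho * sd[0] * sd[1]],
                        [rho * sd[0] * sd[1], sd[1] ** 2]])

        L = np.linalg.cholesky(cov)
        x = mean + rng.standard_normal((100_000, 2)) @ L.T

        freq = 120.0 * np.sqrt(x[:, 1] / 200.0) / x[:, 0]   # stand-in response model
        print(f"frequency: mean = {freq.mean():.2f}, sd = {freq.std():.3f}")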

  6. Probability and Confidence Trade-space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    There are two general shortcomings to the current annual sparing assessment: 1. The vehicle functions are currently assessed according to confidence targets, which can be misleading (overly conservative or optimistic). 2. The current confidence levels are arbitrarily determined and do not account for epistemic uncertainty (lack of knowledge) in the ORU failure rate. There are two major categories of uncertainty that impact the sparing assessment: (a) Aleatory uncertainty: natural variability in the distribution of actual failures around a Mean Time Between Failures (MTBF); (b) Epistemic uncertainty: lack of knowledge about the true value of an Orbital Replacement Unit's (ORU) MTBF. We propose an approach to revise confidence targets and account for both categories of uncertainty, an approach we call Probability and Confidence Trade-space (PACT) evaluation.
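
    The two layers can be separated in a few lines; everything below (the lognormal epistemic model, horizon, and stock level) is an assumption of this sketch, not the PACT implementation:

        import numpy as np

        rng = np.random.default_rng(7)
        hours, spares = 8760.0, 3             # contrived horizon and on-orbit stock
        mtbf_median, beta = 5000.0, 0.4       # assumed epistemic spread of the MTBF

        # epistemic layer: uncertain MTBF; aleatory layer: Poisson failure count
        mtbf = mtbf_median * np.exp(beta * rng.standard_normal(200_000))
        failures = rng.poisson(hours / mtbf)
        print("P(spares suffice) =", (failures <= spares).mean())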

  7. Top down arsenic uncertainty measurement in water and sediments from Guarapiranga dam (Brazil)

    NASA Astrophysics Data System (ADS)

    Faustino, M. G.; Lange, C. N.; Monteiro, L. R.; Furusawa, H. A.; Marques, J. R.; Stellato, T. B.; Soares, S. M. V.; da Silva, T. B. S. C.; da Silva, D. B.; Cotrim, M. E. B.; Pires, M. A. F.

    2018-03-01

    Assessing total arsenic measurements against a legal threshold demands more than an average-and-standard-deviation approach. Accordingly, an evaluation of the analytical measurement uncertainty was conducted in order to comply with legal requirements and to allow the balance of arsenic between the water and sediment compartments. A top-down approach to measurement uncertainty was applied to evaluate arsenic concentrations in water and sediments from the Guarapiranga dam (São Paulo, Brazil). Laboratory quality control and arsenic interlaboratory test data were used in this approach to estimate the uncertainties associated with the methodology.
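
    In a Nordtest-style top-down scheme (our assumption of the specific formula; the abstract names only the general approach), within-laboratory reproducibility from control samples is combined with a bias term from the interlaboratory tests:

        import math

        u_rw = 0.031       # relative u from routine QC control samples (contrived)
        rms_bias = 0.024   # RMS of relative biases in interlaboratory tests (contrived)
        u_cref = 0.010     # relative u of the assigned reference values (contrived)

        u_bias = math.sqrt(rms_bias ** 2 + u_cref ** 2)
        u_c = math.sqrt(u_rw ** 2 + u_bias ** 2)
        print(f"relative expanded uncertainty (k=2): {200 * u_c:.1f} %")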

  8. Interval-parameter semi-infinite fuzzy-stochastic mixed-integer programming approach for environmental management under multiple uncertainties.

    PubMed

    Guo, P; Huang, G H

    2010-03-01

    In this study, an interval-parameter semi-infinite fuzzy-chance-constrained mixed-integer linear programming (ISIFCIP) approach is developed for supporting long-term planning of waste-management systems under multiple uncertainties in the City of Regina, Canada. The method improves upon the existing interval-parameter semi-infinite programming (ISIP) and fuzzy-chance-constrained programming (FCCP) by incorporating uncertainties expressed as dual uncertainties of functional intervals and multiple uncertainties of distributions with a fuzzy-interval admissible probability of violating constraints within a general optimization framework. The binary-variable solutions represent the decisions of waste-management-facility expansion, and the continuous ones are related to decisions of waste-flow allocation. The interval solutions can help decision-makers to obtain multiple decision alternatives, as well as provide bases for further analyses of tradeoffs between waste-management cost and system-failure risk. In the application to the City of Regina, two scenarios are considered. In Scenario 1, the City's waste-management practices would be based on the existing policy over the next 25 years; the total diversion rate for the residential waste would be approximately 14%. Scenario 2 is associated with a policy for waste minimization and diversion, where 35% diversion of residential waste should be achieved within 15 years, and 50% diversion over 25 years. In this scenario, not only the landfill but also the CF and MRF would be expanded. Through the scenario analyses, useful decision support for the City's solid-waste managers and decision-makers has been generated. Three special characteristics of the proposed method make it unique compared with other optimization techniques that deal with uncertainties. Firstly, it is useful for tackling multiple uncertainties expressed as intervals, functional intervals, probability distributions, fuzzy sets, and their combinations; secondly, it can address the temporal variations of the functional intervals; thirdly, it can facilitate dynamic analysis for decisions of facility-expansion planning and waste-flow allocation within a multi-facility, multi-period and multi-option context. Copyright 2009 Elsevier Ltd. All rights reserved.
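
    The interval-solution idea can be miniaturized to a toy waste-allocation LP: solving at the two extremes of an interval-valued cost yields the lower- and upper-bound decisions (all numbers and the scipy solver are illustrative assumptions, not the ISIFCIP model):

        from scipy.optimize import linprog

        # allocate >= 100 t/day between landfill (x1) and a facility (x2)
        for landfill_cost in (30.0, 42.0):            # interval cost [30, 42] $/t
            res = linprog(c=[landfill_cost, 38.0],    # facility cost fixed at 38 $/t
                          A_ub=[[-1.0, -1.0]], b_ub=[-100.0],
                          bounds=[(0.0, 80.0), (0.0, 60.0)])
            print(landfill_cost, res.x, res.fun)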

  9. Assessing and reducing hydrogeologic model uncertainty

    USDA-ARS?s Scientific Manuscript database

    NRC is sponsoring research that couples model abstraction techniques with model uncertainty assessment methods. Insights and information from this program will be useful in decision making by NRC staff, licensees and stakeholders in their assessment of subsurface radionuclide transport. All analytic...

  10. Plurality of Type A evaluations of uncertainty

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Pintar, Adam L.

    2017-10-01

    The evaluations of measurement uncertainty involving the application of statistical methods to measurement data (Type A evaluations as specified in the Guide to the Expression of Uncertainty in Measurement, GUM) comprise the following three main steps: (i) developing a statistical model that captures the pattern of dispersion or variability in the experimental data, and that relates the data either to the measurand directly or to some intermediate quantity (input quantity) that the measurand depends on; (ii) selecting a procedure for data reduction that is consistent with this model and that is fit for the purpose that the results are intended to serve; (iii) producing estimates of the model parameters, or predictions based on the fitted model, and evaluations of uncertainty that qualify either those estimates or these predictions, and that are suitable for use in subsequent uncertainty propagation exercises. We illustrate these steps in uncertainty evaluations related to the measurement of the mass fraction of vanadium in a bituminous coal reference material, including the assessment of the homogeneity of the material, and to the calibration and measurement of the amount-of-substance fraction of a hydrochlorofluorocarbon in air, and of the age of a meteorite. Our goal is to expose the plurality of choices that can reasonably be made when taking each of the three steps outlined above, and to show that different choices typically lead to different estimates of the quantities of interest, and to different evaluations of the associated uncertainty. In all the examples, the several alternatives considered represent choices that comparably competent statisticians might make, but who differ in the assumptions that they are prepared to rely on, and in their selection of approach to statistical inference. They represent also alternative treatments that the same statistician might give to the same data when the results are intended for different purposes.

  11. A multistage stochastic programming model for a multi-period strategic expansion of biofuel supply chain under evolving uncertainties

    DOE PAGES

    Xie, Fei; Huang, Yongxi

    2018-02-04

    Here, we develop a multistage, stochastic mixed-integer model to support biofuel supply chain expansion under evolving uncertainties. By utilizing the block-separable recourse property, we reformulate the multistage program as an equivalent two-stage program and solve it using an enhanced nested decomposition method with maximal non-dominated cuts. We conduct extensive numerical experiments and demonstrate the application of the model and algorithm in a case study based on the South Carolina settings. The value of the multistage stochastic programming method is also explored by comparing the model solution with the counterparts of an expected-value-based deterministic model and a two-stage stochastic model.

  13. Position uncertainty distribution for articulated arm coordinate measuring machine based on simplified definite integration

    NASA Astrophysics Data System (ADS)

    You, Xu; Zhi-jian, Zong; Qun, Gao

    2018-07-01

    This paper describes a methodology for the position uncertainty distribution of an articulated arm coordinate measuring machine (AACMM). First, a model of the structural parameter uncertainties was established by a statistical method. Second, the position uncertainty space volume of the AACMM in a given configuration was expressed using a simplified definite integration method based on the structural parameter uncertainties; it was then used to evaluate the position accuracy of the AACMM in that configuration. Third, the configurations for a given working point were calculated by an inverse solution, and the position uncertainty distribution of the working point was determined; working-point uncertainty can be evaluated by the weighting method. Lastly, the position uncertainty distribution in the workspace of the AACMM was described by a map. A single-point contrast test of a 6-joint AACMM was carried out to verify the effectiveness of the proposed method; the results showed that the method can describe the position uncertainty of the AACMM and can be used to guide the calibration of the AACMM and the choice of the AACMM's accuracy area.
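
    A crude Monte Carlo analogue of the position-uncertainty evaluation (a planar three-joint stand-in with contrived tolerances, not the paper's simplified definite integration over a full AACMM model):

        import numpy as np

        rng = np.random.default_rng(2)

        def tip_position(lengths, angles):
            # planar serial arm: accumulate joint angles, sum link vectors
            th = np.cumsum(angles, axis=-1)
            return np.stack([(lengths * np.cos(th)).sum(axis=-1),
                             (lengths * np.sin(th)).sum(axis=-1)], axis=-1)

        link = np.array([0.60, 0.55, 0.10])     # nominal link lengths, m (contrived)
        q = np.array([0.3, -0.8, 1.2])          # one configuration, rad
        u_len, u_ang = 20e-6, 50e-6             # assumed standard uncertainties

        lengths = link + u_len * rng.standard_normal((100_000, 3))
        angles = q + u_ang * rng.standard_normal((100_000, 3))
        p = tip_position(lengths, angles)
        print("tip position standard uncertainty (m):", p.std(axis=0))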

  14. Accounting for uncertainty in marine reserve design.

    PubMed

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.

  15. Optimal control of native predators

    USGS Publications Warehouse

    Martin, Julien; O'Connell, Allan F.; Kendall, William L.; Runge, Michael C.; Simons, Theodore R.; Waldstein, Arielle H.; Schulte, Shiloh A.; Converse, Sarah J.; Smith, Graham W.; Pinion, Timothy; Rikard, Michael; Zipkin, Elise F.

    2010-01-01

    We apply decision theory in a structured decision-making framework to evaluate how control of raccoons (Procyon lotor), a native predator, can promote the conservation of a declining population of American Oystercatchers (Haematopus palliatus) on the Outer Banks of North Carolina. Our management objective was to maintain Oystercatcher productivity above a level deemed necessary for population recovery while minimizing raccoon removal. We evaluated several scenarios including no raccoon removal, and applied an adaptive optimization algorithm to account for parameter uncertainty. We show how adaptive optimization can be used to account for uncertainties about how raccoon control may affect Oystercatcher productivity. Adaptive management can reduce this type of uncertainty and is particularly well suited for addressing controversial management issues such as native predator control. The case study also offers several insights that may be relevant to the optimal control of other native predators. First, we found that stage-specific removal policies (e.g., yearling versus adult raccoon removals) were most efficient if the reproductive values among stage classes were very different. Second, we found that the optimal control of raccoons would result in higher Oystercatcher productivity than the minimum levels recommended for this species. Third, we found that removing more raccoons initially minimized the total number of removals necessary to meet long term management objectives. Finally, if for logistical reasons managers cannot sustain a removal program by removing a minimum number of raccoons annually, managers may run the risk of creating an ecological trap for Oystercatchers.

  16. Bayesian methods for uncertainty factor application for derivation of reference values.

    PubMed

    Simon, Ted W; Zhu, Yiliang; Dourson, Michael L; Beck, Nancy B

    2016-10-01

    In 2014, the National Research Council (NRC) published Review of EPA's Integrated Risk Information System (IRIS) Process, which considers methods EPA uses for developing toxicity criteria for non-carcinogens. These criteria are the Reference Dose (RfD) for oral exposure and the Reference Concentration (RfC) for inhalation exposure. The NRC review suggested using Bayesian methods for application of uncertainty factors (UFs) to adjust the point of departure dose or concentration to a level considered to be without adverse effects for the human population. The NRC foresaw that Bayesian methods would be potentially useful for combining toxicity data from disparate sources: high-throughput assays, animal testing, and observational epidemiology. UFs represent five distinct areas for which both adjustment and consideration of uncertainty may be needed. The NRC suggested UFs could be represented as Bayesian prior distributions, illustrated the use of a log-normal distribution to represent the composite UF, and combined this distribution with a log-normal distribution representing uncertainty in the point of departure (POD) to reflect the overall uncertainty. Here, we explore these suggestions and present a refinement of the methodology suggested by the NRC that considers each individual UF as a distribution. From an examination of 24 evaluations from EPA's IRIS program, when individual UFs were represented using this approach, the geometric mean fold change in the value of the RfD or RfC increased from 3 to over 30, depending on the number of individual UFs used and the sophistication of the assessment. We present example calculations and recommendations for implementing the refined NRC methodology. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
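
    The flavor of the refinement can be reproduced in a few lines (the lognormal shape, the choice of 10 as each UF's 95th percentile, and the number of UFs are all assumptions of this sketch): multiplying UF distributions yields a composite far smaller than the traditional product of point values:

        import numpy as np

        rng = np.random.default_rng(3)
        n_uf = 2
        sigma = np.log(10.0) / 1.645    # each UF lognormal: median 1, 95th pct 10

        composite = np.exp(sigma * rng.standard_normal((500_000, n_uf))).prod(axis=1)
        print("traditional product:", 10 ** n_uf,
              " distributional 95th percentile:", round(np.quantile(composite, 0.95), 1))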

  17. On the Application of Science Systems Engineering and Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Schlegel, Nicole-Jeanne; Boening, Carmen; Larour, Eric; Limonadi, Daniel; Schodlok, Michael; Seroussi, Helene; Watkins, Michael

    2017-04-01

    Research and development activities at the Jet Propulsion Laboratory (JPL) currently support the creation of a framework to formally evaluate the observational needs within earth system science. One of the pilot projects of this effort aims to quantify uncertainties in global mean sea level rise projections, due to contributions from the continental ice sheets. Here, we take advantage of established uncertainty quantification tools embedded within the JPL-University of California at Irvine Ice Sheet System Model (ISSM). We conduct sensitivity and Monte-Carlo style sampling experiments on forward simulations of the Greenland and Antarctic ice sheets. By varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges, we assess the impact of the different parameter ranges on century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.

  18. Thermonuclear 19F(p,α0)16O reaction rate

    NASA Astrophysics Data System (ADS)

    He, Jian-Jun; Lombardo, Ivano; Dell'Aquila, Daniele; Xu, Yi; Zhang, Li-Yong; Liu, Wei-Ping

    2018-01-01

    The thermonuclear 19F(p,α0)16O reaction rate in the temperature region 0.007-10 GK has been derived by re-evaluating the available experimental data, together with the low-energy theoretical R-matrix extrapolations. Our new rate deviates by up to about 30% compared to the previous results, although all rates are consistent within the uncertainties. At very low temperature (e.g. 0.01 GK) our reaction rate is about 20% lower than the most recently published rate, because of a difference in the low-energy extrapolated S-factor and a more accurate estimate of the reduced mass used in the calculation of the reaction rate. At temperatures above ~1 GK, our rate is lower, for instance, by about 20% around 1.75 GK, because we have re-evaluated the previous data (Isoya et al., Nucl. Phys. 7, 116 (1958)) in a meticulous way. The present interpretation is supported by the direct experimental data. The uncertainties of the present evaluated rate are estimated to be about 20% in the temperature region below 0.2 GK, and are mainly caused by the lack of low-energy experimental data and the large uncertainties in the existing data. Asymptotic giant branch (AGB) stars evolve at temperatures below 0.2 GK, where the 19F(p,α)16O reaction may play a very important role. However, the current accuracy of the reaction rate is insufficient to describe carefully the fluorine over-abundances observed in AGB stars. Precise cross section (or S-factor) data in the low-energy region are therefore needed for astrophysical nucleosynthesis studies. Supported by the National Natural Science Foundation of China (11490562, 11490560, 11675229) and the National Key Research and Development Program of China (2016YFA0400503).

  19. Evaluating the uncertainty of input quantities in measurement models

    NASA Astrophysics Data System (ADS)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in uncertainty propagation exercises. In this we deviate markedly and emphatically from the GUM Supplement 1, which gives pride of place to the Principle of Maximum Entropy as a means to assign probability distributions to input quantities.

  20. Modeling sustainability in renewable energy supply chain systems

    NASA Astrophysics Data System (ADS)

    Xie, Fei

    This dissertation aims at modeling the sustainability of renewable fuel supply chain systems against emerging challenges. In particular, the dissertation focuses on biofuel supply chain system design, developing advanced modeling frameworks and corresponding solution methods for the challenges of sustaining biofuel supply chain systems. These challenges include: (1) integrating "environmental thinking" into long-term biofuel supply chain planning; (2) adopting multimodal transportation to mitigate seasonality in biofuel supply chain operations; (3) providing strategies for hedging against uncertainty in conversion technology; and (4) developing methodologies for long-term sequential planning of the biofuel supply chain under uncertainties. All models are mixed-integer programs, involving multi-objective programming and two-stage/multistage stochastic programming methods. In particular, for long-term sequential planning under uncertainties, to reduce the computational challenges due to the exponential expansion of the scenario tree, I also developed an efficient ND-Max method that outperforms CPLEX and the nested decomposition method. Through result analysis of four independent studies, it is found that the proposed modeling frameworks can effectively improve economic performance, enhance environmental benefits and reduce risks due to system uncertainties for biofuel supply chain systems.

  1. Informative Bayesian Type A uncertainty evaluation, especially applicable to a small number of observations

    NASA Astrophysics Data System (ADS)

    Cox, M.; Shirono, K.

    2017-10-01

    A criticism levelled at the Guide to the Expression of Uncertainty in Measurement (GUM) is that it is based on a mixture of frequentist and Bayesian thinking. In particular, the GUM’s Type A (statistical) uncertainty evaluations are frequentist, whereas the Type B evaluations, using state-of-knowledge distributions, are Bayesian. In contrast, making the GUM fully Bayesian implies, among other things, that a conventional objective Bayesian approach to Type A uncertainty evaluation for a number n of observations leads to the impractical consequence that n must be at least equal to 4, thus presenting a difficulty for many metrologists. This paper presents a Bayesian analysis of Type A uncertainty evaluation that applies for all n ≥ 2, as in the frequentist analysis in the current GUM. The analysis is based on assuming that the observations are drawn from a normal distribution (as in the conventional objective Bayesian analysis), but uses an informative prior based on lower and upper bounds for the standard deviation of the sampling distribution for the quantity under consideration. The main outcome of the analysis is a closed-form mathematical expression for the factor by which the standard deviation of the mean observation should be multiplied to calculate the required standard uncertainty. Metrological examples are used to illustrate the approach, which is straightforward to apply using a formula or look-up table.
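
    A brute-force numerical version of the idea (grid integration instead of the paper's closed form; the data and the flat-between-bounds prior on sigma are assumptions of this sketch):

        import numpy as np

        x = np.array([4.98, 5.07])               # n = 2 contrived observations
        n, s = len(x), np.std(x, ddof=1)
        s_lo, s_hi = 0.02, 0.20                  # assumed prior bounds on sigma

        mu = np.linspace(x.mean() - 1.0, x.mean() + 1.0, 2001)
        sig = np.linspace(s_lo, s_hi, 400)
        M, S = np.meshgrid(mu, sig)
        loglik = -n * np.log(S) - ((x[:, None, None] - M) ** 2).sum(axis=0) / (2 * S ** 2)
        post = np.exp(loglik - loglik.max())     # flat prior on mu, flat on sigma in bounds

        dmu = mu[1] - mu[0]
        pmu = post.sum(axis=0)                   # marginal posterior of mu
        pmu /= pmu.sum() * dmu
        mean = np.sum(pmu * mu) * dmu
        u = np.sqrt(np.sum(pmu * (mu - mean) ** 2) * dmu)
        print(f"posterior u = {u:.4f} vs s/sqrt(n) = {s / np.sqrt(n):.4f}")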

  2. Flight Experiment Verification of Shuttle Boundary Layer Transition Prediction Tool

    NASA Technical Reports Server (NTRS)

    Berry, Scott A.; Berger, Karen T.; Horvath, Thomas J.; Wood, William A.

    2016-01-01

    Boundary layer transition at hypersonic conditions is critical to the design of future high-speed aircraft and spacecraft. Accurate methods to predict transition would directly impact the aerothermodynamic environments used to size a hypersonic vehicle's thermal protection system. A transition prediction tool, based on wind tunnel derived discrete roughness correlations, was developed and implemented for the Space Shuttle return-to-flight program. This tool was also used to design a boundary layer transition flight experiment in order to assess correlation uncertainties, particularly with regard to high Mach-number transition and tunnel-to-flight scaling. A review is provided of the results obtained from the flight experiment in order to evaluate the transition prediction tool implemented for the Shuttle program.

  3. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    NASA Astrophysics Data System (ADS)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational references, namely (1) the well-established EOBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For parameters of the daily temperature distribution and for the spatial pattern correlation, however, important dependencies on the reference dataset can arise. The related evaluation uncertainties can be as large as or even larger than model uncertainty. For precipitation the influence of observational uncertainty is, in general, larger than for temperature. It often dominates model uncertainty, especially for the evaluation of the wet day frequency, the spatial correlation and the shape and location of the distribution of daily values. But even the evaluation of large-scale seasonal mean values can be considerably affected by the choice of the reference. When a simple and illustrative model ranking scheme is applied to these results, it is found that RCM ranking in many cases depends on the reference dataset employed.

  4. A two-stage mixed-integer fuzzy programming with interval-valued membership functions approach for flood-diversion planning.

    PubMed

    Wang, S; Huang, G H

    2013-03-15

    Flood disasters have been extremely severe in recent decades, and they account for about one third of all natural catastrophes throughout the world. In this study, a two-stage mixed-integer fuzzy programming with interval-valued membership functions (TMFP-IMF) approach is developed for flood-diversion planning under uncertainty. TMFP-IMF integrates the fuzzy flexible programming, two-stage stochastic programming, and integer programming within a general framework. A concept of interval-valued fuzzy membership function is introduced to address complexities of system uncertainties. TMFP-IMF can not only deal with uncertainties expressed as fuzzy sets and probability distributions, but also incorporate pre-regulated water-diversion policies directly into its optimization process. TMFP-IMF is applied to a hypothetical case study of flood-diversion planning for demonstrating its applicability. Results indicate that reasonable solutions can be generated for binary and continuous variables. A variety of flood-diversion and capacity-expansion schemes can be obtained under four scenarios, which enable decision makers (DMs) to identify the most desired one based on their perceptions and attitudes towards the objective-function value and constraints. Copyright © 2013 Elsevier Ltd. All rights reserved.
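
    The two-stage skeleton, stripped of the fuzzy and integer machinery, fits in a few lines (scenario data and costs contrived; scipy's LP solver stands in for the paper's solution method): a first-stage diversion capacity x is chosen before the flood scenario is known, and scenario-wise recourse y_s covers any excess at a penalty:

        from scipy.optimize import linprog

        prob = [0.6, 0.3, 0.1]           # scenario probabilities
        flood = [50.0, 120.0, 220.0]     # flood volumes to divert per scenario
        c_x, c_y = 1.0, 4.0              # regular vs. penalty diversion cost

        # variables [x, y1, y2, y3]: min c_x*x + sum_s prob_s * c_y * y_s
        c = [c_x] + [c_y * p for p in prob]
        a_ub = [[-1, -1, 0, 0], [-1, 0, -1, 0], [-1, 0, 0, -1]]  # x + y_s >= flood_s
        b_ub = [-f for f in flood]
        res = linprog(c, A_ub=a_ub, b_ub=b_ub, bounds=[(0, None)] * 4)
        print("capacity x and recourse y_s:", res.x, " expected cost:", res.fun)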

  5. Molten nitrate salt technology development

    NASA Astrophysics Data System (ADS)

    Carling, R. W.

    1981-04-01

    This paper presents an overview of the experimental programs underway in support of the Thermal Energy Storage for Solar Thermal Applications (TESSTA) program. The experimental programs concentrate on molten nitrate salts, which have been proposed as heat-transfer and energy-storage media. The salt composition of greatest interest is drawsalt, nominally a 50-50 molar mixture of NaNO3 and KNO3 with a melting point of 220 C. Several technical uncertainties have been identified that must be resolved before nitrate-based solar plants can be commercialized. Research programs at Sandia National Laboratories, universities, and industrial suppliers have been implemented to resolve these technical uncertainties. The experimental programs involve corrosion, decomposition, physical properties, and environmental cracking. Summaries of each project, and of how they impact central receiver applications such as the repowering/industrial retrofit and cogeneration programs, are presented.

  6. Application of automated measurement and verification to utility energy efficiency program data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Touzani, Samir; Fernandes, Samuel

    2017-02-17

    Trustworthy savings calculations are critical to convincing regulators of both the cost-effectiveness of energy efficiency program investments and their ability to defer supply-side capital investments. Today’s methods for measurement and verification (M&V) of energy savings constitute a significant portion of the total costs of energy efficiency programs. They also require time-consuming data acquisition. A spectrum of savings calculation approaches is used, with some relying more heavily on measured data and others relying more heavily on estimated, modeled, or stipulated data. The increasing availability of “smart” meters and devices that report near-real time data, combined with new analytical approaches to quantify savings, offers the potential to conduct M&V more quickly and at lower cost, with comparable or improved accuracy. Commercial energy management and information systems (EMIS) technologies are beginning to offer these ‘M&V 2.0’ capabilities, and program administrators want to understand how they might assist programs in quickly and accurately measuring energy savings. This paper presents the results of recent testing of the ability to use automation to streamline the M&V process. In this paper, we apply an automated whole-building M&V tool to historic data sets from energy efficiency programs to begin to explore the accuracy, cost, and time trade-offs between more traditional M&V, and these emerging streamlined methods that use high-resolution energy data and automated computational intelligence. For the data sets studied we evaluate the fraction of buildings that are well suited to automated baseline characterization, the uncertainty in gross savings that is due to M&V 2.0 tools’ model error, and indications of labor time savings, and how the automated savings results compare to prior, traditionally determined savings results. The results show that 70% of the buildings were well suited to the automated approach. In a majority of the cases (80%) savings and uncertainties for each individual building were quantified to levels above the criteria in ASHRAE Guideline 14. In addition the findings suggest that M&V 2.0 methods may also offer time-savings relative to traditional approaches. Lastly, we discuss the implications of these findings relative to the potential evolution of M&V, and pilots currently being launched to test how M&V automation can be integrated into ratepayer-funded programs and professional implementation and evaluation practice.
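
    The core of an automated whole-building baseline fit can be sketched as below; the model form (a single cooling change point), the synthetic data, and the CV(RMSE) check are assumptions of this illustration, not the tool evaluated in the paper:

        import numpy as np

        rng = np.random.default_rng(11)

        def features(temp):
            # intercept plus cooling load above an assumed 18 C change point
            return np.column_stack([np.ones(temp.size), np.clip(temp - 18.0, 0.0, None)])

        t_pre = rng.uniform(5.0, 30.0, 365)                    # daily mean temperature
        e_pre = 120.0 + 4.0 * np.clip(t_pre - 18.0, 0.0, None) + rng.normal(0.0, 6.0, 365)

        beta, *_ = np.linalg.lstsq(features(t_pre), e_pre, rcond=None)
        resid = e_pre - features(t_pre) @ beta
        cvrmse = resid.std(ddof=2) / e_pre.mean()              # Guideline 14-style metric

        t_post = rng.uniform(5.0, 30.0, 365)                   # post-retrofit year
        e_post = 110.0 + 3.2 * np.clip(t_post - 18.0, 0.0, None) + rng.normal(0.0, 6.0, 365)
        savings = (features(t_post) @ beta - e_post).sum()     # avoided energy use
        print(f"savings = {savings:.0f} kWh, baseline CV(RMSE) = {100 * cvrmse:.1f} %")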

  8. Robust portfolio selection based on asymmetric measures of variability of stock returns

    NASA Astrophysics Data System (ADS)

    Chen, Wei; Tan, Shaohua

    2009-10-01

    This paper addresses a new uncertainty set, the interval random uncertainty set, for robust optimization. The form of the interval random uncertainty set makes it suitable for capturing the downside and upside deviations of real-world data. These deviation measures capture distributional asymmetry and lead to better optimization results. We also apply our interval random chance-constrained programming to robust mean-variance portfolio selection under interval random uncertainty sets in the elements of the mean vector and covariance matrix. Numerical experiments with real market data indicate that our approach results in better portfolio performance.

  9. Evaluation of thyroid radioactivity measurement data from Hanford workers, 1944--1946

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ikenberry, T.A.

    1991-05-01

    This report describes the preliminary results of an evaluation conducted in support of the Hanford Environmental Dose Reconstruction (HEDR) Project. The primary objective of the HEDR Project is to estimate the radiation doses that populations could have received from nuclear operations at the Hanford Site since 1944. A secondary objective is to make the information that HEDR staff members used in estimating radiation doses available to the public. The objectives of this report are to make available thyroid measurement data from Hanford workers for the years 1944 through 1946, and to investigate the suitability of those data for use in the HEDR dose estimation process. An important part of this investigation was to provide a description of the uncertainty associated with the data. Lack of documentation on thyroid measurements from this period required that assumptions be made to perform the data evaluations. These assumptions introduce uncertainty into the evaluations that could be significant. It is important to recognize the nature of these assumptions, the inherent uncertainty, and the propagation of this uncertainty through the data evaluations to any conclusions that can be made by using the data. 15 refs., 1 fig., 5 tabs.

  10. GUM-compliant uncertainty propagations for Pu and U concentration measurements using the 1st-prototype XOS/LANL hiRX instrument; an SRNL H-Canyon Test Bed performance evaluation project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, Michael K.; O'Rourke, Patrick E.

    An SRNL H-Canyon Test Bed performance evaluation project was completed jointly by SRNL and LANL on a prototype monochromatic energy dispersive x-ray fluorescence instrument, the hiRX. A series of uncertainty propagations were generated based upon plutonium and uranium measurements performed using the alpha-prototype hiRX instrument. Data reduction and uncertainty modeling provided in this report were performed by the SRNL authors. Observations and lessons learned from this evaluation were also used to predict the expected uncertainties that should be achievable at multiple plutonium and uranium concentration levels provided instrument hardware and software upgrades being recommended by LANL and SRNL are performed.
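
    The GUM-compliant combination itself is compact; the budget entries below are contrived placeholders (the report's actual components and sensitivities are not reproduced):

        import math

        # (sensitivity coefficient, standard uncertainty) per component, contrived
        budget = {
            "calibration curve": (1.00, 0.45),
            "sample positioning": (1.00, 0.30),
            "counting statistics": (1.00, 0.25),
            "density correction": (0.80, 0.20),
        }

        u_c = math.sqrt(sum((c * u) ** 2 for c, u in budget.values()))
        U = 2.0 * u_c   # expanded uncertainty, k = 2 (~95 % coverage)
        print(f"u_c = {u_c:.2f} g/L, U(k=2) = {U:.2f} g/L")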

  11. An uncertainty analysis of wildfire modeling [Chapter 13

    Treesearch

    Karin Riley; Matthew Thompson

    2017-01-01

    Before fire models can be understood, evaluated, and effectively applied to support decision making, model-based uncertainties must be analyzed. In this chapter, we identify and classify sources of uncertainty using an established analytical framework, and summarize results graphically in an uncertainty matrix. Our analysis facilitates characterization of the...

  12. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    NASA Astrophysics Data System (ADS)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. We also seek to identify data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander, in Southwest Germany, an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as 'virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Monte Carlo simulations with these models are then used to estimate the predictive uncertainty. Results indicate that: uncertainty in HETT is relatively small for early times and increases with transit times; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model ('virtual reality') is then developed based on that conceptual model. This complex model then serves as the basis to compare simpler model structures. Through this approach, predictive uncertainty can be quantified relative to a known reference solution.

  13. An Information Search Model of Evaluative Concerns in Intergroup Interaction

    ERIC Educational Resources Information Center

    Vorauer, Jacquie D.

    2006-01-01

    In an information search model, evaluative concerns during intergroup interaction are conceptualized as a joint function of uncertainty regarding and importance attached to out-group members' views of oneself. High uncertainty generally fosters evaluative concerns during intergroup exchanges. Importance depends on whether out-group members'…

  14. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified, and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
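
    The Pareto-front extraction over Monte Carlo-averaged costs is the easy part to sketch (random placeholder costs; the scheduling optimizer itself is not reproduced):

        import numpy as np

        rng = np.random.default_rng(5)
        costs = rng.uniform(0.0, 1.0, (50, 2))   # (mean delay, mean interventions)

        def pareto_front(c):
            # keep points not dominated by any point that is <= in both costs
            # and strictly < in at least one
            keep = []
            for i, p in enumerate(c):
                dominated = np.any(np.all(c <= p, axis=1) & np.any(c < p, axis=1))
                if not dominated:
                    keep.append(i)
            return np.array(keep)

        front = pareto_front(costs)
        print("non-dominated schedules:", front)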

  15. Chemical purity using quantitative 1H-nuclear magnetic resonance: a hierarchical Bayesian approach for traceable calibrations

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Lippa, Katrice A.

    2016-10-01

    Chemical purity assessment using quantitative 1H-nuclear magnetic resonance spectroscopy is based on ratios of the mass and signal intensity of the analyte species to those of chemical standards of known purity. As such, it is an example of a calculation using a known measurement equation with multiple inputs. Though multiple samples are often analyzed during purity evaluations in order to assess measurement repeatability, the uncertainty evaluation must also account for contributions from the inputs to the measurement equation. Furthermore, there may be other uncertainty components inherent in the experimental design, such as independent implementation of multiple calibration standards. As such, the uncertainty evaluation is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. This hybrid form of uncertainty analysis is readily implemented with Bayesian statistical analysis. In this article we describe this type of analysis in detail and illustrate it using data from an evaluation of chemical purity and its uncertainty for a folic acid material.
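
    The underlying measurement equation is the usual internal-standard ratio form; the numbers below are contrived, and a full treatment would propagate uncertainty through each input (by GUM quadrature, Monte Carlo, or the article's Bayesian route):

        # purity of analyte a against internal standard s (contrived inputs)
        I_a, I_s = 1.000, 0.982   # integrated signal areas
        N_a, N_s = 2, 3           # protons contributing to each quantified signal
        M_a, M_s = 441.4, 204.2   # molar masses, g/mol
        m_a, m_s = 10.12, 9.87    # weighed masses, mg
        P_s = 0.9997              # mass-fraction purity of the standard

        P_a = (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s
        print(f"analyte purity = {P_a:.4f}")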

  16. Development, implementation and evaluation of a clinical research engagement and leadership capacity building program in a large Australian health care service.

    PubMed

    Misso, Marie L; Ilic, Dragan; Haines, Terry P; Hutchinson, Alison M; East, Christine E; Teede, Helena J

    2016-01-14

    Health professionals need to be integrated more effectively in clinical research to ensure that research addresses clinical needs and provides practical solutions at the coal face of care. In light of limited evidence on how best to achieve this, evaluation of strategies to introduce, adapt and sustain evidence-based practices across different populations and settings is required. This project aims to address this gap through the co-design, development, implementation, evaluation, refinement and ultimately scale-up of a clinical research engagement and leadership capacity building program in a clinical setting with little to no co-ordinated approach to clinical research engagement and education. The protocol is based on principles of research capacity building and on a six-step framework, which have previously led to successful implementation and long-term sustainability. A mixed methods study design will be used. Methods will include: (1) a review of the literature about strategies that engage health professionals in research through capacity building and/or education in research methods; (2) a review of existing local research education and support elements; (3) a needs assessment in the local clinical setting, including an online cross-sectional survey and semi-structured interviews; (4) co-design and development of an educational and support program; (5) implementation of the program in the clinical environment; and (6) pre- and post-implementation evaluation and ultimately program scale-up. The evaluation focuses on research activity; on knowledge, attitudes and preferences about clinical research, evidence-based practice and leadership; and, post-implementation, on satisfaction with the program. The investigators will evaluate the feasibility and effect of the program according to capacity building measures and will revise where appropriate prior to scale-up. It is anticipated that this clinical research engagement and leadership capacity building program will enable and enhance clinically relevant research to be led and conducted by health professionals in the health setting. This approach will also encourage identification of areas of clinical uncertainty and need that can be addressed through clinical research within the health setting.

  17. Application of Non-Deterministic Methods to Assess Modeling Uncertainties for Reinforced Carbon-Carbon Debris Impacts

    NASA Technical Reports Server (NTRS)

    Lyle, Karen H.; Fasanella, Edwin L.; Melis, Matthew; Carney, Kelly; Gabrys, Jonathan

    2004-01-01

    The Space Shuttle Columbia Accident Investigation Board (CAIB) made several recommendations for improving the NASA Space Shuttle Program. An extensive experimental and analytical program has been developed to address two recommendations related to structural impact analysis. The objective of the present work is to demonstrate the application of probabilistic analysis to assess the effect of uncertainties on debris impacts on Space Shuttle Reinforced Carbon-Carbon (RCC) panels. The probabilistic analysis is used to identify the material modeling parameters controlling the uncertainty. A comparison of the finite element results with limited experimental data provided confidence that the simulations were adequately representing the global response of the material. Five input parameters were identified as significantly controlling the response.

  20. SeaWiFS technical report series. Volume 16: The second SeaWiFS Intercalibration Round-Robin Experiment, SIRREX-2, June 1993

    NASA Technical Reports Server (NTRS)

    Hooker, Stanford B. (Editor); Firestone, Elaine R. (Editor); Mueller, James L.; Mclean, James T.; Johnson, B. Carol; Westphal, Todd L.; Cooper, John W.

    1994-01-01

    The results of the second Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Intercalibration Round-Robin Experiment (SIRREX-2), which was held at the Center for Hydro-Optics and Remote Sensing (CHORS) at San Diego State University on 14-25 June 1993, are presented. SeaWiFS is an ocean color radiometer that is scheduled for launch in 1994. The SIRREXs are part of the SeaWiFS Calibration and Validation Program that includes the GSFC, CHORS, NIST, and several other laboratories. GSFC maintains the radiometric scales (spectral radiance and irradiance) for the SeaWiFS program using spectral irradiance standard lamps, which are calibrated by NIST. The purpose of each SIRREX is to assure that the radiometric scales realized by the laboratories that participate in the SeaWiFS Calibration and Validation Program are correct; that is, the uncertainties of the radiometric scales are such that measurements of normalized water-leaving radiance using oceanographic radiometers have uncertainties of 5%. SIRREX-1 demonstrated, from the internal consistency of the results, that the program goals would not be met without improvements to the instrumentation. The results of SIRREX-2 demonstrate that spectral irradiance scales realized using the GSFC standard irradiance lamp (F269) are consistent with the program goals, as the uncertainty of these measurements is assessed to be about 1%. However, this is not true for the spectral radiance scales, where again the internal consistency of the results is used to assess the uncertainty. This is attributed to inadequate performance and characterization of the instrumentation. For example, spatial nonuniformities, spectral features, and sensitivity to illumination configuration were observed in some of the integrating sphere sources. The results of SIRREX-2 clearly indicate the direction for future work, with the main emphasis on instrument characterization and the assessment of the measurement uncertainties so that the results may be stated in a more definitive manner.

  1. IAEA Coordinated Research Project on HTGR Reactor Physics, Thermal-hydraulics and Depletion Uncertainty Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strydom, Gerhard; Bostelmann, F.

    The continued development of High Temperature Gas Cooled Reactors (HTGRs) requires verification of HTGR design and safety features with reliable high fidelity physics models and robust, efficient, and accurate codes. The predictive capability of coupled neutronics/thermal-hydraulics and depletion simulations for reactor design and safety analysis can be assessed with sensitivity analysis (SA) and uncertainty analysis (UA) methods. Uncertainty originates from errors in physical data, manufacturing uncertainties, and modelling and computational algorithms. (The interested reader is referred to the large body of published SA and UA literature for a more complete overview of the various types of uncertainties, methodologies and results obtained.) SA is helpful for ranking the various sources of uncertainty and error in the results of core analyses. SA and UA are required to address cost, safety, and licensing needs and should be applied to all aspects of reactor multi-physics simulation. SA and UA can guide experimental, modelling, and algorithm research and development. Current SA and UA rely either on stochastic sampling methods or on derivative-based methods such as generalized perturbation theory to obtain sensitivity coefficients. Neither approach addresses all needs. In order to benefit from recent advances in modelling and simulation and the availability of new covariance data (nuclear data uncertainties), extensive sensitivity and uncertainty studies are needed for quantification of the impact of different sources of uncertainties on the design and safety parameters of HTGRs. Only a parallel effort in advanced simulation and in nuclear data improvement will be able to provide designers with more robust and well validated calculation tools to meet design target accuracies. In February 2009, the Technical Working Group on Gas-Cooled Reactors (TWG-GCR) of the International Atomic Energy Agency (IAEA) recommended that the proposed Coordinated Research Program (CRP) on the HTGR Uncertainty Analysis in Modelling (UAM) be implemented. This CRP is a continuation of the previous IAEA and Organization for Economic Co-operation and Development (OECD)/Nuclear Energy Agency (NEA) international activities on Verification and Validation (V&V) of available analytical capabilities for HTGR simulation for design and safety evaluations. Within the framework of these activities, different numerical and experimental benchmark problems were performed and insight was gained about specific physics phenomena and the adequacy of analysis methods.

  2. Research on uncertainty evaluation measure and method of voltage sag severity

    NASA Astrophysics Data System (ADS)

    Liu, X. N.; Wei, J.; Ye, S. Y.; Chen, B.; Long, C.

    2018-01-01

    Voltage sag is an inevitable and serious power quality problem in power systems. This paper provides a general summary and review of the concepts, indices and evaluation methods for voltage sag severity. Considering the complexity and uncertainty of the influencing factors and damage degree, and the characteristics and requirements of voltage sag severity across the power source, network and load sides, the measure concepts and their conditions of existence, as well as the evaluation indices and methods of voltage sag severity, are analyzed. Current evaluation techniques, such as stochastic theory and fuzzy logic, as well as their fusion, are reviewed in detail. An index system for voltage sag severity is provided for comprehensive study. The main aim of this paper is to propose a research direction and method for severity assessment based on advanced uncertainty theory and uncertainty measures. This study may serve as a valuable guide for researchers interested in the domain of voltage sag severity.

  3. Multivariate Probabilistic Analysis of an Hydrological Model

    NASA Astrophysics Data System (ADS)

    Franceschini, Samuela; Marani, Marco

    2010-05-01

    Model predictions based on rainfall measurements and hydrological model results are often limited by the systematic error of measuring instruments, by the intrinsic variability of the natural processes and by the uncertainty of the mathematical representation. We propose a means to identify such sources of uncertainty and to quantify their effects based on point-estimate approaches, as a valid alternative to cumbersome Monte Carlo methods. We present uncertainty analyses of the hydrologic response to selected meteorological events, in the mountain streamflow-generating portion of the Brenta basin at Bassano del Grappa, Italy. The Brenta river catchment has a relatively uniform morphology and quite a heterogeneous rainfall pattern. In the present work, we evaluate two sources of uncertainty: data uncertainty (the uncertainty due to data handling and analysis) and model uncertainty (the uncertainty related to the formulation of the model). We thus evaluate the effects of the measurement error of tipping-bucket rain gauges, the uncertainty in estimating spatially distributed rainfall through block kriging, and the uncertainty associated with estimated model parameters. To this end, we coupled a deterministic model based on the geomorphological theory of the hydrologic response with probabilistic methods. In particular, we compare the results of Monte Carlo Simulations (MCS) to the results obtained, under the same conditions, using Li's Point Estimate Method (LiM). The LiM is a probabilistic technique that approximates the continuous probability distribution function of the considered stochastic variables by means of discrete points and associated weights. This allows results to be reproduced satisfactorily with only a few evaluations of the model function. The comparison between the LiM and MCS results highlights the pros and cons of using an approximating method. LiM is less computationally demanding than MCS, but has limited applicability, especially when the model response is highly nonlinear. Higher-order approximations can provide more accurate estimates, but reduce the numerical advantage of the LiM. The results of the uncertainty analysis identify the main sources of uncertainty in the computation of river discharge. In this particular case, the spatial variability of rainfall and the uncertainty in model parameters are shown to have the greatest impact on discharge evaluation. This, in turn, highlights the need to support any estimated hydrological response with probability information and risk analysis results in order to provide a robust, systematic framework for decision making.
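    As a minimal illustration of the trade-off described above, the sketch below (Python, with an invented stand-in model g and invented input statistics, not the Brenta basin model) contrasts a Monte Carlo estimate of an output's mean and standard deviation with a Rosenblueth-style two-point estimate, the family of methods to which LiM belongs; the point-estimate version needs only 2^n model runs for n inputs.

      import numpy as np

      rng = np.random.default_rng(42)

      def g(rainfall, parameter):
          # Stand-in nonlinear model: peak discharge from two uncertain inputs.
          return 0.8 * rainfall**1.3 / (1.0 + parameter)

      mu = np.array([50.0, 0.5])     # means of the two uncertain inputs
      sigma = np.array([10.0, 0.1])  # standard deviations (assumed independent)

      # Monte Carlo: many model evaluations
      samples = rng.normal(mu, sigma, size=(100_000, 2))
      mc = g(samples[:, 0], samples[:, 1])
      print(f"MCS: mean={mc.mean():.2f}  std={mc.std():.2f}  (1e5 runs)")

      # Two-point estimate: 2**n evaluations at mu +/- sigma, equal weights
      evals = np.array([g(mu[0] + s1 * sigma[0], mu[1] + s2 * sigma[1])
                        for s1 in (-1.0, 1.0) for s2 in (-1.0, 1.0)])
      pe_mean = evals.mean()
      pe_std = np.sqrt((evals**2).mean() - pe_mean**2)
      print(f"PEM: mean={pe_mean:.2f}  std={pe_std:.2f}  (4 runs)")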

  4. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented, and two solution strategies are described: a fuzzy transformation via a ranking function, and a stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.
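    The α-cut step can be made concrete with a small sketch. In the code below (all coefficients invented), triangular fuzzy objective coefficients are reduced to an interval at a chosen α level, and the pessimistic and optimistic crisp linear programs are solved with scipy; the ranking-function machinery and Sen's method are not reproduced here.

      from scipy.optimize import linprog

      def alpha_cut(l, m, u, alpha):
          # Interval of a triangular fuzzy number (l, m, u) at level alpha.
          return l + alpha * (m - l), u - alpha * (u - m)

      alpha = 0.6
      # Fuzzy objective coefficients for: maximize c1*x1 + c2*x2
      c1_lo, c1_hi = alpha_cut(2.0, 3.0, 4.0, alpha)
      c2_lo, c2_hi = alpha_cut(4.0, 5.0, 7.0, alpha)

      A_ub = [[1.0, 2.0], [3.0, 1.0]]   # crisp constraints for simplicity
      b_ub = [14.0, 15.0]

      for label, c in [("pessimistic", [-c1_lo, -c2_lo]),   # linprog minimizes
                       ("optimistic",  [-c1_hi, -c2_hi])]:
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 2)
          print(f"{label}: x={res.x.round(3)}  objective={-res.fun:.3f}")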

  5. Robust Programming Problems Based on the Mean-Variance Model Including Uncertainty Factors

    NASA Astrophysics Data System (ADS)

    Hasuike, Takashi; Ishii, Hiroaki

    2009-01-01

    This paper considers robust programming problems based on the mean-variance model including uncertainty sets and fuzzy factors. Since these problems are not well defined due to the fuzzy factors, it is hard to solve them directly. Therefore, by introducing chance constraints, fuzzy goals and possibility measures, the proposed models are transformed into deterministic equivalent problems. Furthermore, in order to solve these equivalent problems efficiently, a solution method is constructed by introducing the mean-absolute deviation and performing equivalent transformations.
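    A hedged sketch of the mean-absolute deviation (MAD) device mentioned above: replacing variance with MAD makes the portfolio problem linear, so it can be handed to an LP solver. The scenario returns are synthetic, and the paper's fuzzy goals and chance constraints are not reproduced.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(4)
      T, n = 60, 3                                   # scenarios, assets
      r = rng.normal([0.05, 0.08, 0.12], [0.05, 0.10, 0.20], size=(T, n))
      mu = r.mean(axis=0)
      target = 0.07                                  # required expected return

      # Variables: x (n weights) and d (T absolute-deviation auxiliaries).
      # minimize (1/T) sum_t d_t  subject to  d_t >= |(r_t - mu) @ x|,
      # sum x = 1, mu @ x >= target, x >= 0.
      c = np.concatenate([np.zeros(n), np.ones(T) / T])
      A_ub = np.vstack([
          np.hstack([(r - mu), -np.eye(T)]),         # (r_t - mu) x - d_t <= 0
          np.hstack([-(r - mu), -np.eye(T)]),        # -(r_t - mu) x - d_t <= 0
          np.hstack([-mu, np.zeros(T)])[None, :],    # -mu x <= -target
      ])
      b_ub = np.zeros(2 * T + 1)
      b_ub[-1] = -target
      A_eq = np.hstack([np.ones(n), np.zeros(T)])[None, :]
      res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                    bounds=[(0, None)] * (n + T))
      print("weights:", res.x[:n].round(3), " MAD:", round(res.fun, 4))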

  6. Cost-effectiveness of an exercise program during pregnancy to prevent gestational diabetes: results of an economic evaluation alongside a randomised controlled trial.

    PubMed

    Oostdam, Nicolette; Bosmans, Judith; Wouters, Maurice G A J; Eekhoff, Elisabeth M W; van Mechelen, Willem; van Poppel, Mireille N M

    2012-07-04

    The prevalence of gestational diabetes mellitus (GDM) is increasing worldwide. GDM and the risks associated with GDM lead to increased health care costs and losses in productivity. The objective of this study is to evaluate whether the FitFor2 exercise program during pregnancy is cost-effective from a societal perspective as compared to standard care. A randomised controlled trial (RCT) and simultaneous economic evaluation of the FitFor2 program were conducted. Pregnant women at risk for GDM were randomised to an exercise program to prevent high maternal blood glucose (n = 62) or to standard care (n = 59). The exercise program consisted of two sessions of aerobic and strengthening exercises per week. Clinical outcome measures were maternal fasting blood glucose levels, insulin sensitivity and infant birth weight. Quality of life was measured using the EuroQol 5-D and quality-adjusted life-years (QALYs) were calculated. Resource utilization and sick leave data were collected by questionnaires. Data were analysed according to the intention-to-treat principle. Missing data were imputed using multiple imputation. Bootstrapping techniques were used to estimate the uncertainty surrounding the cost differences and incremental cost-effectiveness ratios. There were no statistically significant differences in any outcome measure. During pregnancy, differences in total health care costs and costs of productivity losses were statistically non-significant (mean difference €1308; 95%CI €-229 - €3204). The cost-effectiveness analyses showed that the exercise program was not cost-effective in comparison to the control group for blood glucose levels, insulin sensitivity, infant birth weight or QALYs. The twice-weekly exercise program for pregnant women at risk for GDM evaluated in the present study was not cost-effective compared to standard care. Based on these results, implementation of this exercise program for the prevention of GDM cannot be recommended. NTR1139.
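    The bootstrap step reads roughly as follows in code; the cost data are synthetic and only illustrate how a percentile interval around a mean cost difference is obtained, not the trial's actual analysis.

      import numpy as np

      rng = np.random.default_rng(0)
      costs_exercise = rng.gamma(shape=2.0, scale=2500.0, size=62)  # invented
      costs_control = rng.gamma(shape=2.0, scale=1900.0, size=59)   # invented

      boot_diffs = np.empty(5000)
      for b in range(5000):
          e = rng.choice(costs_exercise, size=costs_exercise.size, replace=True)
          c = rng.choice(costs_control, size=costs_control.size, replace=True)
          boot_diffs[b] = e.mean() - c.mean()

      lo, hi = np.percentile(boot_diffs, [2.5, 97.5])
      print(f"mean cost difference: {boot_diffs.mean():.0f} EUR "
            f"(95% CI {lo:.0f} to {hi:.0f})")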

  7. Quantifying the Sources and Sinks of Greenhouse Gases: What Does It Take to Satisfy Scientific and Decision-Making Needs?

    NASA Astrophysics Data System (ADS)

    Davis, K. J.; Keller, K.; Ogle, S. M.; Smith, S.

    2014-12-01

    Changes in the sources and sinks of greenhouse gases (GHGs) are key drivers of anthropogenic climate change. It is hence not surprising that current and emerging U.S. governmental science priorities and programs focused on climate change (e.g., the U.S. Carbon Cycle Science Plan, the U.S. Carbon Cycle Science Program, the U.S. Global Change Research Program, and Executive Order 13653 'Preparing the U.S. for the Impacts of Climate Change') all call for an improved understanding of these sources and sinks. Measurements of the total atmospheric burden of these gases are well established, but measurements of their sources and sinks are difficult to make over spatial and temporal scales that are relevant for scientific and decision-making needs. Quantifying the uncertainty in these measurements is particularly challenging. This talk reviews the intersection of the state of knowledge of GHG sources and sinks, focusing in particular on CO2 and CH4, and science and decision-making needs for this information. Different science and decision-making needs require differing levels of uncertainty. A number of high-priority needs (early detection of changes in the Earth system, projections of future climate, support of markets or regulations) often require a high degree of accuracy and/or precision. We will critically evaluate current U.S. planning documents to infer current perceived needs for GHG source/sink quantification, attempting to translate these needs into quantitative uncertainty metrics. We will compare these perceived needs with the current state of the art of GHG source/sink quantification, including the apparent pattern of systematic differences between so-called "top-down" and "bottom-up" flux estimates. This comparison will enable us to identify where needs can be readily satisfied, and where gaps in technology exist. Finally, we will examine what steps could be taken to close existing gaps.

  8. Determining the nuclear data uncertainty on MONK10 and WIMS10 criticality calculations

    NASA Astrophysics Data System (ADS)

    Ware, Tim; Dobson, Geoff; Hanlon, David; Hiles, Richard; Mason, Robert; Perry, Ray

    2017-09-01

    The ANSWERS Software Service is developing a number of techniques to better understand and quantify uncertainty on calculations of the neutron multiplication factor, k-effective, in nuclear fuel and other systems containing fissile material. The uncertainty on the calculated k-effective arises from a number of sources, including nuclear data uncertainties, manufacturing tolerances, modelling approximations and, for Monte Carlo simulation, stochastic uncertainty. For determining the uncertainties due to nuclear data, a set of application libraries have been generated for use with the MONK10 Monte Carlo and the WIMS10 deterministic criticality and reactor physics codes. This paper overviews the generation of these nuclear data libraries by Latin hypercube sampling of JEFF-3.1.2 evaluated data based upon a library of covariance data taken from JEFF, ENDF/B, JENDL and TENDL evaluations. Criticality calculations have been performed with MONK10 and WIMS10 using these sampled libraries for a number of benchmark models of fissile systems. Results are presented which show the uncertainty on k-effective for these systems arising from the uncertainty on the input nuclear data.
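    A highly simplified sketch of the sampling step follows: Latin hypercube scores are mapped through a toy (invented) covariance matrix to correlated relative perturbations of a few nuclear-data parameters, and a mock k-effective response stands in for an actual MONK10 or WIMS10 run.

      import numpy as np
      from scipy.stats import qmc, norm

      n_samples, n_params = 200, 3

      # Toy relative covariance for three cross-section parameters (invented)
      cov = np.array([[4.0, 1.0, 0.5],
                      [1.0, 9.0, 0.8],
                      [0.5, 0.8, 2.0]]) * 1e-6
      L = np.linalg.cholesky(cov)

      u = qmc.LatinHypercube(d=n_params, seed=1).random(n_samples)
      z = norm.ppf(u)            # standard-normal LHS scores
      perturbations = z @ L.T    # correlated relative perturbations

      def mock_keff(d):
          # Stand-in for a criticality calculation with a perturbed library.
          return 1.0000 + 0.9 * d[0] + 0.4 * d[1] - 0.2 * d[2]

      keff = np.array([mock_keff(d) for d in perturbations])
      print(f"k-eff mean = {keff.mean():.5f}, nuclear-data std = {keff.std():.5f}")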

  9. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.

  10. A generalized interval fuzzy mixed integer programming model for a multimodal transportation problem under uncertainty

    NASA Astrophysics Data System (ADS)

    Tian, Wenli; Cao, Chengxuan

    2017-03-01

    A generalized interval fuzzy mixed integer programming model is proposed for the multimodal freight transportation problem under uncertainty, in which the optimal mode of transport and the optimal amount of each type of freight transported through each path need to be decided. For practical purposes, three mathematical methods, i.e. the interval ranking method, fuzzy linear programming method and linear weighted summation method, are applied to obtain equivalents of constraints and parameters, and then a fuzzy expected value model is presented. A heuristic algorithm based on a greedy criterion and the linear relaxation algorithm are designed to solve the model.

  11. A comparative experimental evaluation of uncertainty estimation methods for two-component PIV

    NASA Astrophysics Data System (ADS)

    Boomsma, Aaron; Bhattacharya, Sayantan; Troolin, Dan; Pothos, Stamatios; Vlachos, Pavlos

    2016-09-01

    Uncertainty quantification in planar particle image velocimetry (PIV) measurement is critical for proper assessment of the quality and significance of reported results. New uncertainty estimation methods have recently been introduced, generating interest in their applicability and utility. The present study compares and contrasts current methods, across two separate experiments and three software packages, in order to provide a diversified assessment of the methods. We evaluated the performance of four uncertainty estimation methods: primary peak ratio (PPR), mutual information (MI), image matching (IM) and correlation statistics (CS). The PPR method was implemented and tested in two processing codes, using in-house open source PIV processing software (PRANA, Purdue University) and Insight4G (TSI, Inc.). The MI method was evaluated in PRANA, as was the IM method. The CS method was evaluated using DaVis (LaVision, GmbH). Utilizing two PIV systems for high- and low-resolution measurements and a laser Doppler velocimetry (LDV) system, data were acquired in a total of three cases: a jet flow and a cylinder in cross flow at two Reynolds numbers. LDV measurements were used to establish a point validation against which the high-resolution PIV measurements were validated. Subsequently, the high-resolution PIV measurements were used as a reference against which the low-resolution PIV data were assessed for error and uncertainty. We compared error and uncertainty distributions, spatially varying RMS error and RMS uncertainty, and standard uncertainty coverages. We observed that, qualitatively, each method responded to spatially varying error (i.e. higher error regions resulted in higher uncertainty predictions in that region). However, the PPR and MI methods demonstrated reduced uncertainty dynamic range response. In contrast, the IM and CS methods showed better response, but under-predicted the uncertainty ranges. The standard coverages (68% confidence interval) ranged from approximately 65%-77% for the PPR and MI methods, 40%-50% for IM and near 50% for CS. These observations illustrate some of the strengths and weaknesses of the methods considered herein and identify future directions for development and improvement.
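    The coverage metric used in the comparison is easy to state in code. The sketch below uses synthetic errors and two hypothetical uncertainty estimators (one roughly calibrated, one deliberately under-predicting, loosely mimicking the behaviors reported above) to show how the 68% standard coverage is computed.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 20_000
      true_sigma = 0.1 + 0.4 * rng.random(n)   # spatially varying error scale
      errors = rng.normal(0.0, true_sigma)     # actual velocity errors (px)

      # Hypothetical estimator responses, for illustration only
      u_good = (true_sigma * rng.normal(1.0, 0.1, n)).clip(0.05)
      u_under = 0.7 * u_good                   # systematic under-prediction

      for name, u in [("well-calibrated", u_good), ("under-predicting", u_under)]:
          coverage = np.mean(np.abs(errors) <= u)
          print(f"{name}: standard coverage = {100 * coverage:.1f}%")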

  12. COMMUNICATING THE PARAMETER UNCERTAINTY IN THE IQWIG EFFICIENCY FRONTIER TO DECISION-MAKERS

    PubMed Central

    Stollenwerk, Björn; Lhachimi, Stefan K; Briggs, Andrew; Fenwick, Elisabeth; Caro, Jaime J; Siebert, Uwe; Danner, Marion; Gerber-Grote, Andreas

    2015-01-01

    The Institute for Quality and Efficiency in Health Care (IQWiG) developed—in a consultation process with an international expert panel—the efficiency frontier (EF) approach to satisfy a range of legal requirements for economic evaluation in Germany's statutory health insurance system. The EF approach is distinctly different from other health economic approaches. Here, we evaluate established tools for assessing and communicating parameter uncertainty in terms of their applicability to the EF approach. Among these are tools that perform the following: (i) graphically display overall uncertainty within the IQWiG EF (scatter plots, confidence bands, and contour plots) and (ii) communicate the uncertainty around the reimbursable price. We found that, within the EF approach, most established plots were not always easy to interpret. Hence, we propose the use of price reimbursement acceptability curves—a modification of the well-known cost-effectiveness acceptability curves. Furthermore, it emerges that the net monetary benefit allows an intuitive interpretation of parameter uncertainty within the EF approach. This research closes a gap for handling uncertainty in the economic evaluation approach of the IQWiG methods when using the EF. However, the precise consequences of uncertainty when determining prices are yet to be defined. © 2014 The Authors. Health Economics published by John Wiley & Sons Ltd. PMID:24590819

  13. Light Water Reactor Sustainability Program, U.S. Efforts in Support of Examinations at Fukushima Daiichi-2017 Evaluations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, Mitchell T.

    Although the accident signatures from each unit at the Fukushima Daiichi Nuclear Power Station (NPS) [Daiichi] differ, much is not known about the end-state of core materials within these units. Some of this uncertainty can be attributed to a lack of information related to cooling system operation and cooling water injection. There is also uncertainty in our understanding of phenomena affecting: a) in-vessel core damage progression during severe accidents in boiling water reactors (BWRs), and b) accident progression after vessel failure (ex-vessel progression) for BWRs and Pressurized Water Reactors (PWRs). These uncertainties arise due to limited full-scale prototypic data. Similar to what occurred after the accident at Three Mile Island Unit 2, these Daiichi units offer the international community a means to reduce such uncertainties by obtaining prototypic data from multiple full-scale BWR severe accidents. Information obtained from Daiichi is required to inform Decontamination and Decommissioning (D&D) activities, improving the ability of the Tokyo Electric Power Company Holdings, Incorporated (TEPCO Holdings) to characterize potential hazards and to ensure the safety of workers involved with cleanup activities. This document, which has been updated to include FY2017 information, summarizes results from U.S. efforts to use information obtained by TEPCO Holdings to enhance the safety of existing and future nuclear power plant designs. This effort, which was initiated in 2014 by the Reactor Safety Technologies Pathway of the Department of Energy Office of Nuclear Energy Light Water Reactor (LWR) Sustainability Program, consists of a group of U.S. experts in LWR safety and plant operations who have identified examination needs and are evaluating TEPCO Holdings information from Daiichi that addresses these needs. Each year, annual reports include examples demonstrating that significant safety insights are being obtained in the areas of component performance, fission product release and transport, debris end-state location, and combustible gas generation and transport. In addition to reducing uncertainties related to severe accident progression modeling, these insights are being used to update guidance for severe accident prevention, mitigation, and emergency planning. Furthermore, reduced uncertainties in modeling the events at Daiichi will improve the realism of reactor safety evaluations and inform future D&D activities by improving the capability for characterizing potential hazards to workers involved with cleanup activities. Highlights in this FY2017 report include new insights with respect to the forces required to produce the observed Daiichi Unit 1 (1F1) shield plug end-state, the observed leakage from 1F1 components, and the amount of combustible gas generation required to produce the observed explosions in Daiichi Units 3 and 4 (1F3 and 1F4). This report contains an appendix with a list of examination needs that was updated after U.S. experts reviewed recently obtained information from examinations at Daiichi. Additional details for higher-priority, near-term examination activities are also provided. This report also includes an appendix describing an updated website that has been reformatted to better assist U.S. experts by providing information in an archived, retrievable location, as well as an appendix summarizing U.S. forensics activities to host the TMI-2 Knowledge Transfer and Relevance to Fukushima Meeting that was held in Idaho Falls, ID, on October 10-14, 2016.

  14. Competitive Advantage, Uncertainty, and Weapons Procurement: Striking Balance for the Future

    DTIC Science & Technology

    2009-05-02

    The views expressed in this paper do not represent the official position of the Department of the Army, Department of Defense, or the U.S. Government. The paper presents an analysis of how effective the recent overhaul of weapons procurement has been at striking the right balance of investments leading to a sustained competitive advantage.

  15. A robust multi-objective global supplier selection model under currency fluctuation and price discount

    NASA Astrophysics Data System (ADS)

    Zarindast, Atousa; Seyed Hosseini, Seyed Mohamad; Pishvaee, Mir Saman

    2017-06-01

    A robust supplier selection problem is proposed in a scenario-based approach, where demand and exchange rates are subject to uncertainty. First, a deterministic multi-objective mixed integer linear program is developed; then, the robust counterpart of the proposed mixed integer linear program is presented using recent extensions in robust optimization theory. The decision variables are treated, respectively, by a two-stage stochastic planning model, a robust stochastic optimization planning model that integrates the worst-case scenario into the modeling approach, and an equivalent deterministic planning model. An experimental study is carried out to compare the performance of the three models. The robust model resulted in remarkable cost savings, illustrating that to cope with such uncertainties we should account for them in advance in our planning. In the case study, different suppliers were selected because of these uncertainties, and since supplier selection is a strategic decision, it is crucial to consider these uncertainties in the planning approach.

  16. Uncertainties of optical parameters and their propagations in an analytical ocean color inversion algorithm.

    PubMed

    Lee, ZhongPing; Arnone, Robert; Hu, Chuanmin; Werdell, P Jeremy; Lubac, Bertrand

    2010-01-20

    Following the theory of error propagation, we developed analytical functions to illustrate and evaluate the uncertainties of inherent optical properties (IOPs) derived by the quasi-analytical algorithm (QAA). In particular, we evaluated the effects of uncertainties of these optical parameters on the inverted IOPs: the absorption coefficient at the reference wavelength, the extrapolation of particle backscattering coefficient, and the spectral ratios of absorption coefficients of phytoplankton and detritus/gelbstoff, respectively. With a systematically simulated data set (46,200 points), we found that the relative uncertainty of QAA-derived total absorption coefficients in the blue-green wavelengths is generally within +/-10% for oceanic waters. The results of this study not only establish theoretical bases to evaluate and understand the effects of the various variables on IOPs derived from remote-sensing reflectance, but also lay the groundwork to analytically estimate uncertainties of these IOPs for each pixel. These are required and important steps for the generation of quality maps of IOP products derived from satellite ocean color remote sensing.
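    The underlying machinery is first-order (Taylor) error propagation: for y = f(x1, x2) with independent inputs, var(y) is approximately the sum of (∂f/∂xi)² var(xi). The sketch below applies this to a generic stand-in function, not the actual QAA equations, using numerical partial derivatives.

      import numpy as np

      def f(a_ref, slope):
          # Stand-in for one IOP retrieval step (invented form).
          return a_ref * np.exp(-0.5 * slope)

      x = np.array([0.05, 1.0])    # nominal input values
      sx = np.array([0.005, 0.1])  # input standard uncertainties

      # Numerical partial derivatives (central differences)
      grad = np.empty_like(x)
      for i in range(x.size):
          dx = np.zeros_like(x)
          dx[i] = 1e-6 * max(abs(x[i]), 1.0)
          grad[i] = (f(*(x + dx)) - f(*(x - dx))) / (2 * dx[i])

      y = f(*x)
      sy = np.sqrt(np.sum((grad * sx) ** 2))
      print(f"y = {y:.4f} +/- {sy:.4f}  (relative: {100 * sy / y:.1f}%)")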

  17. Monte-Carlo-based uncertainty propagation with hierarchical models—a case study in dynamic torque

    NASA Astrophysics Data System (ADS)

    Klaus, Leonard; Eichstädt, Sascha

    2018-04-01

    For a dynamic calibration, a torque transducer is described by a mechanical model, and the corresponding model parameters are to be identified from measurement data. A measuring device for the primary calibration of dynamic torque, and a corresponding model-based calibration approach, have recently been developed at PTB. The complete mechanical model of the calibration set-up is very complex, and involves several calibration steps—making a straightforward implementation of a Monte Carlo uncertainty evaluation tedious. With this in mind, we here propose to separate the complete model into sub-models, with each sub-model being treated with individual experiments and analysis. The uncertainty evaluation for the overall model then has to combine the information from the sub-models in line with Supplement 2 of the Guide to the Expression of Uncertainty in Measurement. In this contribution, we demonstrate how to carry this out using the Monte Carlo method. The uncertainty evaluation involves various input quantities of different origin and the solution of a numerical optimisation problem.
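    A skeletal version of the Monte Carlo combination of sub-models might look as follows; the sub-model outputs, their distributions, and the closing formula (a natural frequency from stiffness and inertia) are invented for illustration and are not PTB's actual calibration model.

      import numpy as np

      rng = np.random.default_rng(3)
      N = 100_000

      # Sub-model A: torsional stiffness identified from one experiment
      k = rng.normal(1.20e3, 8.0, N)     # N*m/rad (invented mean/uncertainty)
      # Sub-model B: moment of inertia identified from a separate experiment
      J = rng.normal(2.5e-3, 4.0e-5, N)  # kg*m^2 (invented)

      # Overall model output: undamped natural frequency of the transducer
      f0 = np.sqrt(k / J) / (2.0 * np.pi)

      lo, hi = np.percentile(f0, [2.5, 97.5])
      print(f"f0 = {f0.mean():.1f} Hz, u = {f0.std(ddof=1):.1f} Hz, "
            f"95% interval [{lo:.1f}, {hi:.1f}] Hz")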

  18. A tool for efficient, model-independent management optimization under uncertainty

    USGS Publications Warehouse

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
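    The chance-constraint mechanics can be sketched in a few lines: the FOSM standard deviation of a model-simulated constraint, combined with a user risk level, shifts the deterministic bound by a z-score margin. The numbers below are invented, and PESTPP-OPT itself is driven through PEST/PEST++ control files rather than a Python API.

      from scipy.stats import norm

      def chance_constrained_bound(det_bound, sigma_fosm, risk):
          # Tighten a 'greater-than' bound for reliability (1 - risk).
          return det_bound + norm.ppf(1.0 - risk) * sigma_fosm

      flow_bound = 1.0   # m^3/s, deterministic minimum-streamflow constraint
      sigma = 0.15       # m^3/s, FOSM estimate of constraint uncertainty
      for risk in (0.5, 0.25, 0.05):
          b = chance_constrained_bound(flow_bound, sigma, risk)
          print(f"risk={risk:.2f}: modeled flow must stay >= {b:.3f} m^3/s")

    At risk = 0.5 the margin vanishes and the deterministic bound is recovered; smaller risk values push the optimal solution further into the feasible region, which is how uncertainty enters the "single answer".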

  19. Conservative strategy-based ensemble surrogate model for optimal groundwater remediation design at DNAPLs-contaminated sites

    NASA Astrophysics Data System (ADS)

    Ouyang, Qi; Lu, Wenxi; Lin, Jin; Deng, Wenbing; Cheng, Weiguo

    2017-08-01

    The surrogate-based simulation-optimization techniques are frequently used for optimal groundwater remediation design. When this technique is used, surrogate errors caused by surrogate-modeling uncertainty may lead to generation of infeasible designs. In this paper, a conservative strategy that pushes the optimal design into the feasible region was used to address surrogate-modeling uncertainty. In addition, chance-constrained programming (CCP) was adopted to compare with the conservative strategy in addressing this uncertainty. Three methods, multi-gene genetic programming (MGGP), Kriging (KRG) and support vector regression (SVR), were used to construct surrogate models for a time-consuming multi-phase flow model. To improve the performance of the surrogate model, ensemble surrogates were constructed based on combinations of different stand-alone surrogate models. The results show that: (1) the surrogate-modeling uncertainty was successfully addressed by the conservative strategy, which means that this method is promising for addressing surrogate-modeling uncertainty. (2) The ensemble surrogate model that combines MGGP with KRG showed the most favorable performance, which indicates that this ensemble surrogate can utilize both stand-alone surrogate models to improve the performance of the surrogate model.
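    One common recipe for such an ensemble (not necessarily the exact weighting used in the paper) weights each stand-alone surrogate by the inverse of its cross-validation error, as in this sketch with synthetic data, where a Gaussian process stands in for kriging.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVR

      rng = np.random.default_rng(2)
      X = rng.uniform(-1, 1, size=(80, 2))   # candidate remediation designs
      y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + rng.normal(0, 0.05, 80)

      models = {"KRG": GaussianProcessRegressor(), "SVR": SVR(C=10.0)}
      mse = {name: -cross_val_score(m, X, y, cv=5,
                                    scoring="neg_mean_squared_error").mean()
             for name, m in models.items()}
      total = sum(1.0 / e for e in mse.values())
      w = {name: (1.0 / e) / total for name, e in mse.items()}

      for m in models.values():
          m.fit(X, y)
      x_new = np.array([[0.3, -0.4]])
      pred = sum(w[name] * models[name].predict(x_new)[0] for name in models)
      print("weights:", {k: round(v, 3) for k, v in w.items()},
            " ensemble prediction:", round(pred, 4))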

  20. Isotherm Sensor Calibration Program for Mars Science Laboratory Heat Shield Flight Data Analysis

    NASA Technical Reports Server (NTRS)

    Santos, Jose A.; Oishi, Tomo; Martinez, Ed R.

    2011-01-01

    Seven instrumented sensor plugs were installed on the Mars Science Laboratory heat shield in December 2008 as part of the Mars Science Laboratory Entry, Descent, and Landing Instrumentation (MEDLI) project. These sensor plugs contain four in-depth thermocouples and one Hollow aErothermal Ablation and Temperature (HEAT) sensor. The HEAT sensor follows the time progression of a 700 °C isotherm through the thickness of a thermal protection system (TPS) material. The data can be used to infer char depth and, when analyzed in conjunction with the thermocouple data, the thermal gradient through the TPS material can also be determined. However, the uncertainty on the isotherm value is not well defined. To address this uncertainty, a team at NASA Ames Research Center is carrying out a HEAT sensor calibration test program. The scope of this test program is described, and initial results from experiments conducted in the laboratory to study the isotherm temperature of the HEAT sensor are presented. Data from the laboratory tests indicate an isotherm temperature of 720 °C ± 60 °C. An overview of near-term arc jet testing is also given, including preliminary data from 30.48 cm × 30.48 cm PICA panels instrumented with two MEDLI sensor plugs and tested in the NASA Ames Panel Test Facility. Forward work includes analysis of the arc jet test data, including an evaluation of the isotherm value based on the instant in time when it reaches a thermocouple depth.

  1. Post-introduction economic evaluation of pneumococcal conjugate vaccination in Ecuador, Honduras, and Paraguay.

    PubMed

    Constenla, Dagna O

    2015-11-01

    A decision-analytic model was constructed to evaluate the economic impact of post-introduction pneumococcal conjugate vaccine (PCV) programs in Ecuador, Honduras, and Paraguay from the societal perspective. Hypothetical birth cohorts were followed for a 20-year period in each country. Estimates of disease burden, vaccine effectiveness, and health care costs were derived from primary and secondary data sources. Costs were expressed in 2014 US$. Sensitivity analyses were performed to assess the impact of model input uncertainties. Over the 20 years of vaccine program implementation, the health care costs per case ranged from US$ 764 854 to more than US$ 1 million. Vaccination prevented more than 50% of pneumococcal cases and deaths per country. At a cost of US$ 16 per dose, the cost per disability-adjusted life year (DALY) averted for the 10-valent PCV (PCV10) and the 13-valent PCV (PCV13) ranged from US$ 796 (Honduras) to US$ 1 340 (Ecuador) and from US$ 691 (Honduras) to US$ 1 166 (Ecuador) respectively. At a reduced price (US$ 7 per dose), the cost per DALY averted ranged from US$ 327 (Honduras) to US$ 528 (Ecuador) and from US$ 281 (Honduras) to US$ 456 (Ecuador) for PCV10 and PCV13 respectively. Several model parameters influenced the results of the analysis, including vaccine price, vaccine efficacy, disease incidence, and costs. The economic impact of post-introduction PCV needs to be assessed in a context of uncertainty regarding changing antibiotic resistance, herd and serotype replacement effects, differential vaccine prices, and government budget constraints.

  2. Proficiency testing of Hb A1c: a 4-year experience in Taiwan and the Asian Pacific region.

    PubMed

    Shiesh, Shu-Chu; Wiedmeyer, Hsiao-Mei; Kao, Jau-Tsuen; Vasikaran, Samuel D; Lopez, Joseph B

    2009-10-01

    The correlation between hemoglobin A(1c) (Hb A(1c)) and risk for complications in diabetic patients heightens the need to measure Hb A(1c) with accuracy. We evaluated the current performance of Hb A(1c) measurement in the Asian and Pacific region by examining data submitted by laboratories participating in the Taiwan proficiency-testing program. Five fresh-pooled blood samples were sent to participating laboratories twice each year. The results were evaluated against target values assigned by the National Glycohemoglobin Standardization Program network laboratories; a passing criterion of +/-7% of the target value was used. Measurement uncertainty at Hb A(1c) concentrations of 7.0% and 8.0% was determined. A total of 276 laboratories from 11 countries took part in the Hb A(1c) survey. At the Hb A(1c) concentrations tested, method-specific interlaboratory imprecision (CVs) was 1.1%-13.9% in 2005, 1.3%-10.1% in 2006, 1.2%-8.2% in 2007, and 1.1%-6.1% in 2008. Differences between target values and median values from the commonly used methods ranged from -0.24% to 0.22% Hb A(1c) in 2008. In 2005, 83% of laboratories passed the survey, and in 2008, 93% passed. At 7.0% Hb A(1c), measurement uncertainty was on average 0.49% Hb A(1c). The use of accuracy-based proficiency testing with stringent quality criteria has improved the performance of Hb A(1c) testing in Asian and Pacific laboratories during the 4 years of assessment.

  3. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, Aron M; Sengupta, Manajit

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  4. Model parameter uncertainty analysis for an annual field-scale P loss model

    NASA Astrophysics Data System (ADS)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations with a model. Such insight can then be used to guide future data collection and model development and evaluation efforts.
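    For a single regression equation, the distinction between parameter uncertainty (a confidence interval on the mean response) and total predictive uncertainty (a prediction interval that also carries residual scatter) can be sketched as below with synthetic data; APLE's actual equations and data are not reproduced.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      x = rng.uniform(0, 10, 40)                  # predictor (invented)
      y = 1.5 + 0.8 * x + rng.normal(0, 1.0, 40)  # response (invented)

      n = x.size
      X = np.column_stack([np.ones(n), x])
      beta, res_ss = np.linalg.lstsq(X, y, rcond=None)[:2]
      dof = n - 2
      s2 = res_ss[0] / dof                         # residual variance
      cov_beta = s2 * np.linalg.inv(X.T @ X)       # parameter covariance

      t = stats.t.ppf(0.975, dof)
      x0 = np.array([1.0, 5.0])                    # prediction at x = 5
      y0 = x0 @ beta
      se_mean = np.sqrt(x0 @ cov_beta @ x0)        # mean-response (confidence)
      se_pred = np.sqrt(s2 + x0 @ cov_beta @ x0)   # new-observation (prediction)
      print(f"fit: intercept={beta[0]:.2f}, slope={beta[1]:.2f}")
      print(f"95% CI of mean at x=5: {y0:.2f} +/- {t * se_mean:.2f}")
      print(f"95% prediction interval: {y0:.2f} +/- {t * se_pred:.2f}")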

  5. Uncertainty indication in soil function maps - transparent and easy-to-use information to support sustainable use of soil resources

    NASA Astrophysics Data System (ADS)

    Greiner, Lucie; Nussbaum, Madlene; Papritz, Andreas; Zimmermann, Stephan; Gubler, Andreas; Grêt-Regamey, Adrienne; Keller, Armin

    2018-05-01

    Spatial information on soil function fulfillment (SFF) is increasingly being used to inform decision-making in spatial planning programs to support sustainable use of soil resources. Soil function maps visualize soils' abilities to fulfill their functions, e.g., regulating water and nutrient flows, providing habitats, and supporting biomass production based on soil properties. Such information must be reliable for informed and transparent decision-making in spatial planning programs. In this study, we add to the transparency of soil function maps by (1) indicating uncertainties arising from the prediction of soil properties generated by digital soil mapping (DSM) that are used for soil function assessment (SFA) and (2) showing the response of different SFA methods to the propagation of uncertainties through the assessment. For a study area of 170 km2 in the Swiss Plateau, we map 10 static soil sub-functions for agricultural soils at a spatial resolution of 20 × 20 m together with their uncertainties. Mapping the 10 soil sub-functions using simple ordinal assessment scales reveals pronounced spatial patterns with a high variability of SFF scores across the region, linked to the inherent properties of the soils, terrain attributes and climate conditions. Uncertainties in soil properties propagated through SFA methods generally lead to substantial uncertainty in the mapped soil sub-functions. We propose two types of uncertainty maps that can be readily understood by stakeholders. Cumulative distribution functions of SFF scores indicate that SFA methods respond differently to the propagated uncertainty of soil properties. Even where methods are comparable on the level of complexity and assessment scale, their comparability in view of uncertainty propagation might be different. We conclude that comparable uncertainty indications in soil function maps are relevant to enable informed and transparent decisions on the sustainable use of soil resources.

  6. Stochastic Robust Mathematical Programming Model for Power System Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cong; Changhyeok, Lee; Haoyong, Chen

    2016-01-01

    This paper presents a stochastic robust framework for two-stage power system optimization problems with uncertainty. The model optimizes the probabilistic expectation of different worst-case scenarios with different uncertainty sets. A case study of unit commitment shows the effectiveness of the proposed model and algorithms.

  7. Assessing Uncertainties in Gridded Emissions: A Case Study for Fossil Fuel Carbon Dioxide (FFCO2) Emission Data

    NASA Technical Reports Server (NTRS)

    Oda, T.; Ott, L.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M.; Baker, D. F.; Pawson, S.

    2017-01-01

    Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds with similar emission characteristics.

  8. Assessing uncertainties in gridded emissions: A case study for fossil fuel carbon dioxide (FFCO2) emission data

    NASA Astrophysics Data System (ADS)

    Oda, T.; Ott, L. E.; Lauvaux, T.; Feng, S.; Bun, R.; Roman, M. O.; Baker, D. F.; Pawson, S.

    2017-12-01

    Fossil fuel carbon dioxide (CO2) emissions (FFCO2) are the largest input to the global carbon cycle on a decadal time scale. Because total emissions are assumed to be reasonably well constrained by fuel statistics, FFCO2 often serves as a reference in order to deduce carbon uptake by poorly understood terrestrial and ocean sinks. Conventional atmospheric CO2 flux inversions solve for spatially explicit regional sources and sinks and estimate land and ocean fluxes by subtracting FFCO2. Thus, errors in FFCO2 can propagate into the final inferred flux estimates. Gridded emissions are often based on disaggregation of emissions estimated at national or regional level. Although national and regional total FFCO2 are well known, gridded emission fields are subject to additional uncertainties due to the emission disaggregation. Assessing such uncertainties is often challenging because of the lack of physical measurements for evaluation. We first review difficulties in assessing uncertainties associated with gridded FFCO2 emission data and present several approaches for evaluation of such uncertainties at multiple scales. Given known limitations, inter-emission data differences are often used as a proxy for the uncertainty. The popular approach allows us to characterize differences in emissions, but does not allow us to fully quantify emission disaggregation biases. Our work aims to vicariously evaluate FFCO2 emission data using atmospheric models and measurements. We show a global simulation experiment where uncertainty estimates are propagated as an atmospheric tracer (uncertainty tracer) alongside CO2 in NASA's GEOS model and discuss implications of FFCO2 uncertainties in the context of flux inversions. We also demonstrate the use of high resolution urban CO2 simulations as a tool for objectively evaluating FFCO2 data over intense emission regions. Though this study focuses on FFCO2 emission data, the outcome of this study could also help improve the knowledge of similar gridded emissions data for non-CO2 compounds that share emission sectors.

  9. Effect of Uncertainty on Deterministic Runway Scheduling

    NASA Technical Reports Server (NTRS)

    Gupta, Gautam; Malik, Waqar; Jung, Yoon C.

    2012-01-01

    Active runway scheduling involves scheduling departures for takeoffs and arrivals for runway crossings subject to numerous constraints. This paper evaluates the effect of uncertainty on a deterministic runway scheduler. The evaluation is done against a first-come-first-serve (FCFS) scheme. In particular, the sequence from the deterministic scheduler is frozen and the times adjusted to satisfy all separation criteria; this approach is tested against FCFS. The comparison is done for both system performance (throughput and system delay) and predictability, and varying levels of congestion are considered. Uncertainty is modeled in two ways: as equal uncertainty in runway availability for all aircraft, and as increasing uncertainty for later aircraft. Results indicate that the deterministic approach consistently performs better than first-come-first-serve in both system performance and predictability.
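    The freeze-and-adjust evaluation can be sketched compactly: given any fixed sequence, each operation is pushed to the earliest time satisfying pairwise separation, and FCFS is compared with a hypothetical optimized sequence. The ready times and wake-separation values below are invented.

      def schedule_frozen(sequence, ready, sep):
          # Earliest feasible times for a frozen sequence under separation.
          times, prev = {}, None
          for ac in sequence:
              t = ready[ac]
              if prev is not None:
                  t = max(t, times[prev] + sep[(prev[0], ac[0])])
              times[ac] = t
              prev = ac
          return times

      # 'H' = heavy, 'S' = small; separation depends on the leader/follower pair
      ready = {"H1": 0, "S1": 5, "S2": 10, "H2": 15}          # seconds
      sep = {("H", "S"): 120, ("H", "H"): 90,
             ("S", "S"): 60, ("S", "H"): 60}

      fcfs = sorted(ready, key=ready.get)            # H1, S1, S2, H2
      optimized = ["S1", "S2", "H1", "H2"]           # e.g., from a scheduler

      for name, seq in [("FCFS", fcfs), ("frozen optimized", optimized)]:
          times = schedule_frozen(seq, ready, sep)
          delay = sum(times[a] - ready[a] for a in ready)
          print(f"{name}: last operation at {max(times.values())}s, "
                f"total delay {delay}s")

    In this toy instance the frozen optimized sequence reduces both makespan and total delay relative to FCFS; the paper's evaluation repeats this kind of comparison under perturbed (uncertain) ready times.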

  10. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall-event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, and iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decreasing the sampling uncertainty by increasing the number of sampled events.
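    Point (iv) amounts to a sample-size calculation. Under an i.i.d.-events normal approximation (an assumption, with invented numbers), the number of events needed for a target relative half-width of the mean follows directly:

      import math

      cv_event = 0.35             # relative std of TSS removal rate across events
      target_rel_halfwidth = 0.10
      z = 1.96                    # 95% confidence

      n = math.ceil((z * cv_event / target_rel_halfwidth) ** 2)
      print(f"rainfall events to sample: {n}")   # about 48 with these numbers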

  11. Evaluation of Uncertainty in Constituent Input Parameters for Modeling the Fate of RDX

    DTIC Science & Technology

    2015-07-01

    The exercise was to evaluate the importance of chemical-specific model input parameters, the impacts of their uncertainty, and the potential benefits of ... chemical-specific inputs for RDX that were determined to be sensitive with relatively high uncertainty; these included the soil-water linear ... Koc for organic chemicals. The EFS values provided for log Koc of RDX were 1.72 and 1.95. OBJECTIVE: TREECS™ (http://el.erdc.usace.army.mil/treecs

  12. Uncertainties in mapping forest carbon in urban ecosystems.

    PubMed

    Chen, Gang; Ozelkan, Emre; Singh, Kunwar K; Zhou, Jun; Brown, Marilyn R; Meentemeyer, Ross K

    2017-02-01

    Spatially explicit urban forest carbon estimation provides a baseline map for understanding the variation in forest vertical structure, informing sustainable forest management and urban planning. While high-resolution remote sensing has proven promising for carbon mapping in highly fragmented urban landscapes, data cost and availability are the major obstacle prohibiting accurate, consistent, and repeated measurement of forest carbon pools in cities. This study aims to evaluate the uncertainties of forest carbon estimation in response to the combined impacts of remote sensing data resolution and neighborhood spatial patterns in Charlotte, North Carolina. The remote sensing data for carbon mapping were resampled to a range of resolutions, i.e., LiDAR point cloud density - 5.8, 4.6, 2.3, and 1.2 pts/m²; aerial optical NAIP (National Agricultural Imagery Program) imagery - 1, 5, 10, and 20 m. Urban spatial patterns were extracted to represent area, shape complexity, dispersion/interspersion, diversity, and connectivity of landscape patches across the residential neighborhoods with built-up densities from low, medium-low, medium-high, to high. Through statistical analyses, we found that changing remote sensing data resolution introduced noticeable uncertainties (variation) in forest carbon estimation at the neighborhood level. Higher uncertainties were caused by the change of LiDAR point density (causing 8.7-11.0% of variation) than changing NAIP image resolution (causing 6.2-8.6% of variation). For both LiDAR and NAIP, urban neighborhoods with a higher degree of anthropogenic disturbance unveiled a higher level of uncertainty in carbon mapping. However, LiDAR-based results were more likely to be affected by landscape patch connectivity, and the NAIP-based estimation was found to be significantly influenced by the complexity of patch shape. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Comparison of Homogeneous and Heterogeneous CFD Fuel Models for Phase I of the IAEA CRP on HTR Uncertainties Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerhard Strydom; Su-Jong Yoon

    2014-04-01

    Computational Fluid Dynamics (CFD) evaluation of homogeneous and heterogeneous fuel models was performed as part of the Phase I calculations of the International Atomic Energy Agency (IAEA) Coordinated Research Program (CRP) on High Temperature Reactor (HTR) Uncertainties in Modeling (UAM). This study focused on the nominal localized stand-alone fuel thermal response, as defined in Ex. I-3 and I-4 of the HTR UAM. The aim of the stand-alone thermal unit-cell simulation is to isolate the effect of material and boundary input uncertainties on a very simplified problem, before these uncertainties are propagated in the subsequent coupled neutronics/thermal fluids phases of the benchmark. In many of the previous studies for high temperature gas cooled reactors, the volume-averaged homogeneous mixture model of a single fuel compact has been applied. In the homogeneous model, the Tristructural Isotropic (TRISO) fuel particles in the fuel compact are not modeled directly and an effective thermal conductivity is employed for the thermo-physical properties of the fuel compact. In contrast, in the heterogeneous model, the uranium carbide (UCO), inner and outer pyrolytic carbon (IPyC/OPyC) and silicon carbide (SiC) layers of the TRISO fuel particles are explicitly modeled, and the fuel compact is modeled as a heterogeneous mixture of TRISO fuel kernels embedded in H-451 matrix graphite. In this study, steady-state and transient CFD simulations were performed with both homogeneous and heterogeneous models to compare the thermal characteristics. The nominal values of the input parameters were used for this CFD analysis. In a future study, the effects of input uncertainties in the material properties and boundary parameters will be investigated and reported.
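
    For the homogeneous model, the effective thermal conductivity of a particle-loaded compact can be estimated with a standard homogenization formula. The sketch below uses the Maxwell-Eucken relation for dispersed spherical particles; the conductivities and packing fraction are illustrative assumptions, and the benchmark's own effective-conductivity correlation may differ.

```python
def maxwell_eucken(k_matrix, k_particle, phi):
    """Maxwell-Eucken estimate of the effective thermal conductivity of a
    compact with dispersed spherical particles (one common way to build a
    homogeneous-model conductivity; not necessarily the correlation used
    in the HTR UAM benchmark).

    k_matrix, k_particle in W/m-K; phi = particle volume fraction.
    """
    num = k_particle + 2 * k_matrix + 2 * phi * (k_particle - k_matrix)
    den = k_particle + 2 * k_matrix - phi * (k_particle - k_matrix)
    return k_matrix * num / den

# Illustrative values only: graphite matrix ~30 W/m-K, a lumped
# TRISO-particle conductivity ~4 W/m-K, ~30% packing fraction.
print(f"k_eff = {maxwell_eucken(30.0, 4.0, 0.30):.1f} W/m-K")
```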

  14. Neutronics qualification of the Jules Horowitz reactor fuel by interpretation of the VALMONT experimental program - Transposition of the uncertainties on the reactivity of JHR with JEF2.2 and JEFF3.1.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leray, O.; Hudelot, J. P.; Antony, M.

    2011-07-01

    The new European material testing Jules Horowitz Reactor (JHR), currently under construction at the Cadarache center (CEA, France), will use LEU (20% enrichment in ²³⁵U) fuels (U₃Si₂ for the start-up and UMoAl in the future) which are quite different from the industrial oxide fuel, for which an extensive neutronics qualification database has been established. The HORUS3D/N neutronics calculation scheme, used for the design and safety studies of the JHR, is being developed within the framework of a rigorous verification-validation-qualification methodology. In this framework, the experimental VALMONT (Validation of Aluminium Molybdenum uranium fuel for Neutronics) program has been performed in the MINERVE facility of CEA Cadarache (France), in order to qualify the capability of HORUS3D/N to accurately calculate the reactivity of the JHR reactor. The MINERVE facility, using the oscillation technique, provides accurate measurements of the reactivity effect of samples. The VALMONT program includes oscillations of samples of UAlₓ/Al and UMo/Al with enrichments ranging from 0.2% to 20% and uranium densities from 2.2 to 8 g/cm³. The geometry of the samples and the pitch of the experimental lattice ensure maximum representativeness of the neutron spectrum expected for the JHR. By comparing the effect of a sample with that of a known fuel specimen, the reactivity effect can be measured in absolute terms and compared to computational results. Special attention was paid to the rigorous determination and reduction of the experimental uncertainties. The calculational analysis of the VALMONT results was performed with the French deterministic code APOLLO2. A comparison of the impact of the different calculation methods, data libraries and energy meshes that were tested is presented. The interpretation of the VALMONT experimental program allowed the qualification of the JHR fuel UMoAl8 (with an enrichment of 19.75% ²³⁵U) by the MINERVE-dedicated interpretation tool PIMS. The study of energy meshes and evaluations favoured the JEFF3.1.1/SHEM scheme, which leads to a better calculation of the reactivity effect of the VALMONT samples. Then, in order to quantify the impact of the uncertainties linked to the basic nuclear data, their propagation from the cross section measurement to the final computational result was analysed in a rigorous way by using a nuclear data re-estimation method based on Gauss-Newton iterations. This study concludes that the prior uncertainties due to nuclear data (uranium, aluminium, beryllium and water) on the reactivity at the Beginning of Cycle (BOC) for the JHR core reach 1217 pcm at 2σ. The largest remaining uncertainty on the JHR reactivity is due to aluminium. (authors)

  15. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.

  16. A Retrospective Analysis of the Benefits and Impacts of U.S. Renewable Portfolio Standards

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiser, Ryan; Barbose, Galen; Heeter, Jenny

    This is the second in a series of reports exploring the costs, benefits, and other impacts of state renewable portfolio standards (RPS), both retrospectively and prospectively. This report focuses on the benefits and impacts of all state RPS programs, in aggregate, for the year 2013 (the most-recent year for which the requisite data were available). Relying on a well-vetted set of methods, the study evaluates a number of important benefits and impacts in both physical and monetary terms, where possible, and characterizes key uncertainties. The prior study in this series focused on historical RPS compliance costs, and future work will evaluate costs, benefits, and other impacts of RPS policies prospectively.

  17. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gervasio, Vivianaluxa; Vienna, John D.; Kim, Dong-Sang

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints (Vienna et al. 2016), and uncertainty descriptions on projected Hanford glass mass. The maximum allowable waste oxide loading (WOL) was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of IHLW glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase in estimated glass mass, to 24,531 MT. Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). ILAW mass was predicted to be 282,350 MT without uncertainty and with waste loading "line" rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase, to an estimated glass mass of 282,562 MT. Without application of the line rules, the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Addition of prediction uncertainties increases glass mass by 1.32 relative percent, and the addition of composition/process uncertainties increases glass mass by an additional 7.73 relative percent (9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.
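
    The reported relative increases can be verified directly from the tonnages:

\[
\frac{24{,}531 - 23{,}360}{23{,}360} \times 100 \approx 5.01\,\%,
\qquad
\frac{282{,}562 - 282{,}350}{282{,}350} \times 100 \approx 0.08\,\%.
\]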

  18. Sustainable Cost Models for mHealth at Scale: Modeling Program Data from m4RH Tanzania.

    PubMed

    Mangone, Emily R; Agarwal, Smisha; L'Engle, Kelly; Lasway, Christine; Zan, Trinity; van Beijma, Hajo; Orkis, Jennifer; Karam, Robert

    2016-01-01

    There is increasing evidence that mobile phone health interventions ("mHealth") can improve health behaviors and outcomes and are critically important in low-resource, low-access settings. However, the majority of mHealth programs in developing countries fail to reach scale. One reason may be the challenge of developing financially sustainable programs. The goal of this paper is to explore strategies for mHealth program sustainability and develop cost-recovery models for program implementers using 2014 operational program data from Mobile for Reproductive Health (m4RH), a national text-message (SMS) based health communication service in Tanzania. We delineated 2014 m4RH program costs and considered three strategies for cost-recovery for the m4RH program: user pay-for-service, SMS cost reduction, and strategic partnerships. These inputs were used to develop four different cost-recovery scenarios. The four scenarios leveraged strategic partnerships to reduce per-SMS program costs and create per-SMS program revenue and varied the structure for user financial contribution. Finally, we conducted break-even and uncertainty analyses to evaluate the costs and revenues of these models at the 2014 user volume (125,320) and at any possible break-even volume. In three of four scenarios, costs exceeded revenue by $94,596, $34,443, and $84,571 at the 2014 user volume. However, these costs represented large reductions (54%, 83%, and 58%, respectively) from the 2014 program cost of $203,475. Scenario four, in which the lowest per-SMS rate ($0.01 per SMS) was negotiated and users paid for all m4RH SMS sent or received, achieved a $5,660 profit at the 2014 user volume. A Monte Carlo uncertainty analysis demonstrated that break-even points were driven by user volume rather than variations in program costs. These results reveal that breaking even was only probable when all SMS costs were transferred to users and the lowest per-SMS cost was negotiated with telecom partners. While this strategy was sustainable for the implementer, a central concern is that health information may not reach those who are too poor to pay, limiting the program's reach and impact. Incorporating strategies presented here may make mHealth programs more appealing to funders and investors but need further consideration to balance sustainability, scale, and impact.
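
    The break-even logic described above is easy to reproduce in a small Monte Carlo sketch. Everything numeric below except the 2014 user volume (125,320) is a made-up assumption, not m4RH's actual cost structure; the point is only to show how the break-even volume is driven by the per-SMS margin.

```python
import random
import statistics

def breakeven_mc(n_trials=10_000, users_2014=125_320, sms_per_user=15,
                 price_per_sms=0.02, fixed_cost=20_000.0, seed=42):
    """Monte Carlo break-even sketch for an SMS health service.

    All figures are illustrative assumptions; only the 2014 user
    volume is taken from the text.
    """
    rng = random.Random(seed)
    breakeven_volumes = []
    for _ in range(n_trials):
        # Negotiated per-SMS cost, drawn from an assumed triangular range.
        cost_per_sms = rng.triangular(0.005, 0.02, 0.01)
        # Contribution margin per user when users pay for every SMS.
        margin = (price_per_sms - cost_per_sms) * sms_per_user
        if margin > 0:
            breakeven_volumes.append(fixed_cost / margin)
    prob = sum(v <= users_2014 for v in breakeven_volumes) / n_trials
    return statistics.median(breakeven_volumes), prob

median_volume, p = breakeven_mc()
print(f"median break-even volume: {median_volume:,.0f} users; "
      f"P(break even at 2014 volume) = {p:.2f}")
```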

  19. A computer program for uncertainty analysis integrating regression and Bayesian methods

    USGS Publications Warehouse

    Lu, Dan; Ye, Ming; Hill, Mary C.; Poeter, Eileen P.; Curtis, Gary

    2014-01-01

    This work develops a new functionality in UCODE_2014 to evaluate Bayesian credible intervals using the Markov Chain Monte Carlo (MCMC) method. The MCMC capability in UCODE_2014 is based on the FORTRAN version of the differential evolution adaptive Metropolis (DREAM) algorithm of Vrugt et al. (2009), which estimates the posterior probability density function of model parameters in high-dimensional and multimodal sampling problems. The UCODE MCMC capability provides eleven prior probability distributions and three ways to initialize the sampling process. It evaluates parametric and predictive uncertainties and it has parallel computing capability based on multiple chains to accelerate the sampling process. This paper tests and demonstrates the MCMC capability using a 10-dimensional multimodal mathematical function, a 100-dimensional Gaussian function, and a groundwater reactive transport model. The use of the MCMC capability is made straightforward and flexible by adopting the JUPITER API protocol. With the new MCMC capability, UCODE_2014 can be used to calculate three types of uncertainty intervals, which all can account for prior information: (1) linear confidence intervals which require linearity and Gaussian error assumptions and typically 10s–100s of highly parallelizable model runs after optimization, (2) nonlinear confidence intervals which require a smooth objective function surface and Gaussian observation error assumptions and typically 100s–1,000s of partially parallelizable model runs after optimization, and (3) MCMC Bayesian credible intervals which require few assumptions and commonly 10,000s–100,000s or more partially parallelizable model runs. Ready access allows users to select methods best suited to their work, and to compare methods in many circumstances.
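
    A minimal random-walk Metropolis sampler illustrates the kind of Bayesian credible interval the MCMC capability produces. UCODE_2014 itself uses the multi-chain DREAM algorithm with differential-evolution proposals, so this single-chain sketch is a simplified stand-in, and the data are invented.

```python
import numpy as np

def metropolis_credible_interval(log_post, x0, n_samples=50_000,
                                 step=0.5, burn=5_000, seed=0):
    """Random-walk Metropolis sampler for a scalar parameter;
    returns a 95% Bayesian credible interval from the posterior draws.
    """
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x, lp = x0, log_post(x0)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()
        lp_prop = log_post(prop)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        samples[i] = x
    return np.percentile(samples[burn:], [2.5, 97.5])

# Toy posterior: mean of Gaussian data (variance 0.25), flat prior.
data = np.array([9.8, 10.4, 10.1, 9.6, 10.2])
log_post = lambda mu: -0.5 * np.sum((data - mu) ** 2) / 0.25
print(metropolis_credible_interval(log_post, x0=10.0))
```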

  20. Measurement uncertainty evaluation of conicity error inspected on CMM

    NASA Astrophysics Data System (ADS)

    Wang, Dongxia; Song, Aiguo; Wen, Xiulan; Xu, Youxiong; Qiao, Guifang

    2016-01-01

    The cone is widely used in mechanical design for rotation, centering and fixing. Whether the conicity error can be measured and evaluated accurately directly influences assembly accuracy and working performance. According to the new generation geometrical product specification (GPS), the error and its measurement uncertainty should be evaluated together. The mathematical model of the minimum zone conicity error is established and an improved immune evolutionary algorithm (IIEA) is proposed to search for the conicity error. In the IIEA, initial antibodies are first generated by using quasi-random sequences and two kinds of affinities are calculated. Then, antibody clones are generated and self-adaptively mutated so as to maintain diversity; similar antibodies are suppressed and new random antibodies are generated. Because the mathematical model of conicity error is strongly nonlinear and the input quantities are not independent, it is difficult to use the Guide to the expression of uncertainty in measurement (GUM) method to evaluate measurement uncertainty. An adaptive Monte Carlo method (AMCM) is proposed to estimate measurement uncertainty, in which the number of Monte Carlo trials is selected adaptively and the quality of the numerical results is directly controlled. The cone part was machined on a CK6140 lathe and measured on a Miracle NC 454 Coordinate Measuring Machine (CMM). The experimental results confirm that the proposed method not only can search for the approximate solution of the minimum zone conicity error (MZCE) rapidly and precisely, but also can evaluate measurement uncertainty and give control variables with an expected numerical tolerance. The conicity errors computed by the proposed method are 20%-40% less than those computed by the NC454 CMM software and the evaluation accuracy improves significantly.
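
    The adaptive part of an AMCM run can be sketched in a few lines: trials are added in blocks until the numerical standard errors of the estimates stabilize within a tolerance, in the spirit of GUM Supplement 1. The measurement model and input uncertainties below are hypothetical stand-ins, not the paper's conicity model.

```python
import numpy as np

def adaptive_mc(measurement_model, draw_inputs, tol=1e-4,
                block=10_000, max_blocks=100, seed=0):
    """Adaptive Monte Carlo sketch: keep adding blocks of trials until
    the standard errors of the estimated mean and standard uncertainty
    fall below a numerical tolerance. Illustrative only."""
    rng = np.random.default_rng(seed)
    means, stds, samples = [], [], []
    for _ in range(max_blocks):
        x = draw_inputs(rng, block)        # (block, n_inputs) array
        y = measurement_model(x)
        samples.append(y)
        means.append(y.mean())
        stds.append(y.std(ddof=1))
        if len(means) >= 2:
            se_mean = np.std(means, ddof=1) / np.sqrt(len(means))
            se_std = np.std(stds, ddof=1) / np.sqrt(len(stds))
            if max(se_mean, se_std) < tol:
                break
    all_y = np.concatenate(samples)
    return all_y.mean(), all_y.std(ddof=1)

# Toy model: cone half-angle from two diameters and a length, with
# assumed Gaussian input uncertainties (all values hypothetical).
def model(x):
    d1, d2, L = x[:, 0], x[:, 1], x[:, 2]
    return np.degrees(np.arctan((d1 - d2) / (2 * L)))

draw = lambda rng, n: np.column_stack([
    rng.normal(30.00, 0.002, n),   # large diameter / mm
    rng.normal(20.00, 0.002, n),   # small diameter / mm
    rng.normal(50.00, 0.005, n)])  # length / mm

mean_angle, u_angle = adaptive_mc(model, draw)
print(f"half-angle = {mean_angle:.4f} deg, u = {u_angle:.4f} deg")
```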

  1. Optimal Decisions for Organ Exchanges in a Kidney Paired Donation Program.

    PubMed

    Li, Yijiang; Song, Peter X-K; Zhou, Yan; Leichtman, Alan B; Rees, Michael A; Kalbfleisch, John D

    2014-05-01

    The traditional concept of barter exchange in economics has been extended in the modern era to the area of living-donor kidney transplantation, where one incompatible donor-candidate pair is matched to another pair with a complementary incompatibility, such that the donor from one pair gives an organ to a compatible candidate in the other pair and vice versa. Kidney paired donation (KPD) programs provide a unique and important platform for living incompatible donor-candidate pairs to exchange organs in order to achieve mutual benefit. In this paper, we propose novel organ allocation strategies to arrange kidney exchanges under uncertainty, with advantages including (i) allowance for a general utility-based evaluation of potential kidney transplants and an explicit consideration of stochastic features inherent in a KPD program; and (ii) exploitation of possible alternative exchanges when the originally planned allocation cannot be fully executed. The allocation strategy is implemented using an integer programming (IP) formulation, and its implications are assessed via a data-based simulation system by tracking an evolving KPD program over a series of match runs. Extensive simulation studies are provided to illustrate our proposed approach.
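
    The combinatorial core of a KPD match run can be illustrated with two-way exchanges only: an exchange between pairs i and j is feasible when each donor is compatible with the other pair's candidate, and the optimizer picks disjoint exchanges maximizing total utility. The brute-force search below is a toy stand-in for the paper's IP formulation; all pair data and utilities are invented.

```python
from itertools import combinations

def best_two_way_exchanges(pairs, compatible, utility):
    """Brute-force selection of disjoint two-way kidney exchanges that
    maximize total utility. Real KPD match runs use integer programming
    over cycles and chains; this shows the combinatorial structure only.

    pairs      : list of pair ids
    compatible : set of (i, j), meaning donor of pair i can give to the
                 candidate of pair j
    utility    : dict mapping each feasible exchange (i, j) to a score
    """
    feasible = [(i, j) for i, j in combinations(pairs, 2)
                if (i, j) in compatible and (j, i) in compatible]

    best_score, best_set = 0.0, []
    def search(remaining, chosen, score):
        nonlocal best_score, best_set
        if score > best_score:
            best_score, best_set = score, list(chosen)
        for k, (i, j) in enumerate(remaining):
            rest = [e for e in remaining[k + 1:]
                    if i not in e and j not in e]  # keep exchanges disjoint
            search(rest, chosen + [(i, j)], score + utility[(i, j)])
    search(feasible, [], 0.0)
    return best_set, best_score

pairs = [1, 2, 3, 4]
compatible = {(1, 2), (2, 1), (3, 4), (4, 3), (1, 4), (4, 1)}
utility = {(1, 2): 2.0, (3, 4): 1.5, (1, 4): 1.8}
print(best_two_way_exchanges(pairs, compatible, utility))
# -> ([(1, 2), (3, 4)], 3.5): the disjoint pairing beats the single
#    higher-utility (1, 4) exchange.
```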

  2. The atmospheric effects of stratospheric aircraft: A third program report

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S. (Editor); Wesoky, Howard L. (Editor)

    1993-01-01

    A third report from the Atmospheric Effects of Stratospheric Aircraft (AESA) component of NASA's High-Speed Research Program (HSRP) is presented. Market and technology considerations continue to provide an impetus for high-speed civil transport research. A recent United Nations Environment Program scientific assessment showed that considerable uncertainty still exists about the possible impact of aircraft on the atmosphere. The AESA was designed to develop the body of scientific knowledge necessary for evaluating the impact of stratospheric aircraft on the atmosphere. The first program report presented the basic objectives and plans for AESA. This third report marks the midpoint of the program and presents the status of the ongoing research on the impact of stratospheric aircraft on the atmosphere as reported at the third annual AESA program meeting in June 1993. The focus of the program is on predicted atmospheric changes resulting from projected HSCT emissions. Topics covered include how high-speed civil transports (HSCT) might affect stratospheric ozone, emissions scenarios and databases to assess potential atmospheric effects from HSCTs, calculated results from 2-D zonal mean models using emissions data, engine trace constituent measurements, and exhaust plume/aircraft wake vortex interactions.

  3. Two-stage fuzzy-stochastic robust programming: a hybrid model for regional air quality management.

    PubMed

    Li, Yongping; Huang, Guo H; Veawab, Amornvadee; Nie, Xianghui; Liu, Lei

    2006-08-01

    In this study, a hybrid two-stage fuzzy-stochastic robust programming (TFSRP) model is developed and applied to the planning of an air-quality management system. As an extension of existing fuzzy-robust programming and two-stage stochastic programming methods, the TFSRP can explicitly address complexities and uncertainties of the study system without unrealistic simplifications. Uncertain parameters can be expressed as probability density and/or fuzzy membership functions, such that robustness of the optimization efforts can be enhanced. Moreover, economic penalties as corrective measures against any infeasibilities arising from the uncertainties are taken into account. This method can, thus, provide a linkage to predefined policies determined by authorities that have to be respected when a modeling effort is undertaken. In its solution algorithm, the fuzzy decision space can be delimited through specification of the uncertainties using dimensional enlargement of the original fuzzy constraints. The developed model is applied to a case study of regional air quality management. The results indicate that reasonable solutions have been obtained. The solutions can be used for further generating pollution-mitigation alternatives with minimized system costs and for providing a more solid support for sound environmental decisions.
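
    The two-stage structure underneath the TFSRP (first-stage decisions now, recourse penalties once the random outcome is known) can be shown with an ordinary two-stage stochastic linear program; the fuzzy and robust extensions are omitted, and all scenario probabilities and costs below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Two-stage stochastic sketch of an emission-control decision: choose
# first-stage abatement capacity x, then a recourse penalty q_s in each
# demand scenario s. All coefficients are hypothetical.
scenarios = [(0.3, 60.0), (0.5, 80.0), (0.2, 110.0)]  # (prob, required abatement)
c_capacity, c_penalty = 10.0, 25.0   # unit cost of capacity vs. penalty

# Decision vector [x, q_1, q_2, q_3]; minimize
#   c_capacity*x + sum_s p_s * c_penalty * q_s
c = np.array([c_capacity] + [p * c_penalty for p, _ in scenarios])

# Per-scenario constraint x + q_s >= demand_s, written as -x - q_s <= -demand_s.
A_ub, b_ub = [], []
for s, (_, demand) in enumerate(scenarios):
    row = np.zeros(len(c))
    row[0] = -1.0
    row[1 + s] = -1.0
    A_ub.append(row)
    b_ub.append(-demand)

res = linprog(c, A_ub=np.array(A_ub), b_ub=b_ub,
              bounds=[(0, None)] * len(c))
print(f"first-stage capacity: {res.x[0]:.1f}, "
      f"expected total cost: {res.fun:.1f}")
# The optimum sizes capacity for the middle scenario (x = 80) and pays
# the penalty only in the rare high-demand scenario.
```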

  4. The Second International Piping Integrity Research Group (IPIRG-2) program. Final report, October 1991--April 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hopper, A.; Wilkowski, G.; Scott, P.

    1997-03-01

    The IPIRG-2 program was an international group program managed by the US NRC and funded by organizations from 15 nations. The emphasis of the IPIRG-2 program was the development of data to verify fracture analyses for cracked pipes and fittings subjected to dynamic/cyclic load histories typical of seismic events. The scope included: (1) the study of more complex dynamic/cyclic load histories, i.e., multi-frequency, variable amplitude, simulated seismic excitations, than those considered in the IPIRG-1 program, (2) crack sizes more typical of those considered in Leak-Before-Break (LBB) and in-service flaw evaluations, (3) through-wall-cracked pipe experiments which can be used to validate LBB-type fracture analyses, (4) cracks in and around pipe fittings, such as elbows, and (5) laboratory specimen and separate-effect pipe experiments to provide better insight into the effects of dynamic and cyclic load histories. Also undertaken were an uncertainty analysis to identify the issues most important for LBB or in-service flaw evaluations, updating of computer codes and databases, the development and conduct of a series of round-robin analyses, and analysts' group meetings to provide a forum for nuclear piping experts from around the world to exchange information on the subject of pipe fracture technology. 17 refs., 104 figs., 41 tabs.

  5. Comprehensive Approach to Verification and Validation of CFD Simulations Applied to Backward Facing Step-Application of CFD Uncertainty Analysis

    NASA Technical Reports Server (NTRS)

    Groves, Curtis E.; LLie, Marcel; Shallhorn, Paul A.

    2012-01-01

    There are inherent uncertainties and errors associated with using Computational Fluid Dynamics (CFD) to predict the flow field, and there is no standard method for evaluating uncertainty in the CFD community. This paper describes an approach to validate the uncertainty in using CFD. The method uses state-of-the-art uncertainty analysis, applying different turbulence models, and draws conclusions on which models provide the least uncertainty and which models most accurately predict the flow of a backward facing step.

  6. A stochastic method to characterize model uncertainty for a Nutrient TMDL

    USDA-ARS?s Scientific Manuscript database

    The U.S. EPA’s Total Maximum Daily Load (TMDL) program has encountered resistance in its implementation partly because of its strong dependence on mathematical models to set limitations on the release of impairing substances. The uncertainty associated with predictions of such models is often not s...

  7. Self-Efficacy for Resolving Environmental Uncertainties: Implications for Entrepreneurial Educational and Support Programs

    ERIC Educational Resources Information Center

    Pushkarskaya, Helen; Usher, Ellen L.

    2010-01-01

    Using a unique sample of rural Kentucky residents, we demonstrated that, in the domain of operational and competitive environmental uncertainties, self-efficacy beliefs are significantly higher among nascent entrepreneurs than among non-entrepreneurs. We employed hierarchical logistic regression analysis to demonstrate that this result is…

  8. Solid Waste Program technical baseline description

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carlson, A.B.

    1994-07-01

    The system engineering approach has been taken to describe the technical baseline under which the Solid Waste Program is currently operating. The document contains a mission analysis, function analysis, system definition, documentation requirements, facility and project bases, and uncertainties facing the program.

  9. Decision analysis for conservation breeding: Maximizing production for reintroduction of whooping cranes

    USGS Publications Warehouse

    Smith, Des H.V.; Converse, Sarah J.; Gibson, Keith; Moehrenschlager, Axel; Link, William A.; Olsen, Glenn H.; Maguire, Kelly

    2011-01-01

    Captive breeding is key to management of severely endangered species, but maximizing captive production can be challenging because of poor knowledge of species breeding biology and the complexity of evaluating different management options. In the face of uncertainty and complexity, decision-analytic approaches can be used to identify optimal management options for maximizing captive production. Building decision-analytic models requires iterations of model conception, data analysis, model building and evaluation, identification of remaining uncertainty, further research and monitoring to reduce uncertainty, and integration of new data into the model. We initiated such a process to maximize captive production of the whooping crane (Grus americana), the world's most endangered crane, which is managed through captive breeding and reintroduction. We collected 15 years of captive breeding data from 3 institutions and used Bayesian analysis and model selection to identify predictors of whooping crane hatching success. The strongest predictor, and that with clear management relevance, was incubation environment. The incubation period of whooping crane eggs is split across two environments: crane nests and artificial incubators. Although artificial incubators are useful for allowing breeding pairs to produce multiple clutches, our results indicate that crane incubation is most effective at promoting hatching success. Hatching probability increased the longer an egg spent in a crane nest, from 40% hatching probability for eggs receiving 1 day of crane incubation to 95% for those receiving 30 days (time incubated in each environment varied independently of total incubation period). Because birds will lay fewer eggs when they are incubating longer, a tradeoff exists between the number of clutches produced and egg hatching probability. We developed a decision-analytic model that estimated 16 to be the optimal number of days of crane incubation needed to maximize the number of offspring produced. These results show that using decision-analytic tools to account for uncertainty in captive breeding can improve the rate at which such programs contribute to wildlife reintroductions. 
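
    The tradeoff behind the optimization can be sketched numerically: expected offspring is the product of clutches produced and hatching probability, with the former falling and the latter rising in days of crane incubation. The hatching endpoints (40% at day 1, 95% at day 30) come from the study; the linear interpolation and the clutch-production curve are made-up assumptions, which is why this toy version lands on a different optimum than the paper's decision-analytic model (16 days).

```python
import numpy as np

days = np.arange(1, 31)

# Hatching probability interpolated between the reported endpoints.
hatch = 0.40 + (0.95 - 0.40) * (days - 1) / (30 - 1)

# Assumed tradeoff: pairs produce fewer clutches the longer they
# incubate (hypothetical linear decline from 3.0 to 1.0 clutches).
clutches = 3.0 - 2.0 * (days - 1) / (30 - 1)

expected_chicks = clutches * hatch
best = days[np.argmax(expected_chicks)]
print(f"optimal crane-incubation days under these assumptions: {best}")
```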

  10. Fostering Environmental Literacy For A Changing Earth: Interactive and Participatory Outreach Programs at Biosphere 2

    NASA Astrophysics Data System (ADS)

    Pavao-Zuckerman, M.; Huxman, T.; Morehouse, B.

    2008-12-01

    Earth system and ecological sustainability problems are complex outcomes of biological, physical, social, and economic interactions. A common goal of outreach and education programs is to foster a scientifically literate community that possesses the knowledge to contribute to environmental policies and decision making. The uncertainty and variability inherent in Earth system and ecological sciences can confound such goals of improved ecological literacy. Public programs provide an opportunity to engage lay-persons in the scientific method, allowing them to experience science in action and confront these uncertainties head-on. We begin with a definition of scientific literacy that expands its conceptualization of science beyond just a collection of facts and concepts to one that views science as a process to aid understanding of natural phenomena. A process-based scientific literacy allows the public, teachers, and students to assimilate new information, evaluate climate research, and ultimately make decisions that are informed by science. The Biosphere 2 facility (B2) is uniquely suited for such outreach programs because it links Earth system and ecological science research activities in a large-scale controlled environment setting with outreach and education opportunities. A primary outreach goal is to demonstrate science in action to an audience that ranges from K-12 groups to retired citizens. Here we discuss approaches to outreach programs that focus on soil-water-atmosphere-plant interactions and their roles in the impacts and causes of global environmental change. We describe a suite of programs designed to vary the amount of participation a visitor has with the science process (from passive learning to data collection to helping design experiments) to test the hypothesis that active learning fosters increased scientific literacy and the creation of science advocates. We argue that a revised framing of the scientific method, with a more open role for citizens in science, will have greater success in fostering science literacy and produce a citizenry that is equipped to tackle complex environmental decision making.

  11. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  12. Helium Mass Spectrometer Leak Detection: A Method to Quantify Total Measurement Uncertainty

    NASA Technical Reports Server (NTRS)

    Mather, Janice L.; Taylor, Shawn C.

    2015-01-01

    In applications where leak rates of components or systems are evaluated against a leak rate requirement, the uncertainty of the measured leak rate must be included in the reported result. However, in the helium mass spectrometer leak detection method, the sensitivity, or resolution, of the instrument is often the only component of the total measurement uncertainty noted when reporting results. To address this shortfall, a measurement uncertainty analysis method was developed that includes the leak detector unit's resolution, repeatability, hysteresis, and drift, along with the uncertainty associated with the calibration standard. In a step-wise process, the method identifies the bias and precision components of the calibration standard, the measurement correction factor (K-factor), and the leak detector unit. Together these individual contributions to error are combined and the total measurement uncertainty is determined using the root-sum-square method. It was found that the precision component contributes more to the total uncertainty than the bias component, but the bias component is not insignificant. For helium mass spectrometer leak rate tests where unit sensitivity alone is not enough, a thorough evaluation of the measurement uncertainty such as the one presented herein should be performed and reported along with the leak rate value.
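
    The root-sum-square combination at the heart of the method is a one-liner. In the sketch below, each standard uncertainty component is a hypothetical value in leak-rate units; the full analysis described above also separates bias and precision contributions and folds in the K-factor, which are omitted here.

```python
import math

def combined_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainty
    components. `components` maps a label to a standard uncertainty in
    the units of the measured leak rate (values here are hypothetical).
    """
    return math.sqrt(sum(u ** 2 for u in components.values()))

# Hypothetical standard uncertainties for a leak test, in scc/s.
components = {
    "resolution": 2.0e-9,
    "repeatability": 5.0e-9,
    "hysteresis": 1.5e-9,
    "drift": 3.0e-9,
    "calibration_standard": 4.0e-9,
}
u_c = combined_uncertainty(components)
print(f"combined standard uncertainty: {u_c:.2e} scc/s")
print(f"expanded uncertainty (k=2):    {2 * u_c:.2e} scc/s")
```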

  13. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  14. A Framework for Quantifying Measurement Uncertainties and Uncertainty Propagation in HCCI/LTGC Engine Experiments

    DOE PAGES

    Petitpas, Guillaume; McNenly, Matthew J.; Whitesides, Russell A.

    2017-03-28

    In this study, a framework for estimating experimental measurement uncertainties for a Homogenous Charge Compression Ignition (HCCI)/Low-Temperature Gasoline Combustion (LTGC) engine testing facility is presented. Detailed uncertainty quantification is first carried out for the measurement of the in-cylinder pressure, whose variations during the cycle provide most of the information for performance evaluation. Standard uncertainties of other measured quantities, such as the engine geometry and speed, the air and fuel flow rate and the intake/exhaust dry molar fractions are also estimated. Propagating those uncertainties using a Monte Carlo simulation and Bayesian inference methods then allows for estimation of uncertainties of the mass-average temperature and composition at IVC and throughout the cycle; and also of the engine performances such as gross Integrated Mean Effective Pressure, Heat Release and Ringing Intensity. Throughout the analysis, nominal values for uncertainty inputs were taken from a well-characterized engine test facility. However, the analysis did not take into account the calibration practice of experiments run in that facility and the resulting uncertainty values are therefore not indicative of the expected accuracy of those experimental results. A future study will employ the methodology developed here to explore the effects of different calibration methods on the different uncertainty values in order to evaluate best practices for accurate engine measurements.

  15. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    PubMed

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality for the absolute value of the velocity to the power of 3/2 and an indirect proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.
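
    Written out, the scaling reported in the abstract reads (proportionality only; the prefactor from the full derivation is omitted here):

\[
\sigma_v \;\propto\; \frac{|v|^{3/2}}{\sqrt{P_s}},
\]

    where \(\sigma_v\) is the velocity measurement uncertainty, \(|v|\) the absolute value of the velocity, and \(P_s\) the scattered light power.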

  16. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty.

    PubMed

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and people pay more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome these drawbacks (incomplete information and uncertain travel times), this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle only follows the optimal path from the emergency logistics center to the affected point, and solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method.
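
    The grey-theory path-evaluation step can be illustrated with classic grey relational analysis: score each candidate path against an ideal reference series and rank by relational grade. The criteria, scores, and distinguishing coefficient below are invented for illustration; the paper's treatment of grey path information is more elaborate.

```python
import numpy as np

def grey_relational_grades(paths, reference=None, rho=0.5):
    """Grey relational analysis sketch for ranking candidate paths.

    `paths` is an (n_paths, n_criteria) array of benefit-type scores
    already normalized to [0, 1]; the reference series defaults to the
    per-criterion ideal. rho is the distinguishing coefficient.
    """
    paths = np.asarray(paths, dtype=float)
    if reference is None:
        reference = paths.max(axis=0)          # ideal path
    delta = np.abs(paths - reference)          # deviation sequences
    d_min, d_max = delta.min(), delta.max()
    coeff = (d_min + rho * d_max) / (delta + rho * d_max)
    return coeff.mean(axis=1)                  # relational grade per path

# Rows: candidate paths; columns: normalized reliability, road-condition,
# and speed scores (all hypothetical).
scores = [[0.9, 0.6, 0.8],
          [0.7, 0.9, 0.6],
          [0.5, 0.5, 0.9]]
grades = grey_relational_grades(scores)
print("relational grades:", np.round(grades, 3),
      "-> best path:", int(np.argmax(grades)))
```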

  17. Grey-Theory-Based Optimization Model of Emergency Logistics Considering Time Uncertainty

    PubMed Central

    Qiu, Bao-Jian; Zhang, Jiang-Hua; Qi, Yuan-Tao; Liu, Yang

    2015-01-01

    Natural disasters have occurred frequently in recent years, causing huge casualties and property losses, and people pay more and more attention to emergency logistics problems. This paper studies the emergency logistics problem with multiple centers, multiple commodities, and a single affected point. Considering that paths near the disaster point may be damaged, that information on the state of the paths is incomplete, and that travel times are uncertain, we establish a nonlinear programming model whose objective function maximizes the time-satisfaction degree. To overcome these drawbacks (incomplete information and uncertain travel times), this paper first evaluates the multiple roads of the transportation network based on grey theory and selects the reliable and optimal path. The original model is then simplified under the scenario that the vehicle only follows the optimal path from the emergency logistics center to the affected point, and solved with the Lingo software. Numerical experiments are presented to show the feasibility and effectiveness of the proposed method. PMID:26417946

  18. Diving into the consumer nutrition environment: A Bayesian spatial factor analysis of neighborhood restaurant environment.

    PubMed

    Luan, Hui; Law, Jane; Lysy, Martin

    2018-02-01

    Neighborhood restaurant environment (NRE) plays a vital role in shaping residents' eating behaviors. While NRE 'healthfulness' is a multi-facet concept, most studies evaluate it based only on restaurant type, thus largely ignoring variations of in-restaurant features. In the few studies that do account for such features, healthfulness scores are simply averaged over accessible restaurants, thereby concealing any uncertainty attributable to neighborhood size or spatial correlation. To address these limitations, this paper presents a Bayesian spatial factor analysis for assessing NRE healthfulness in the city of Kitchener, Canada. Several in-restaurant characteristics are included. By treating NRE healthfulness as a spatially correlated latent variable, the adopted modeling approach can: (i) identify specific indicators most relevant to NRE healthfulness, (ii) provide healthfulness estimates for neighborhoods without accessible restaurants, and (iii) readily quantify uncertainties in the healthfulness index. Implications of the analysis for intervention program development and community food planning are discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1988-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach was developed to quantify the effects of the random uncertainties. The results indicate that only the variations in geometry have significant effects.

  20. MODFLOW-2000, the U.S. Geological Survey modular ground-water model -- Documentation of MOD-PREDICT for predictions, prediction sensitivity analysis, and evaluation of uncertainty

    USGS Publications Warehouse

    Tonkin, M.J.; Hill, Mary C.; Doherty, John

    2003-01-01

    This document describes the MOD-PREDICT program, which helps evaluate user-defined sets of observations, prior information, and predictions, using the ground-water model MODFLOW-2000. MOD-PREDICT takes advantage of the existing Observation and Sensitivity Processes (Hill and others, 2000) by initiating runs of MODFLOW-2000 and using the output files produced. The names and formats of the MODFLOW-2000 input files are unchanged, such that full backward compatibility is maintained. A new name file and input files are required for MOD-PREDICT. The performance of MOD-PREDICT has been tested in a variety of applications. Future applications, however, might reveal errors that were not detected in the test simulations. Users are requested to notify the U.S. Geological Survey of any errors found in this document or the computer program using the email address available at the web address below. Updates might occasionally be made to this document, to the MOD-PREDICT program, and to MODFLOW-2000. Users can check for updates on the Internet at URL http://water.usgs.gov/software/ground_water.html.

  1. Sustainable Cost Models for mHealth at Scale: Modeling Program Data from m4RH Tanzania

    PubMed Central

    Mangone, Emily R.; Agarwal, Smisha; L’Engle, Kelly; Lasway, Christine; Zan, Trinity; van Beijma, Hajo; Orkis, Jennifer; Karam, Robert

    2016-01-01

    Background There is increasing evidence that mobile phone health interventions (“mHealth”) can improve health behaviors and outcomes and are critically important in low-resource, low-access settings. However, the majority of mHealth programs in developing countries fail to reach scale. One reason may be the challenge of developing financially sustainable programs. The goal of this paper is to explore strategies for mHealth program sustainability and develop cost-recovery models for program implementers using 2014 operational program data from Mobile for Reproductive Health (m4RH), a national text-message (SMS) based health communication service in Tanzania. Methods We delineated 2014 m4RH program costs and considered three strategies for cost-recovery for the m4RH program: user pay-for-service, SMS cost reduction, and strategic partnerships. These inputs were used to develop four different cost-recovery scenarios. The four scenarios leveraged strategic partnerships to reduce per-SMS program costs and create per-SMS program revenue and varied the structure for user financial contribution. Finally, we conducted break-even and uncertainty analyses to evaluate the costs and revenues of these models at the 2014 user volume (125,320) and at any possible break-even volume. Results In three of four scenarios, costs exceeded revenue by $94,596, $34,443, and $84,571 at the 2014 user volume. However, these costs represented large reductions (54%, 83%, and 58%, respectively) from the 2014 program cost of $203,475. Scenario four, in which the lowest per-SMS rate ($0.01 per SMS) was negotiated and users paid for all m4RH SMS sent or received, achieved a $5,660 profit at the 2014 user volume. A Monte Carlo uncertainty analysis demonstrated that break-even points were driven by user volume rather than variations in program costs. Conclusions These results reveal that breaking even was only probable when all SMS costs were transferred to users and the lowest per-SMS cost was negotiated with telecom partners. While this strategy was sustainable for the implementer, a central concern is that health information may not reach those who are too poor to pay, limiting the program’s reach and impact. Incorporating strategies presented here may make mHealth programs more appealing to funders and investors but need further consideration to balance sustainability, scale, and impact. PMID:26824747

  2. The Effect of Uncertainty Management Program on Quality of Life Among Vietnamese Women at 3 Weeks Postmastectomy.

    PubMed

    Ha, Xuan Thi Nhu; Thanasilp, Sureeporn; Thato, Ratsiri

    2018-05-10

    In Vietnam, breast cancer is a top contributor to cancer-related deaths in women. Evidence shows that, after mastectomy, women in Vietnam have a lower quality of life than women in other countries. In addition, high uncertainty is a predictor of low quality of life postmastectomy. Therefore, if nurses can manage uncertainty, quality of life postmastectomy can improve. This study examined the effect of the Uncertainty Management Program (UMP) on quality of life at 3 weeks postmastectomy in Vietnamese women. This research was a quasi-experimental study using a "posttest only with control group" design. There were 115 subjects assigned to either the experimental group (n = 57), who participated in the UMP and routine care, or the control group (n = 58), who received only routine care. Participants were assessed twice postmastectomy using the modified Quality of Life Index Scale-Vietnamese version. The experimental group exhibited low uncertainty before discharge and significantly higher quality of life than the control group at 1 and 3 weeks postmastectomy (P < .05). Women's physical well-being, psychological well-being, body image, and social concerns improved significantly with the UMP. The UMP is a promising program that might benefit the quality of life of women with breast cancer at 3 weeks postmastectomy, and it appears feasible to apply in various settings. Nurses can flexibly instruct women in holistic care both in the hospital and at home.

  3. A facility location model for municipal solid waste management system under uncertain environment.

    PubMed

    Yadav, Vinay; Bhurjee, A K; Karmakar, Subhankar; Dikshit, A K

    2017-12-15

    In a municipal solid waste management system, decision makers have to develop insight into the processes of waste generation, collection, transportation, processing, and disposal. Many parameters in this system (e.g., waste generation rate, functioning costs of facilities, transportation cost, and revenues) are associated with uncertainties. Often, these parameter uncertainties must be modeled under data scarcity, with only information on extreme variations available, which hampers generating the probability distribution functions or membership functions required by stochastic or fuzzy mathematical programming, respectively. Moreover, if uncertainties are ignored, problems such as insufficient capacities of waste management facilities or improper utilization of available funds may arise. To tackle the uncertainties of these parameters more efficiently, an algorithm based on interval analysis has been developed. This algorithm is applied to find optimal solutions for a facility location model, which is formulated to select the economically best locations of transfer stations in a hypothetical urban center. Transfer stations are an integral part of contemporary municipal solid waste management systems, and economic siting of transfer stations ensures financial sustainability of the system. The model is written in the mathematical programming language AMPL with KNITRO as a solver. The developed model selects the five economically best locations out of ten potential locations, with an optimum overall cost of approximately [394,836, 757,440] Rs./day ([5906, 11,331] USD/day). Further, the requirement for uncertainty modeling is explained based on the results of a sensitivity analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
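
    The interval-analysis idea is simply to carry the extreme variations through the cost model, so every derived quantity is itself an interval. A minimal sketch, with invented cost figures rather than the paper's data:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """Closed interval [lo, hi] for interval-arithmetic sketches."""
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        products = (self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi)
        return Interval(min(products), max(products))

# Hypothetical daily cost of one transfer station: fixed operating cost
# plus haul cost = unit rate x tonnage, all given as uncertain intervals.
operating = Interval(120_000, 220_000)        # Rs./day
haul_rate = Interval(450, 700)                # Rs./tonne
tonnage = Interval(400, 600)                  # tonnes/day

total = operating + haul_rate * tonnage
print(f"daily cost interval: [{total.lo:,.0f}, {total.hi:,.0f}] Rs./day")
```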

  4. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    PubMed

    Li, Y P; Huang, G H

    2010-09-15

    Considerable public concern has been raised in the past decades, since the large amount of pollutant emissions from municipal solid waste (MSW) disposal processes poses risks to the surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of constraints and the objective function. An interactive algorithm is provided for solving the IBPP problem, which does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning a MSW management system, where the mixed integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis for decisions of timing, sizing and siting in terms of capacity expansion for waste-management facilities. Three cases based on different waste-management policies are examined. The results obtained indicate that inclusion of environmental impacts in the optimization model can change the traditional waste-allocation pattern based merely on the economic-oriented planning approach. The results can help identify desired alternatives for managing MSW, with the advantage of providing compromise schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.

  5. Multi-objective calibration and uncertainty analysis of hydrologic models; A comparative study between formal and informal methods

    NASA Astrophysics Data System (ADS)

    Shafii, M.; Tolson, B.; Matott, L. S.

    2012-04-01

    Hydrologic modeling has benefited from significant developments over the past two decades. This has resulted in building of higher levels of complexity into hydrologic models, which eventually makes the model evaluation process (parameter estimation via calibration and uncertainty analysis) more challenging. In order to avoid unreasonable parameter estimates, many researchers have suggested implementation of multi-criteria calibration schemes. Furthermore, for predictive hydrologic models to be useful, proper consideration of uncertainty is essential. Consequently, recent research has emphasized comprehensive model assessment procedures in which multi-criteria parameter estimation is combined with statistically-based uncertainty analysis routines such as Bayesian inference using Markov Chain Monte Carlo (MCMC) sampling. Such a procedure relies on the use of formal likelihood functions based on statistical assumptions, and moreover, the Bayesian inference structured on MCMC samplers requires a considerably large number of simulations. Due to these issues, especially in complex non-linear hydrological models, a variety of alternative informal approaches have been proposed for uncertainty analysis in the multi-criteria context. This study aims at exploring a number of such informal uncertainty analysis techniques in multi-criteria calibration of hydrological models. The informal methods addressed in this study are (i) Pareto optimality which quantifies the parameter uncertainty using the Pareto solutions, (ii) DDS-AU which uses the weighted sum of objective functions to derive the prediction limits, and (iii) GLUE which describes the total uncertainty through identification of behavioral solutions. The main objective is to compare such methods with MCMC-based Bayesian inference with respect to factors such as computational burden, and predictive capacity, which are evaluated based on multiple comparative measures. The measures for comparison are calculated both for calibration and evaluation periods. The uncertainty analysis methodologies are applied to a simple 5-parameter rainfall-runoff model, called HYMOD.
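
    Of the informal methods listed, GLUE is the most compact to sketch: sample from the prior, score each parameter set with an informal likelihood, keep the 'behavioral' sets, and derive likelihood-weighted prediction limits. The toy linear model, threshold, and likelihood choice below are illustrative assumptions (GLUE deliberately leaves them to the analyst), not the HYMOD setup of the study.

```python
import numpy as np

def glue(simulate, observed, prior_sampler, n_samples=20_000,
         threshold=0.5, seed=0):
    """Minimal GLUE sketch using Nash-Sutcliffe efficiency as the
    informal likelihood; returns 5-95% prediction limits per time step
    and the number of behavioral parameter sets."""
    rng = np.random.default_rng(seed)
    thetas = prior_sampler(rng, n_samples)
    sims = np.array([simulate(t) for t in thetas])
    sse = np.sum((sims - observed) ** 2, axis=1)
    nse = 1.0 - sse / np.sum((observed - observed.mean()) ** 2)
    behavioral = nse > threshold
    w = nse[behavioral] / nse[behavioral].sum()   # likelihood weights
    lims = []
    for t in range(sims.shape[1]):
        order = np.argsort(sims[behavioral, t])
        cdf = np.cumsum(w[order])
        vals = sims[behavioral, t][order]
        lims.append((vals[np.searchsorted(cdf, 0.05)],
                     vals[np.searchsorted(cdf, 0.95)]))
    return np.array(lims), behavioral.sum()

# Toy 'model': y = a*x + b with two uncertain parameters.
x = np.linspace(0, 1, 20)
observed = 2.0 * x + 1.0 + np.random.default_rng(1).normal(0, 0.1, x.size)
simulate = lambda theta: theta[0] * x + theta[1]
prior = lambda rng, n: rng.uniform([0.0, 0.0], [4.0, 2.0], size=(n, 2))
limits, n_behavioral = glue(simulate, observed, prior)
print(f"{n_behavioral} behavioral sets; first-step 90% limits: {limits[0]}")
```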

  6. Dynamic analysis for solid waste management systems: an inexact multistage integer programming approach.

    PubMed

    Li, Yongping; Huang, Guohe

    2009-03-01

    In this study, a dynamic analysis approach based on an inexact multistage integer programming (IMIP) model is developed for supporting municipal solid waste (MSW) management under uncertainty. Techniques of interval-parameter programming and multistage stochastic programming are incorporated within an integer-programming framework. The developed IMIP can deal with uncertainties expressed as probability distributions and interval numbers, and can reflect the dynamics in terms of decisions for waste-flow allocation and facility-capacity expansion over a multistage context. Moreover, the IMIP can be used for analyzing various policy scenarios that are associated with different levels of economic consequences. The developed method is applied to a case study of long-term waste-management planning. The results indicate that reasonable solutions have been generated for binary and continuous variables. They can help generate desired decisions of system-capacity expansion and waste-flow allocation with a minimized system cost and maximized system reliability.

  7. Uncertainty-based Optimization Algorithms in Designing Fractionated Spacecraft

    PubMed Central

    Ning, Xin; Yuan, Jianping; Yue, Xiaokui

    2016-01-01

    A fractionated spacecraft is an innovative application of a distributed space system. To fully understand the impact of various uncertainties on its development, launch and in-orbit operation, we use the stochastic mission-cycle cost to comprehensively evaluate the survivability, flexibility, reliability and economy of the module-division schemes of different configurations of fractionated spacecraft. We systematically describe the concept, review the evaluation and optimal design methods of recent years, and propose the stochastic mission-cycle cost for comprehensive evaluation. We also establish models of the costs of module development, launch and deployment, and of the impacts of their respective uncertainties. Finally, we carry out Monte Carlo simulations of the complete mission-cycle costs of various configurations of the fractionated spacecraft under various uncertainties, and compare the probability density distributions and statistical characteristics of the stochastic mission-cycle cost under two strategies: timed module replacement and untimed module replacement. The simulation results verify the effectiveness of the comprehensive evaluation method and show that our evaluation method can comprehensively evaluate the adaptability of the fractionated spacecraft under different technical and mission conditions. PMID:26964755

  8. Selection of Representative Models for Decision Analysis Under Uncertainty

    NASA Astrophysics Data System (ADS)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed before an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute levels of the problem. The proposed technique was applied to two benchmark cases, and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.
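
    A simplified, single-variable sketch of the subset-selection idea, assuming a greedy search and a percentile-based mismatch function in place of the paper's full representativeness function; all names and numbers are illustrative:

        import numpy as np

        rng = np.random.default_rng(1)
        npv = rng.lognormal(3.0, 0.4, 500)   # hypothetical NPV of 500 scenarios

        def curve_mismatch(subset, full, q=np.linspace(5, 95, 19)):
            # Distance between the risk curves (percentiles) of subset and full set
            return np.abs(np.percentile(subset, q) - np.percentile(full, q)).mean()

        # Greedy forward selection of k representative scenarios
        k, chosen = 9, []
        for _ in range(k):
            best = min(
                (i for i in range(len(npv)) if i not in chosen),
                key=lambda i: curve_mismatch(npv[chosen + [i]], npv),
            )
            chosen.append(best)
        print("representative scenario indices:", chosen)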

  9. Evaluation of Spacecraft Shielding Effectiveness for Radiation Protection

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Wilson, John W.

    1999-01-01

    The potential for serious health risks from solar particle events (SPE) and galactic cosmic rays (GCR) is a critical issue in the NASA strategic plan for the Human Exploration and Development of Space (HEDS). Because of current uncertainties in radiation transmission properties and cancer biology, the excess launch cost of shielding against those uncertainties in GCR and SPE protection could be exceedingly large. The development of advanced shielding concepts is an important risk mitigation area with the potential to reduce risk significantly below conventional mission designs. A key issue in spacecraft material selection is understanding the effect of nuclear reactions on the transmission properties of materials. High-energy nuclear particles undergo nuclear reactions in passing through materials and tissue, altering their composition and producing new radiation types. Spacecraft and planetary habitat designers can use radiation transport codes to identify optimal materials for lowering exposures and to optimize spacecraft design to reduce astronaut exposures. Reaching these objectives will require providing design engineers with accurate databases and computationally efficient software for describing the transmission properties of space radiation in materials. Our program will reduce the uncertainty in the transmission properties of space radiation by improving the theoretical description of nuclear reactions and radiation transport, and will provide accurate physical descriptions of the track structure of microscopic energy deposition.

  10. Cost-utility analysis of screening for diabetic retinopathy in Japan: a probabilistic Markov modeling study.

    PubMed

    Kawasaki, Ryo; Akune, Yoko; Hiratsuka, Yoshimune; Fukuhara, Shunichi; Yamada, Masakazu

    2015-02-01

    To evaluate the cost-effectiveness of screening intervals longer than 1 year for detecting diabetic retinopathy (DR), through the estimation of incremental costs per quality-adjusted life year (QALY), based on the best available clinical data in Japan. A Markov model with a probabilistic cohort analysis was constructed to calculate incremental costs per QALY gained by implementing a DR screening program in Japan. A 1-year cycle length and a population of 50,000 with a 50-year time horizon (age 40-90 years) were used. The best available clinical data from publications and national surveillance data were used, and the model included current diagnosis and management of DR with corresponding visual outcomes. One-way and probabilistic sensitivity analyses were performed to account for uncertainties in the parameters. In the base-case analysis, the screening strategy resulted in an incremental cost of 5,147 Japanese yen (¥; US$64.6) and incremental effectiveness of 0.0054 QALYs per person screened. The incremental cost-effectiveness ratio was ¥944,981 (US$11,857) per QALY. The simulation suggested that screening would result in a significant reduction in blindness in people aged 40 years or over (-16%). Sensitivity analyses suggested that in order to achieve both reductions in blindness and cost-effectiveness in Japan, the screening program should screen those aged 53-84 years, at intervals of 3 years or less. An eye screening program in Japan would be cost-effective in detecting DR and preventing blindness from DR, even allowing for the uncertainties in estimates of costs, utility, and current management of DR.
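
    For reference, the incremental cost-effectiveness ratio is simply the ratio of incremental cost to incremental effectiveness; using the rounded per-person figures above reproduces the reported ICER to within about 1% (the paper's value comes from unrounded inputs):

        incremental_cost_jpy = 5147      # per person screened (rounded)
        incremental_qalys = 0.0054       # per person screened (rounded)
        icer = incremental_cost_jpy / incremental_qalys
        print(f"ICER = ¥{icer:,.0f} per QALY")   # about ¥953,000; the paper reports ¥944,981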

  11. Evaluation of thermal cameras in quality systems according to ISO 9000 or EN 45000 standards

    NASA Astrophysics Data System (ADS)

    Chrzanowski, Krzysztof

    2001-03-01

    According to the international standards ISO 9001-9004 and EN 45001-45003, industrial plants and accreditation laboratories that have implemented quality systems based on these standards are required to evaluate the uncertainty of their measurements. Manufacturers of thermal cameras do not offer any data that would enable estimation of the measurement uncertainty of these imagers. This difficulty in determining measurement uncertainty is an important limitation on the use of thermal cameras in industrial plants and in the cooperating accreditation laboratories that have implemented these quality systems. A set of parameters for characterizing commercial thermal cameras, a measuring setup, some test results for these cameras, a mathematical model of uncertainty, and software that enables quick calculation of the uncertainty of temperature measurements made with thermal cameras are presented in this paper.

  12. Which uncertainty? Using expert elicitation and expected value of information to design an adaptive program

    USGS Publications Warehouse

    Runge, Michael C.; Converse, Sarah J.; Lyons, James E.

    2011-01-01

    Natural resource management is plagued with uncertainty of many kinds, but not all uncertainties are equally important to resolve. The promise of adaptive management is that learning in the short-term will improve management in the long-term; that promise is best kept if the focus of learning is on those uncertainties that most impede achievement of management objectives. In this context, an existing tool of decision analysis, the expected value of perfect information (EVPI), is particularly valuable in identifying the most important uncertainties. Expert elicitation can be used to develop preliminary predictions of management response under a series of hypotheses, as well as prior weights for those hypotheses, and the EVPI can be used to determine how much management could improve if uncertainty was resolved. These methods were applied to management of whooping cranes (Grus americana), an endangered migratory bird that is being reintroduced in several places in North America. The Eastern Migratory Population of whooping cranes had exhibited almost no successful reproduction through 2009. Several dozen hypotheses can be advanced to explain this failure, and many of them lead to very different management responses. An expert panel articulated the hypotheses, provided prior weights for them, developed potential management strategies, and made predictions about the response of the population to each strategy under each hypothesis. Multi-criteria decision analysis identified a preferred strategy in the face of uncertainty, and analysis of the expected value of information identified how informative each strategy could be. These results provide the foundation for design of an adaptive management program.
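
    A minimal sketch of the expected-value-of-perfect-information calculation described above, with a hypothetical hypothesis-by-strategy payoff table and prior weights (not the elicited whooping-crane values):

        import numpy as np

        # Rows: hypotheses (with prior weights); columns: candidate strategies.
        # Entries: predicted management value under each pairing --
        # all numbers are hypothetical placeholders.
        priors = np.array([0.5, 0.3, 0.2])
        value = np.array([
            [0.2, 0.6, 0.4],
            [0.5, 0.1, 0.3],
            [0.3, 0.4, 0.7],
        ])

        ev_with_uncertainty = (priors @ value).max()           # best single strategy under uncertainty
        ev_perfect_info = (priors * value.max(axis=1)).sum()   # best strategy per hypothesis
        evpi = ev_perfect_info - ev_with_uncertainty
        print(f"EVPI = {evpi:.3f}")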

  13. U.S. Eastern Continental Shelf Carbon Cycling (USECoS): Modeling, Data Assimilation, and Analysis

    NASA Technical Reports Server (NTRS)

    Mannino, Antonio

    2008-01-01

    Although the oceans play a major role in the uptake of fossil fuel CO2 from the atmosphere, there is much debate about the contribution from continental shelves, since many key shelf fluxes are not yet well quantified: the exchange of carbon across the land-ocean and shelf-slope interfaces, air-sea exchange of CO2, burial, and biological processes including productivity. Our goal is to quantify these carbon fluxes along the eastern U.S. coast using models quantitatively verified by comparison to observations, and to establish a framework for predicting how these fluxes may be modified as a result of climate and land use change. Our research questions build on those addressed with previous NASA funding for the USECoS (U.S. Eastern Continental Shelf Carbon Cycling) project. We have developed a coupled biogeochemical ocean circulation model configured for this study region and have extensively evaluated this model with both in situ and remotely-sensed data. Results indicate that to further reduce uncertainties in the shelf component of the global carbon cycle, future efforts must be directed towards 1) increasing the resolution of the physical model via nesting and 2) making refinements to the biogeochemical model and quantitatively evaluating these via the assimilation of biogeochemical data (in situ and remotely-sensed). These model improvements are essential for better understanding and reducing estimates of uncertainties in current and future carbon transformations and cycling in continental shelf systems. Our approach and science questions are particularly germane to the carbon cycle science goals of the NASA Earth Science Research Program as well as the U.S. Climate Change Research Program and the North American Carbon Program. Our interdisciplinary research team consists of scientists who have expertise in the physics and biogeochemistry of the U.S. eastern continental shelf, remote-sensing data analysis and data assimilative numerical models.

  14. Uncertainty Estimation Cheat Sheet for Probabilistic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This paper will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  15. Lognormal Uncertainty Estimation for Failure Rates

    NASA Technical Reports Server (NTRS)

    Britton, Paul T.; Al Hassan, Mohammad; Ring, Robert W.

    2017-01-01

    "Uncertainty analysis itself is uncertain, therefore, you cannot evaluate it exactly," Source Uncertain. Quantitative results for aerospace engineering problems are influenced by many sources of uncertainty. Uncertainty analysis aims to make a technical contribution to decision-making through the quantification of uncertainties in the relevant variables as well as through the propagation of these uncertainties up to the result. Uncertainty can be thought of as a measure of the 'goodness' of a result and is typically represented as statistical dispersion. This presentation will explain common measures of centrality and dispersion; and-with examples-will provide guidelines for how they may be estimated to ensure effective technical contributions to decision-making.

  16. Study on the effectiveness and impact of pentavalent vaccination program in India and other south Asian countries.

    PubMed

    Sreedhar, Sreelakshmi; Antony, Anil; Poulose, Neethu

    2014-01-01

    The pentavalent vaccine is a combination vaccine, administered in a 3-dose schedule, that offers protection against diphtheria, tetanus, pertussis (DPT), hepatitis B, and Haemophilus influenzae type b (Hib). The vaccine is widely recommended by the WHO and GAVI as a substitute for prevailing vaccination practices against the above-mentioned diseases. The vaccine has met with both positive and negative responses, which has led to uncertainties about its safety. The pros and cons of the vaccine should be evaluated carefully before it is added to the routine immunization schedule.

  17. Multiscale Materials Science - A Mathematical Approach to the Role of Defects and Uncertainty

    DTIC Science & Technology

    2016-10-28

    AFRL-AFOSR-UK-TR-2016-0034: Multiscale materials science - a mathematical approach to the role of defects and uncertainty. Claude Le Bris. Contract/grant FA8655-13-1-3061. (Standard Form 298 report documentation only; no abstract is available for this record.)

  18. Toward a Climate OSSE for NASA Earth Sciences

    NASA Astrophysics Data System (ADS)

    Leroy, S. S.; Collins, W. D.; Feldman, D.; Field, R. D.; Ming, Y.; Pawson, S.; Sanderson, B.; Schmidt, G. A.

    2016-12-01

    In the Continuity Study, the National Academy of Sciences advised that future space missions be rated according to five categories: the importance of a well-defined scientific objective, the utility of the observation in addressing the scientific objective, the quality with which the observation can be made, the probability of the mission's success, and the mission's affordability. The importance, probability, and affordability are evaluated subjectively by scientific consensus, by engineering review panels, and by cost models; however, the utility and quality can be evaluated objectively by a climate observation system simulation experiment (COSSE). A discussion of the philosophical underpinnings of a COSSE for NASA Earth Sciences will be presented. A COSSE is built upon a perturbed-physics ensemble of a sophisticated climate model that can simulate a mission's prospective observations and its well-defined quantitative scientific objective, and that can capture the uncertainty associated with each. A strong correlation between observation and scientific objective after consideration of physical uncertainty leads to a high quality. Persistence of a high correlation after inclusion of the proposed measurement error leads to a high utility. There are five criteria that govern the nature of a particular COSSE: (1) whether the mission's scientific objective is one of hypothesis testing or climate prediction, (2) whether the mission is empirical or inferential, (3) whether the core climate model captures essential physical uncertainties, (4) the level of detail of the simulated observations, and (5) whether complementarity or redundancy of information is to be valued. Computation of the quality and utility is done using Bayesian statistics, as has been done previously for multi-decadal climate prediction conditioned on existing data. We advocate for a new program within NASA Earth Sciences to establish a COSSE capability. Creation of a COSSE program within NASA Earth Sciences will require answers from the climate research community to basic questions, such as whether a COSSE capability should be centralized or de-centralized. Most importantly, the quantified scientific objective of a proposed mission must be defined with extreme specificity for a COSSE to be applied.

  19. 78 FR 8128 - Request for Nominations of Experts to the EPA Office of Research and Development's Board of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... Program; Homeland Security Research Program; Human Health Risk Assessment Research Program; Safe and... --atmospheric physics Biology --biogeochemistry --cell biology --endocrinology (endocrine disruptors... analysis --uncertainty analysis Nanotechnology Public Health --children's health --community health...

  20. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  1. Probability and Confidence Trade-Space (PACT) Evaluation: Accounting for Uncertainty in Sparing Assessments

    NASA Technical Reports Server (NTRS)

    Anderson, Leif; Box, Neil; Carter-Journet, Katrina; DiFilippo, Denise; Harrington, Sean; Jackson, David; Lutomski, Michael

    2012-01-01

    Purpose of presentation: (1) to give a status update on the developing methodology to revise sub-system sparing targets; (2) to describe how to incorporate uncertainty into sparing assessments and why it is important to do so; and (3) to demonstrate hardware risk postures through PACT evaluation.

  2. Uncertainty in low-flow data from three streamflow-gaging stations on the upper Verde River, Arizona

    USGS Publications Warehouse

    Anning, D.W.; ,

    2004-01-01

    An evaluation of uncertainty in low-flow data collected from three streamflow-gaging stations on the upper Verde River, Arizona, is presented. In downstream order, the stations are Verde River near Paulden, Verde River near Clarkdale, and Verde River near Camp Verde. A monitoring objective of the evaluation was to characterize discharge of the lower flow regime through a variety of procedures, such as frequency analysis and base-flow analysis. For Verde River near Paulden and near Camp Verde, the uncertainty of daily low flows can be reduced by increasing the frequency of discharge measurements, or by building an artificial control that would maintain a stable stage-discharge relation over time.

  3. Blaming for a better future: future orientation and associated intolerance of personal uncertainty lead to harsher reactions toward innocent victims.

    PubMed

    Bal, Michèlle; van den Bos, Kees

    2012-07-01

    People are often encouraged to focus on the future and strive for long-term goals. This noted, the authors argue that this future orientation is associated with intolerance of personal uncertainty, as people usually cannot be certain that their efforts will pay off. To be able to tolerate personal uncertainty, people adhere strongly to the belief in a just world, paradoxically resulting in harsher reactions toward innocent victims. In three experiments, the authors show that a future orientation indeed leads to more negative evaluations of an innocent victim (Study 1), enhances intolerance of personal uncertainty (Study 2), and that experiencing personal uncertainty leads to more negative evaluations of a victim (Study 3). So, while a future orientation enables people to strive for long-term goals, it also leads them to be harsher toward innocent victims. One underlying mechanism causing these reactions is intolerance of personal uncertainty, associated with a future orientation.

  4. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method

    PubMed Central

    Chen, Jiunyuan; Chen, Chiachung

    2017-01-01

    The most common and cheapest indirect technique for measuring relative humidity is a psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity obtained by this indirect method was evaluated for several empirical equations used to calculate relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15–50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis; it can be computed with a hand calculator. The average predictive error of relative humidity was <0.1% with this new equation. The effect of the accuracy of the dry and wet bulb temperatures on the measurement uncertainty of relative humidity was evaluated, and numeric values of the measurement uncertainty were calculated for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty. PMID:28216599
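
    A minimal sketch of the indirect RH computation from dry and wet bulb temperatures, assuming the Magnus saturation-vapor-pressure formula and a generic psychrometer coefficient rather than the paper's fitted equation:

        import math

        def e_sat(t_c):
            # Saturation vapor pressure (kPa), Magnus formula
            return 0.6108 * math.exp(17.27 * t_c / (t_c + 237.3))

        def relative_humidity(t_dry, t_wet, pressure_kpa=101.325, gamma=0.000662):
            # Psychrometric equation: e = e_s(Tw) - gamma * P * (Td - Tw).
            # gamma is a generic psychrometer coefficient (1/degC), not the paper's fit.
            e = e_sat(t_wet) - gamma * pressure_kpa * (t_dry - t_wet)
            return 100.0 * e / e_sat(t_dry)

        print(f"RH = {relative_humidity(25.0, 20.0):.1f} %")   # about 63 %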

  5. Uncertainty Analysis in Humidity Measurements by the Psychrometer Method.

    PubMed

    Chen, Jiunyuan; Chen, Chiachung

    2017-02-14

    The most common and cheapest indirect technique for measuring relative humidity is a psychrometer, based on a dry and a wet temperature sensor. In this study, the measurement uncertainty of relative humidity obtained by this indirect method was evaluated for several empirical equations used to calculate relative humidity. Among the six equations tested, the Penman equation had the best predictive ability for the dry bulb temperature range of 15-50 °C. At a fixed dry bulb temperature, an increase in the wet bulb depression increased the error. A new equation for the psychrometer constant was established by regression analysis; it can be computed with a hand calculator. The average predictive error of relative humidity was <0.1% with this new equation. The effect of the accuracy of the dry and wet bulb temperatures on the measurement uncertainty of relative humidity was evaluated, and numeric values of the measurement uncertainty were calculated for various conditions. The uncertainty of the wet bulb temperature was the main factor in the RH measurement uncertainty.

  6. Evaluation of seepage and discharge uncertainty in the middle Snake River, southwestern Idaho

    USGS Publications Warehouse

    Wood, Molly S.; Williams, Marshall L.; Evetts, David M.; Vidmar, Peter J.

    2014-01-01

    The U.S. Geological Survey, in cooperation with the State of Idaho, Idaho Power Company, and the Idaho Department of Water Resources, evaluated seasonal seepage gains and losses in selected reaches of the middle Snake River, Idaho, during November 2012 and July 2013, and uncertainty in measured and computed discharge at four Idaho Power Company streamgages. Results from this investigation will be used by resource managers in developing a protocol to calculate and report Adjusted Average Daily Flow at the Idaho Power Company streamgage on the Snake River below Swan Falls Dam, near Murphy, Idaho, which is the measurement point for distributing water to owners of hydropower and minimum flow water rights in the middle Snake River. The evaluated reaches of the Snake River were from King Hill to Murphy, Idaho, for the seepage studies and downstream of Lower Salmon Falls Dam to Murphy, Idaho, for evaluations of discharge uncertainty. Computed seepage was greater than cumulative measurement uncertainty for subreaches along the middle Snake River during November 2012, the non-irrigation season, but not during July 2013, the irrigation season. During the November 2012 seepage study, the subreach between King Hill and C J Strike Dam had a meaningful (greater than cumulative measurement uncertainty) seepage gain of 415 cubic feet per second (ft3/s), and the subreach between Loveridge Bridge and C J Strike Dam had a meaningful seepage gain of 217 ft3/s. The meaningful seepage gain measured in the November 2012 seepage study was expected on the basis of several small seeps and springs present along the subreach, regional groundwater table contour maps, and results of regional groundwater flow model simulations. Computed seepage along the subreach from C J Strike Dam to Murphy was less than cumulative measurement uncertainty during November 2012 and July 2013; therefore, seepage cannot be quantified with certainty along this subreach. For the uncertainty evaluation, average uncertainty in discharge measurements at the four Idaho Power Company streamgages in the study reach ranged from 4.3 percent (Snake River below Lower Salmon Falls Dam) to 7.8 percent (Snake River below C J Strike Dam) for discharges less than 7,000 ft3/s in water years 2007–11. This range in uncertainty constituted most of the total quantifiable uncertainty in computed discharge, represented by prediction intervals calculated from the discharge rating of each streamgage. Uncertainty in computed discharge in the Snake River below Swan Falls Dam near Murphy was 10.1 and 6.0 percent at the Adjusted Average Daily Flow thresholds of 3,900 and 5,600 ft3/s, respectively. All discharge measurements and records computed at streamgages have some level of uncertainty that cannot be entirely eliminated. Knowledge of uncertainty at the Adjusted Average Daily Flow thresholds is useful for developing a measurement and reporting protocol for purposes of distributing water to hydropower and minimum flow water rights in the middle Snake River.

  7. The need for precise and well-documented experimental data on prompt fission neutron spectra from neutron-induced fission of 239Pu

    DOE PAGES

    Neudecker, Denise; Taddeucci, Terry Nicholas; Haight, Robert Cameron; ...

    2016-01-06

    The spectrum of neutrons emitted promptly after 239Pu(n,f)—a so-called prompt fission neutron spectrum (PFNS)—is a quantity of high interest, for instance, for reactor physics and global security. However, only a few experimental data sets suitable for evaluations are available. In addition, some of those data sets differ by more than their 1-σ uncertainty boundaries. We present the results of MCNP studies indicating that these differences are partly caused by underestimated multiple-scattering contributions, over-corrected background, and inconsistent deconvolution methods. A detailed uncertainty quantification for suitable experimental data was undertaken including these effects, and test-evaluations were performed with the improved uncertainty information. The test-evaluations illustrate that the inadequately estimated effects and detailed uncertainty quantification have an impact on the evaluated PFNS and associated uncertainties as well as the neutron multiplicity of selected critical assemblies. A summary of data and documentation needs to improve the quality of the experimental database is provided based on the results of simulations and test-evaluations. Furthermore, given the possibly substantial distortion of the PFNS by multiple scattering and background effects, special care should be taken to reduce these effects in future measurements, e.g., by measuring the 239Pu PFNS as a ratio to either the 235U or 252Cf PFNS.

  8. Highly efficient evaluation of a gas mixer using a hollow waveguide based laser spectral sensor

    NASA Astrophysics Data System (ADS)

    Du, Z.; Yang, X.; Li, J.; Yang, Y.; Qiao, C.

    2017-05-01

    This paper aims to provide a fast, sensitive, and accurate characterization of a Mass Flow Controller (MFC) based gas mixer. The gas mixer was evaluated with high efficiency by using a hollow-waveguide-based laser spectral sensor. Benefiting from the sensor's fast response, high sensitivity, and continuous operation, multiple key parameters of the mixer, including mixing uncertainty, linearity, and response time, were acquired in a single test. The test results show that the mixer can blend multi-component gases quite efficiently, with an uncertainty of 1.44% occurring at a flow rate of 500 ml/min, a linearity of 0.99843, and a response time of 92.6 s. The reliability of the results was confirmed by relative measurements of gas concentration, in which the sensor's own uncertainty was isolated. The measured uncertainty coincides well with the theoretical uncertainty of the mixer, which shows the method to be a reliable characterization. Consequently, the wide applicability of this kind of laser-based characterization to the evaluation of gas analyzers is demonstrated.

  9. Highly efficient evaluation of a gas mixer using a hollow waveguide based laser spectral sensor.

    PubMed

    Du, Z; Yang, X; Li, J; Yang, Y; Qiao, C

    2017-05-01

    This paper aims to provide a fast, sensitive, and accurate characterization of a Mass Flow Controller (MFC) based gas mixer. The gas mixer was evaluated with high efficiency by using a hollow-waveguide-based laser spectral sensor. Benefiting from the sensor's fast response, high sensitivity, and continuous operation, multiple key parameters of the mixer, including mixing uncertainty, linearity, and response time, were acquired in a single test. The test results show that the mixer can blend multi-component gases quite efficiently, with an uncertainty of 1.44% occurring at a flow rate of 500 ml/min, a linearity of 0.99843, and a response time of 92.6 s. The reliability of the results was confirmed by relative measurements of gas concentration, in which the sensor's own uncertainty was isolated. The measured uncertainty coincides well with the theoretical uncertainty of the mixer, which shows the method to be a reliable characterization. Consequently, the wide applicability of this kind of laser-based characterization to the evaluation of gas analyzers is demonstrated.

  10. A trial-based economic evaluation of 2 nurse-led disease management programs in heart failure.

    PubMed

    Postmus, Douwe; Pari, Anees A Abdul; Jaarsma, Tiny; Luttik, Marie Louise; van Veldhuisen, Dirk J; Hillege, Hans L; Buskens, Erik

    2011-12-01

    Although previously conducted meta-analyses suggest that nurse-led disease management programs in heart failure (HF) can improve patient outcomes, uncertainty regarding the cost-effectiveness of such programs remains. To compare the relative merits of 2 variants of a nurse-led disease management program (basic or intensive support by a nurse specialized in the management of patients with HF) against care as usual (routine follow-up by a cardiologist), a trial-based economic evaluation was conducted alongside the COACH study. In terms of costs per life-year, basic support was found to dominate care as usual, whereas the incremental cost-effectiveness ratio between intensive support and basic support was found to be equal to €532,762 per life-year; in terms of costs per quality-adjusted life-year (QALY), basic support was found to dominate both care as usual and intensive support. An assessment of the uncertainty surrounding these findings showed that, at a threshold value of €20,000 per life-year/€20,000 per QALY, basic support was found to have a probability of 69/62% of being optimal against 17/30% and 14/8% for care as usual and intensive support, respectively. The results of our subgroup analysis suggest that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF would be optimal if the willingness-to-pay threshold exceeds €45,345 per life-year/€59,289 per QALY. Although the differences in costs and effects among the 3 study groups were not statistically significant, from a decision-making perspective, basic support still had a relatively large probability of generating the highest health outcomes at the lowest costs. Our results also substantiated that a stratified approach based on offering basic support to patients with mild to moderate HF and intensive support to patients with severe HF could further improve health outcomes at slightly higher costs. Copyright © 2011 Mosby, Inc. All rights reserved.

  11. Impacts of Process and Prediction Uncertainties on Projected Hanford Waste Glass Amount

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gervasio, V.; Kim, D. S.; Vienna, J. D.

    Analyses were performed to evaluate the impacts of using the advanced glass models, constraints, and uncertainty descriptions on projected Hanford glass mass. The maximum allowable waste oxide loading (WOL) was estimated for waste compositions while simultaneously satisfying all applicable glass property and composition constraints with sufficient confidence. Different components of prediction and composition/process uncertainties were systematically included in the calculations to evaluate their impacts on glass mass. The analyses estimated the production of 23,360 MT of immobilized high-level waste (IHLW) glass when no uncertainties were taken into account. Accounting for prediction and composition/process uncertainties resulted in a 5.01 relative percent increase in estimated glass mass, to 24,531 MT. Roughly equal impacts were found for prediction uncertainties (2.58 RPD) and composition/process uncertainties (2.43 RPD). The immobilized low-activity waste (ILAW) mass was predicted to be 282,350 MT without uncertainty and with waste loading “line” rules in place. Accounting for prediction and composition/process uncertainties resulted in only a 0.08 relative percent increase in estimated glass mass, to 282,562 MT. Without application of line rules, the glass mass decreases by 10.6 relative percent (252,490 MT) for the case with no uncertainties. Adding prediction uncertainties increases glass mass by 1.32 relative percent, and adding composition/process uncertainties increases glass mass by an additional 7.73 relative percent (a 9.06 relative percent increase combined). The glass mass estimate without line rules (275,359 MT) was 2.55 relative percent lower than that with the line rules (282,562 MT), after accounting for all applicable uncertainties.

  12. Uncertainty of a hybrid surface temperature sensor for silicon wafers and comparison with an embedded thermocouple.

    PubMed

    Iuchi, Tohru; Gogami, Atsushi

    2009-12-01

    We have developed a user-friendly hybrid surface temperature sensor. The uncertainties of temperature readings associated with this sensor and a thermocouple embedded in a silicon wafer are compared. The expanded uncertainties (k=2) of the hybrid temperature sensor and the embedded thermocouple are 2.11 and 2.37 K, respectively, in the temperature range between 600 and 1000 K. In the present paper, the uncertainty evaluation and the sources of uncertainty are described.
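
    A minimal GUM-style sketch of how an expanded uncertainty with coverage factor k=2 is assembled from individual standard-uncertainty components; the component values below are hypothetical placeholders, not the paper's budget:

        import math

        # Root-sum-of-squares combination of standard uncertainties (kelvin);
        # the individual components here are illustrative placeholders.
        components = {
            "sensor contact / heat-sink effect": 0.80,
            "reference calibration": 0.45,
            "readout resolution": 0.30,
            "drift": 0.40,
        }
        u_c = math.sqrt(sum(u**2 for u in components.values()))
        U = 2 * u_c   # expanded uncertainty, coverage factor k = 2 (about 95 %)
        print(f"combined u_c = {u_c:.2f} K, expanded U (k=2) = {U:.2f} K")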

  13. Uncertainty estimation of simulated water levels for the Mitch flood event in Tegucigalpa

    NASA Astrophysics Data System (ADS)

    Fuentes Andino, Diana Carolina; Halldin, Sven; Beven, Keith; Xu, Chong-Yu

    2013-04-01

    Hurricane Mitch in 1998 caused a devastating flood in Tegucigalpa, the capital city of Honduras. Because of the extremely large magnitude of the Mitch flood, hydrometric measurements were not taken during the event. However, post-event indirect measurements of the discharge were obtained by the U.S. Geological Survey (USGS), and post-event surveyed high-water marks were obtained by the Japan International Cooperation Agency (JICA). This work proposes a methodology to simulate the water level during the Mitch event when the available data are associated with large uncertainty. The results of the two-dimensional hydrodynamic model LISFLOOD-FP will be evaluated using the Generalized Likelihood Uncertainty Estimation (GLUE) framework. The main challenge in the proposed methodology is to formulate an approach to evaluate the model results when there are large uncertainties coming from both the model parameters and the evaluation data.

  14. Essays on environmental, energy, and natural resource economics

    NASA Astrophysics Data System (ADS)

    Zhang, Fan

    My dissertation focuses on examining the interrelationship among the environment, energy, and economic development. In the first essay, I explore the effects of increased uncertainty over future output prices, input costs, and productivity levels on intertemporal emission permits trading. In a dynamic programming setting, a permit price is a convex function of each of these three sources of uncertainty. Increased uncertainty about future market conditions increases the expected permit price and causes risk-neutral firms to reduce ex ante emissions to smooth marginal abatement costs over time. Empirical analysis shows that increased price volatility induced by electricity market restructuring could explain 8-11% of the allowances banked during Phase I of the U.S. sulfur dioxide trading program. Numerical simulation suggests that high uncertainty may generate substantial initial compliance costs, thereby deterring new entrants and reducing efficiency; sharp emission spikes are also more likely to occur under industry-wide uncertainty shocks. In the second essay, I examine whether electricity restructuring improves the efficiency of U.S. nuclear power generation. Based on the full sample of 73 investor-owned nuclear plants in the United States from 1992 to 1998, I estimate cross-sectional and longitudinal efficiency changes associated with restructuring at the plant level. Various modeling strategies are presented to deal with the policy endogeneity bias that high-cost plants are more likely to be restructured. Overall, I find a strikingly positive relationship between the multiple steps of restructuring and plant operating efficiency. In the third essay, I estimate the economic impact of China's national land conversion program on local farm-dependent economies. The impact of the program on 14 industrial sectors in Gansu Province is investigated using an input-output model. Due to regulatory restrictions, the agricultural sector cannot automatically expand or shrink its land requirements in direct proportion to output changes. Therefore, I modify a standard input-output model to incorporate supply constraints on cropping activities. A spatially explicit analysis is also implemented in a geographic information system to capture the heterogeneous land productivity. The net cost of the conservation program is estimated to be a land rent of 487.21 per acre per year (1999).

  15. Transportation risk management : international practices for program development and project delivery.

    DOT National Transportation Integrated Search

    2012-08-01

    Managing transportation networks, including agency : management, program development, and project : delivery, is extremely complex and fraught with : uncertainty. Administrators, planners, and engineers : coordinate a multitude of organizational and ...

  16. Estimating uncertainty of Full Waveform Inversion with Ensemble-based methods

    NASA Astrophysics Data System (ADS)

    Thurin, J.; Brossier, R.; Métivier, L.

    2017-12-01

    Uncertainty estimation is one key feature of tomographic applications for robust interpretation. However, this information is often missing in the frame of large-scale linearized inversions, and only the results at convergence are shown, despite the ill-posed nature of the problem. This issue is common in the Full Waveform Inversion community. While a few methodologies have been proposed in the literature, standard FWI workflows do not yet include any systematic uncertainty quantification method; instead, result quality is often assessed through cross-comparison with other seismic results or with other geophysical data. With the development of large seismic networks/surveys, the increase in computational power, and the more and more systematic application of FWI, it is crucial to tackle this problem and to propose robust and affordable workflows, in order to address the uncertainty quantification problem faced for near-surface targets, crustal exploration, and regional and global scales. In this work (Thurin et al., 2017a,b), we propose an approach that takes advantage of the Ensemble Transform Kalman Filter (ETKF) proposed by Bishop et al. (2001) to estimate a low-rank approximation of the posterior covariance matrix of the FWI problem, allowing us to evaluate some uncertainty information about the solution. Instead of solving the FWI problem through a Bayesian inversion with the ETKF, we chose to combine a conventional FWI, based on local optimization, with the ETKF strategies. This scheme combines the efficiency of local optimization for solving large-scale inverse problems with sampling of the local solution space, made possible by its embarrassingly parallel nature. References: Bishop, C. H., Etherton, B. J., and Majumdar, S. J., 2001. Adaptive sampling with the ensemble transform Kalman filter. Part I: Theoretical aspects. Monthly Weather Review, 129(3), 420-436. Thurin, J., Brossier, R., and Métivier, L., 2017a. Ensemble-Based Uncertainty Estimation in Full Waveform Inversion. 79th EAGE Conference and Exhibition 2017 (12-15 June 2017). Thurin, J., Brossier, R., and Métivier, L., 2017b. An Ensemble-Transform Kalman Filter - Full Waveform Inversion scheme for Uncertainty estimation. SEG Technical Program Expanded Abstracts 2017.
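
    A toy numerical sketch of the low-rank posterior-covariance idea behind such ensemble schemes, assuming random stand-in data (the ETKF update itself and the FWI coupling are not reproduced):

        import numpy as np

        rng = np.random.default_rng(0)
        n_params, n_members = 10_000, 40   # model size >> ensemble size (toy numbers)

        ensemble = rng.normal(size=(n_params, n_members))   # stand-in for sampled FWI models
        mean = ensemble.mean(axis=1, keepdims=True)
        A = (ensemble - mean) / np.sqrt(n_members - 1)      # anomaly (perturbation) matrix

        # P ~ A @ A.T is never formed explicitly (it would be n_params x n_params);
        # instead, work with its rank-(n_members) factor A:
        variance = np.sum(A**2, axis=1)                     # diag(P): pointwise uncertainty

        def apply_P(v):
            # Matrix-free product P @ v through the low-rank factor
            return A @ (A.T @ v)

        print(variance[:3], apply_P(np.ones(n_params))[:3])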

  17. Uncertainty Analysis via Failure Domain Characterization: Polynomial Requirement Functions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Munoz, Cesar A.; Narkawicz, Anthony J.; Kenny, Sean P.; Giesy, Daniel P.

    2011-01-01

    This paper proposes an uncertainty analysis framework based on the characterization of the uncertain parameter space. This characterization enables the identification of worst-case uncertainty combinations and the approximation of the failure and safe domains with a high level of accuracy. Because these approximations are composed of subsets of readily computable probability, they enable the calculation of arbitrarily tight upper and lower bounds on the failure probability. A Bernstein expansion approach is used to size hyper-rectangular subsets, while a sum-of-squares programming approach is used to size quasi-ellipsoidal subsets. These methods are applicable to requirement functions whose functional dependency on the uncertainty is a known polynomial. Some of the most prominent features of the methodology are the substantial desensitization of the calculations from the assumed uncertainty model (i.e., the probability distribution describing the uncertainty) as well as the accommodation of changes in such a model with a practically insignificant amount of computational effort.

  18. Performance assessment of a Bayesian Forecasting System (BFS) for real-time flood forecasting

    NASA Astrophysics Data System (ADS)

    Biondi, D.; De Luca, D. L.

    2013-02-01

    The paper evaluates, for a number of flood events, the performance of a Bayesian Forecasting System (BFS), with the aim of evaluating total uncertainty in real-time flood forecasting. The predictive uncertainty of future streamflow is estimated through the Bayesian integration of two separate processors: the former evaluates the propagation of input uncertainty to simulated river discharge, while the latter computes the hydrological uncertainty of actual river discharge associated with all other possible sources of error. A stochastic model and a distributed rainfall-runoff model were assumed, respectively, for rainfall and hydrological response simulations. A case study was carried out for a small basin in the Calabria region (southern Italy). The performance assessment of the BFS was performed with verification tools suited for probabilistic forecasts of continuous variables such as streamflow. Graphical tools and scalar metrics were used to evaluate several attributes of the forecast quality of the entire time-varying predictive distributions: calibration, sharpness, accuracy, and the continuous ranked probability score (CRPS). Besides the overall system, which incorporates both sources of uncertainty, other hypotheses resulting from the BFS properties were examined, corresponding to (i) a perfect hydrological model; (ii) a non-informative rainfall forecast for predicting streamflow; and (iii) a perfect input forecast. The results emphasize the importance of using different diagnostic approaches to perform comprehensive analyses of predictive distributions, to arrive at a multifaceted view of the attributes of the prediction. For the case study, the selected criteria revealed the interaction of the different sources of error, in particular the crucial role of the hydrological uncertainty processor in compensating, at the cost of wider forecast intervals, for the unreliable and biased predictive distribution resulting from the precipitation uncertainty processor.
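
    Since CRPS is central to the verification described above, here is a minimal sketch of its standard kernel (ensemble) estimator, with toy forecast data; the variable names and numbers are illustrative:

        import numpy as np

        def crps_ensemble(members, obs):
            # CRPS of an ensemble forecast (kernel form):
            # E|X - y| - 0.5 * E|X - X'|, with X, X' drawn from the ensemble
            members = np.asarray(members, dtype=float)
            term1 = np.abs(members - obs).mean()
            term2 = np.abs(members[:, None] - members[None, :]).mean()
            return term1 - 0.5 * term2

        # Toy check: sharp, well-centred ensembles score lower (better)
        rng = np.random.default_rng(3)
        obs = 100.0                          # observed discharge, m3/s
        sharp = rng.normal(100, 5, 50)       # calibrated, sharp forecast
        diffuse = rng.normal(120, 30, 50)    # biased, diffuse forecast
        print(crps_ensemble(sharp, obs), crps_ensemble(diffuse, obs))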

  19. Parallel Computing and Model Evaluation for Environmental Systems: An Overview of the Supermuse and Frames Software Technologies

    EPA Science Inventory

    ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked steps in the development and e...

  20. Metrics for evaluating performance and uncertainty of Bayesian network models

    Treesearch

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  1. Quantifying measurement uncertainty and spatial variability in the context of model evaluation

    NASA Astrophysics Data System (ADS)

    Choukulkar, A.; Brewer, A.; Pichugina, Y. L.; Bonin, T.; Banta, R. M.; Sandberg, S.; Weickmann, A. M.; Djalalova, I.; McCaffrey, K.; Bianco, L.; Wilczak, J. M.; Newman, J. F.; Draxl, C.; Lundquist, J. K.; Wharton, S.; Olson, J.; Kenyon, J.; Marquis, M.

    2017-12-01

    In an effort to improve wind forecasts for the wind energy sector, the Department of Energy and NOAA funded the second Wind Forecast Improvement Project (WFIP2). As part of the WFIP2 field campaign, a large suite of in-situ and remote sensing instrumentation was deployed to the Columbia River Gorge in Oregon and Washington from October 2015 to March 2017. The array of instrumentation deployed included 915-MHz wind-profiling radars, sodars, wind-profiling lidars, and scanning lidars. The role of these instruments was to provide wind measurements at high spatial and temporal resolution for model evaluation and improvement of model physics. To properly determine model errors, the uncertainties in instrument-model comparisons need to be quantified accurately. These uncertainties arise from several factors, such as measurement uncertainty, spatial variability, and interpolation of model output to instrument locations, to name a few. In this presentation, we will introduce a formalism to quantify measurement uncertainty and spatial variability. The accuracy of this formalism will be tested using existing datasets such as the eXperimental Planetary boundary layer Instrumentation Assessment (XPIA) campaign. Finally, the uncertainties in wind measurement and the spatial variability estimates from the WFIP2 field campaign will be discussed to understand the challenges involved in model evaluation.

  2. Uncertainty evaluation with increasing borehole drilling in subsurface hydrogeological explorations

    NASA Astrophysics Data System (ADS)

    Amano, K.; Ohyama, T.; Kumamoto, S.; Shimo, M.

    2016-12-01

    Deciding how many boreholes to drill has been a difficult subject for field investigators in subsurface hydrogeological explorations. The problem becomes bigger in heterogeneous formations or rock masses, so quantitative criteria are needed for evaluating uncertainties during borehole investigations. To test how uncertainty is reduced as boreholes are added, we prepared a simple hydrogeological model and carried out virtual hydraulic tests using this model. The model consists of 125,000 elements, whose hydraulic conductivities are generated randomly from a log-normal distribution, in a 2-kilometer cube. Uncertainties were calculated from the difference in head distributions between the original model and the inchoate models built from the virtual hydraulic tests one by one. The results show that the level and variance of the uncertainty are strongly correlated with the average and variance of the hydraulic conductivities. Similar trends could also be seen in the actual field data obtained from the deep borehole investigations in Horonobe Town, northern Hokkaido, Japan. Here, a new approach using fractional bias (FB) and normalized mean square error (NMSE) for evaluating uncertainty characteristics will be introduced, and its possible use as an indicator for decision making (i.e., to stop or continue borehole drilling) in field investigations will be discussed.
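
    A minimal sketch of the two indicators named above, using their standard definitions and toy head values (the study's actual data are not reproduced):

        import numpy as np

        def fractional_bias(obs, pred):
            # FB = 2*(mean_obs - mean_pred) / (mean_obs + mean_pred); 0 is unbiased
            return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

        def nmse(obs, pred):
            # NMSE = mean((obs - pred)^2) / (mean_obs * mean_pred); 0 is perfect
            return np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())

        # Toy heads (m) from an "original" and an inchoate model
        obs = np.array([120.0, 118.5, 115.2, 112.8])
        pred = np.array([121.3, 117.9, 114.0, 113.5])
        print(f"FB = {fractional_bias(obs, pred):+.3f}, NMSE = {nmse(obs, pred):.4f}")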

  3. Uncertainty Quantification in High Throughput Screening: Applications to Models of Endocrine Disruption, Cytotoxicity, and Zebrafish Development (GRC Drug Safety)

    EPA Science Inventory

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...

  4. Uncertainty Exposed: A Field Lab Exercise Where GIS Meets the Real World

    ERIC Educational Resources Information Center

    Prisley, Stephen P.; Luebbering, Candice

    2011-01-01

    Students in natural resources programs commonly take courses in geospatial technologies. An awareness of the uncertainty of spatial data and algorithms can be an important outcome of such courses. This article describes a laboratory exercise in a graduate geographic information system (GIS) class that involves collection of data for the assessment…

  5. Forest Management Under Uncertainty for Multiple Bird Population Objectives

    Treesearch

    Clinton T. Moore; W. Todd Plummer; Michael J. Conroy

    2005-01-01

    We advocate adaptive programs of decision making and monitoring for the management of forest birds when responses by populations to management, and particularly management trade-offs among populations, are uncertain. Models are necessary components of adaptive management. Under this approach, uncertainty about the behavior of a managed system is explicitly captured in...

  6. Biological Impact of the Chippewa Off-Reservation Treaty Harvest, 1983-1989.

    ERIC Educational Resources Information Center

    Busiahn, Thomas R.

    1991-01-01

    In Wisconsin, Chippewa tribal harvests have not had a negative impact on populations of lake trout, walleye, fishers, and white-tailed deer. The treaty rights controversy is fueled by uncertainties about the status of natural resources, uncertainties that could be addressed by cooperative state-tribal wildlife management programs. (SV)

  7. Applications of explicitly-incorporated/post-processing measurement uncertainty in watershed modeling

    USDA-ARS?s Scientific Manuscript database

    The importance of measurement uncertainty in terms of calculation of model evaluation error statistics has been recently stated in the literature. The impact of measurement uncertainty on calibration results indicates the potential vague zone in the field of watershed modeling where the assumption ...

  8. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    PubMed

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory of recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage uncertainty. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test, and refine the theory of recognizing and responding to uncertainty and to develop strategies for managing uncertainty. This theory advances the nursing perspective on uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  9. Decadal Trend in Agricultural Abandonment and Woodland Expansion in an Agro-Pastoral Transition Band in Northern China.

    PubMed

    Wang, Chao; Gao, Qiong; Wang, Xian; Yu, Mei

    2015-01-01

    Land use land cover (LULC) changes frequently in ecotones due to the large climate and soil gradients and the complex landscape composition and configuration. Accurate mapping of LULC changes in ecotones is of great importance for assessment of ecosystem functions/services and for policy-decision support. Decadal or sub-decadal mapping of LULC provides scenarios for modeling biogeochemical processes and their feedbacks to climate, and for evaluating the effectiveness of land-use policies, e.g., forest conversion. However, it remains a great challenge to produce reliable LULC maps at moderate resolution and to evaluate their uncertainties over large areas with complex landscapes. In this study we developed a robust LULC classification system using multiple classifiers based on MODIS (Moderate Resolution Imaging Spectroradiometer) data and posterior data fusion. Not only does the system create LULC maps with high statistical accuracy, but it also provides pixel-level uncertainties that are essential for subsequent analyses and applications. We applied the classification system to the Agro-pastoral transition band in northern China (APTBNC) to detect decadal changes in LULC during 2003-2013 and evaluated the effectiveness of the implementation of major Key Forestry Programs (KFPs). In our study, the random forest (RF), support vector machine (SVM), and weighted k-nearest neighbors (WKNN) classifiers outperformed the artificial neural network (ANN) and naive Bayes (NB) classifiers in terms of high classification accuracy and low sensitivity to training sample size. Bayesian-average data fusion based on the results of RF, SVM, and WKNN achieved a Kappa statistic of 87.5%, higher than any individual classifier or the majority-vote integration. The pixel-level uncertainty map agreed with the traditional accuracy assessment but, in addition, conveys the spatial variation of uncertainty: it pinpoints that the southwestern area of APTBNC has higher uncertainty than other parts of the region, and that open shrubland is likely to be misclassified as bare ground in some locations. Forests, closed shrublands, and grasslands in APTBNC expanded by 23%, 50%, and 9%, respectively, during 2003-2013. The expansion of these land cover types was compensated by shrinkage in croplands (20%), bare ground (15%), and open shrublands (30%). The significant decline in agricultural lands is primarily attributed to the KFPs implemented at the end of the last century and to the nationwide urbanization of the recent decade. The increased coverage of grass and woody plants would largely reduce soil erosion, improve mitigation of climate change, and enhance carbon sequestration in this region.

  10. Evaluation of the AirNow Satellite Data Processor for 2010-2012

    NASA Astrophysics Data System (ADS)

    Pasch, A. N.; DeWinter, J. L.; Dye, T.; Haderman, M.; Zahn, P. H.; Szykman, J.; White, J. E.; Dickerson, P.; van Donkelaar, A.; Martin, R.

    2013-12-01

    The U.S. Environmental Protection Agency's (EPA) AirNow program provides the public with real-time and forecasted air quality conditions. Millions of people each day use information from AirNow to protect their health. The AirNow program (http://www.airnow.gov) reports ground-level ozone (O3) and fine particulate matter (PM2.5) with a standardized index called the Air Quality Index (AQI). AirNow aggregates information from over 130 state, local, and federal air quality agencies and provides tools for over 2,000 agency staff responsible for monitoring, forecasting, and communicating local air quality. Each hour, AirNow systems generate thousands of maps and products. The usefulness of the AirNow air quality maps depends on the accuracy and spatial coverage of air quality measurements. Currently, the maps use only ground-based measurements, which have significant gaps in coverage in some parts of the United States. As a result, contoured AQI levels have high uncertainty in regions far from monitors. To improve the usefulness of air quality maps, scientists at EPA, Dalhousie University, and Sonoma Technology, Inc., in collaboration with the National Aeronautics and Space Administration (NASA) and the National Oceanic and Atmospheric Administration (NOAA), have completed a project to incorporate satellite-estimated surface PM2.5 concentrations into the maps via the AirNow Satellite Data Processor (ASDP). These satellite estimates are derived using NASA/NOAA satellite aerosol optical depth (AOD) retrievals and GEOS-Chem modeled ratios of surface PM2.5 concentrations to AOD. GEOS-Chem is a three-dimensional chemical transport model for atmospheric composition driven by meteorological input from the Goddard Earth Observing System (GEOS). The ASDP can fuse multiple PM2.5 concentration data sets to generate AQI maps with improved spatial coverage. The goals of the ASDP are to provide more detailed AQI information in monitor-sparse locations and to augment monitor-dense locations with more information. The ASDP system uses a weighted-average approach based on uncertainty information about each data set. Recent improvements in the estimation of the uncertainty of interpolated ground-based monitor data have allowed for a more complete characterization of the uncertainty of the surface measurements. We will present a statistical analysis for 2010-2012 of the ASDP predictions of PM2.5 focusing on performance at validation sites. In addition, we will present several case studies evaluating the ASDP's performance for multiple regions and seasons, focusing specifically on days when large spatial gradients in AQI and wildfire smoke impacts were observed.
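
    The weighted-average fusion can be sketched in a few lines: if each estimate carries a 1-sigma uncertainty, inverse-variance weighting yields the minimum-variance combination. A minimal sketch with invented numbers; the ASDP's actual weighting scheme may differ in detail.

```python
# Sketch of uncertainty-weighted fusion of two PM2.5 estimates for one grid
# cell (inverse-variance weighting). Values are invented, not AirNow data.
import numpy as np

est = np.array([12.0, 18.0])   # ug/m3: interpolated monitors, satellite
sig = np.array([2.0, 6.0])     # 1-sigma uncertainty of each estimate

w = 1.0 / sig**2                          # inverse-variance weights
fused = np.sum(w * est) / np.sum(w)
fused_sig = np.sqrt(1.0 / np.sum(w))      # uncertainty of the fused value
print(f"fused PM2.5 = {fused:.1f} +/- {fused_sig:.1f} ug/m3")
```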

  11. A PC program for estimating measurement uncertainty for aeronautics test instrumentation

    NASA Technical Reports Server (NTRS)

    Blumenthal, Philip Z.

    1995-01-01

    A personal computer program was developed which provides aeronautics and operations engineers at Lewis Research Center with a uniform method to quickly compute values for the uncertainty in test measurements and research results. The software package used for performing the calculations is Mathcad 4.0, a Windows-based program that provides an interactive user interface for entering values directly into equations with immediate display of results. The error contribution from each component of the system is identified individually in terms of the parameter measured. The final result is given in common units, SI units, and percent of full scale range. The program also lists the specifications for all instrumentation and calibration equipment used for the analysis. It provides a presentation-quality printed output which can be used directly for reports and documents.
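
    Such tools typically combine independent component uncertainties by root-sum-square (RSS) and report the result both in engineering units and as percent of full-scale range. A minimal sketch of that arithmetic; the channel range and component values below are invented, and the original tool's exact error model is not reproduced.

```python
# Sketch of the usual root-sum-square (RSS) combination of independent
# 1-sigma error sources for one measurement channel, reported in engineering
# units and as percent of full scale (all component values are examples).
import math

full_scale = 50.0  # psia, hypothetical pressure channel range
components = {     # 1-sigma contributions in psia
    "transducer nonlinearity": 0.025,
    "signal conditioning":     0.010,
    "A/D quantization":        0.006,
    "calibration standard":    0.015,
}
u_rss = math.sqrt(sum(v**2 for v in components.values()))
print(f"combined uncertainty = {u_rss:.4f} psia "
      f"({100 * u_rss / full_scale:.3f} % of full scale)")
```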

  12. [The Effects of an Empowerment Education Program for Kidney Transplantation Patients].

    PubMed

    Kim, Sung Hee; You, Hye Sook

    2017-08-01

    This study was conducted to develop an Empowerment Education Program (EEP) for kidney transplant patients and to test the program's effects on uncertainty, self-care ability, and compliance. The research was conducted using a nonequivalent control group with a pretest-posttest design. The participants were 53 outpatients (experimental group: 25, control group: 28) who were receiving hospital treatment after kidney transplants. After the pre-test, patients in the experimental group underwent a weekly EEP for six weeks. The post-test was conducted immediately after, and four weeks after, the program's completion in the same manner as the pre-test. For the control group, we conducted a post-test six and ten weeks after the pre-test, without any program intervention. A repeated measures ANOVA was performed to compare the change scores on the main outcomes. Uncertainty was significantly lower in the experimental group than in the control group, both immediately after (t=-3.84, p<.001) and 4 weeks after (t=-4.51, p<.001) the program, whereas self-care ability (t=5.81 and t=5.84, both p<.001) and compliance (t=5.07 and t=5.45, both p<.001) were significantly higher at both time points. Kidney transplant patients who underwent an EEP showed a decrease in uncertainty and an improvement in self-care ability and compliance. Thus, our findings confirmed that an EEP can be an independent intervention method for improving and maintaining the health of kidney transplant patients. © 2017 Korean Society of Nursing Science

  13. Lessons from a Train-the-Trainer Professional Development Program: The Sustainable Trainer Engagement Program (STEP)

    NASA Astrophysics Data System (ADS)

    Shupla, Christine; Gladney, Alicia; Dalton, Heather; LaConte, Keliann; Truxillo, Jeannette; Shipp, Stephanie

    2015-11-01

    The Sustainable Trainer Engagement Program (STEP) is a modified train-the-trainer professional development program being conducted by the Lunar and Planetary Institute (LPI). STEP has provided two cohorts of 6-8th grade science specialists and lead teachers in the Houston region with in-depth Earth and Space Science (ESS) content, activities, and pedagogy over 15 days each, aligned with Texas science standards. This project has two over-arching goals: to improve middle school ESS instruction, and to create and test an innovative model for train-the-trainer. This poster will share details regarding STEP’s activities and resources, program achievements, and its main findings to date. STEP is being evaluated by external evaluators at the Research Institute of Texas, part of the Harris County Department of Education. External evaluation shows an increase after one year in STEP participants’ knowledge (cohort 1 showed a 10% increase; cohort 2 showed a 20% increase), confidence in teaching Earth and Space Science effectively (cohort 1 demonstrated a 10% increase; cohort 2 showed a 20% increase), and confidence in preparing other teachers (cohort 1 demonstrated a 12% increase; cohort 2 showed a 20% increase). By September 2015, STEP participants had led (or assisted in leading) approximately 40 workshops for about 1800 science teachers in Texas. Surveys of teachers attending professional development conducted by STEP participants show very positive responses, with conference workshop evaluations averaging 3.6 on a 4-point scale and other evaluations averaging 4.1 to 5.0 on a 5-point scale. Main lessons for the team on the train-the-trainer model include: a lack of confidence by leaders in K-12 science education in presenting ESS professional development, difficulties in arranging for school or district content-specific professional development, the minimal duration of most school and district professional development sessions, and uncertainties in partnerships between scientists and educators.

  14. Cognitive Structuring and Its Cognitive-Motivational Determinants as an Explanatory Framework of the Fear-Then-Relief Social Influence Strategy.

    PubMed

    Dolinski, Dariusz; Dolinska, Barbara; Bar-Tal, Yoram

    2017-01-01

    According to the fear-then-relief technique of social influence, people who experience anxiety whose source is abruptly withdrawn usually respond positively to various requests and commands addressed to them. This effect is usually explained by the fact that fear invokes a specific program of action, and that when the source of this emotion is suddenly and unexpectedly removed, the program is no longer operative, but the person has not yet invoked a new program. This specific state of disorientation makes compliance more likely. In this paper, an alternative explanation of the fear-then-relief effect is offered. It is assumed that the rapid change of emotions is associated with feelings of uncertainty and confusion. The positive response to the request is a form of coping with uncertainty. In line with this reasoning, while individuals with a high need for closure (NFC) should comply with a request after a fear-then-relief situation, low NFC individuals who are less threatened by uncertainty should not. This assumption was confirmed in the experiment.

  15. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.
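
    One way to picture the combined approach is to fit a regression to each check-standard run and then track the fitted coefficients on control charts. The sketch below does this for a hypothetical lift-curve slope with simulated data; the toy model and the classic 3-sigma limits are assumptions, not the facility's actual procedure.

```python
# Sketch: fit a regression to each simulated check-standard test, then track
# one coefficient across tests in a Shewhart-style individuals chart.
import numpy as np

rng = np.random.default_rng(1)
coeffs = []
for test in range(30):                       # 30 repeated check tests
    alpha = np.linspace(-4, 4, 20)           # angle-of-attack sweep, deg
    cl = 0.1 * alpha + 0.02 + rng.normal(0, 0.005, alpha.size)
    slope, intercept = np.polyfit(alpha, cl, 1)
    coeffs.append(slope)                     # track the lift-curve slope

c = np.asarray(coeffs)
center, sigma = c.mean(), c.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma
out = np.flatnonzero((c > ucl) | (c < lcl))
print(f"center={center:.4f}, UCL={ucl:.4f}, LCL={lcl:.4f}, "
      f"out-of-control tests: {out.tolist()}")
```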

  16. Waste management with recourse: an inexact dynamic programming model containing fuzzy boundary intervals in objectives and constraints.

    PubMed

    Tan, Q; Huang, G H; Cai, Y P

    2010-09-01

    The existing inexact optimization methods based on interval-parameter linear programming can hardly address problems where coefficients in objective functions are subject to dual uncertainties. In this study, a superiority-inferiority-based inexact fuzzy two-stage mixed-integer linear programming (SI-IFTMILP) model was developed for supporting municipal solid waste management under uncertainty. The developed SI-IFTMILP approach is capable of tackling dual uncertainties presented as fuzzy boundary intervals (FuBIs) in not only constraints, but also objective functions. Uncertainties expressed as a combination of intervals and random variables could also be explicitly reflected. An algorithm with high computational efficiency was provided to solve SI-IFTMILP. SI-IFTMILP was then applied to a long-term waste management case to demonstrate its applicability. Useful interval solutions were obtained. SI-IFTMILP could help generate dynamic facility-expansion and waste-allocation plans, as well as provide corrective actions when anticipated waste management plans are violated. It could also greatly reduce system-violation risk and enhance system robustness through examining two sets of penalties resulting from variations in fuzziness and randomness. Moreover, four possible alternative models were formulated to solve the same problem; solutions from them were then compared with those from SI-IFTMILP. The results indicate that SI-IFTMILP could provide more reliable solutions than the alternatives. 2010 Elsevier Ltd. All rights reserved.
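
    The interval part of such formulations is commonly handled by solving two deterministic submodels that bound the optimal objective value. The sketch below illustrates only that idea on a toy waste-allocation LP; the fuzzy-boundary and integer machinery of SI-IFTMILP is omitted, and all coefficients are invented.

```python
# Drastically simplified sketch of interval-parameter LP: a waste-allocation
# problem whose unit costs and demand are intervals, solved as two
# deterministic submodels to bound the optimal cost. Numbers are hypothetical.
from scipy.optimize import linprog

# minimize cost of x1 (landfill) + x2 (incinerator), tonnes/day
cost = {"lo": [30.0, 55.0], "hi": [40.0, 70.0]}   # interval unit costs, $/t
demand = {"lo": 500.0, "hi": 600.0}               # interval waste load, t/day
cap = [450.0, 300.0]                              # facility capacities, t/day

def solve(c, d):
    # demand constraint x1 + x2 >= d rewritten as -x1 - x2 <= -d
    res = linprog(c, A_ub=[[-1.0, -1.0]], b_ub=[-d],
                  bounds=[(0, cap[0]), (0, cap[1])])
    return res.fun

f_lo = solve(cost["lo"], demand["lo"])   # most favorable interval ends
f_hi = solve(cost["hi"], demand["hi"])   # least favorable interval ends
print(f"optimal system cost lies in [{f_lo:.0f}, {f_hi:.0f}] $/day")
```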

  17. Synchronic interval Gaussian mixed-integer programming for air quality management.

    PubMed

    Cheng, Guanhui; Huang, Guohe Gordon; Dong, Cong

    2015-12-15

    To reveal the synchronism of interval uncertainties, the tradeoff between system optimality and security, the discreteness of facility-expansion options, the uncertainty of pollutant dispersion processes, and the seasonality of wind features in air quality management (AQM) systems, a synchronic interval Gaussian mixed-integer programming (SIGMIP) approach is proposed in this study. A robust interval Gaussian dispersion model is developed for approaching the pollutant dispersion process under interval uncertainties and seasonal variations. The reflection of synchronic effects of interval uncertainties in the programming objective is enabled through introducing interval functions. The proposition of constraint violation degrees helps quantify the tradeoff between system optimality and constraint violation under interval uncertainties. The overall optimality of system profits of an SIGMIP model is achieved based on the definition of an integrally optimal solution. Integer variables in the SIGMIP model are resolved by the existing cutting-plane method. Combining these efforts leads to an effective algorithm for the SIGMIP model. An application to an AQM problem in a region in Shandong Province, China, reveals that the proposed SIGMIP model can facilitate identifying the desired scheme for AQM. The enhancement of the robustness of optimization exercises may be helpful for increasing the reliability of suggested schemes for AQM under these complexities. The interrelated tradeoffs among control measures, emission sources, flow processes, receptors, influencing factors, and economic and environmental goals are effectively balanced. Interests of many stakeholders are reasonably coordinated. The harmony between economic development and air quality control is enabled. Results also indicate that the constraint violation degree is effective at reflecting the compromise relationship between constraint-violation risks and system optimality under interval uncertainties. This can help decision makers mitigate potential risks, e.g. insufficiency of pollutant treatment capabilities, exceedance of air quality standards, deficiency of pollution control funds, or imbalance of economic or environmental stress, in the process of guiding AQM. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. [Uncertainty evaluation of the determination of toxic equivalent quantity of polychlorinated dibenzo-p-dioxins and dibenzofurans in soil by isotope dilution high resolution gas chromatography and high resolution mass spectrometry].

    PubMed

    Du, Bing; Liu, Aimin; Huang, Yeru

    2014-09-01

    Polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in soil samples were analyzed by an isotope dilution method with high resolution gas chromatography and high resolution mass spectrometry (ID-HRGC/HRMS), and the toxic equivalent quantity (TEQ) was calculated. The impacts of the major sources of measurement uncertainty are discussed, and the combined relative standard uncertainties were calculated for each 2,3,7,8-substituted congener. Furthermore, the concentration, combined uncertainty, and expanded uncertainty for the TEQ of PCDD/Fs in a soil sample under the I-TEF, WHO-1998-TEF, and WHO-2005-TEF schemes are provided as an example. I-TEF, WHO-1998-TEF, and WHO-2005-TEF are toxic equivalency factor (TEF) evaluation schemes, all currently used to describe the relative potencies of the 2,3,7,8-substituted congeners.
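
    The TEQ arithmetic is simple enough to sketch: TEQ = Σ c_i · TEF_i over the 2,3,7,8-substituted congeners and, if congener uncertainties are treated as independent, u(TEQ) = sqrt(Σ (TEF_i · u(c_i))²) with expanded uncertainty U = k·u(TEQ). A minimal sketch with invented concentrations for two congeners; independence is an assumption the paper's full uncertainty budget need not make.

```python
# Sketch of the TEQ calculation and a simple combined-uncertainty estimate
# (k = 2 coverage factor). Concentrations and uncertainties are invented;
# the TEFs shown are the WHO-2005 values for these two congeners only.
import math

congeners = [
    # (name, conc pg/g, u(conc) pg/g, WHO-2005 TEF)
    ("2,3,7,8-TCDD", 0.85, 0.10, 1.0),
    ("2,3,7,8-TCDF", 3.10, 0.40, 0.1),
]
teq   = sum(c * tef for _, c, _, tef in congeners)
u_teq = math.sqrt(sum((tef * u) ** 2 for _, _, u, tef in congeners))
print(f"TEQ = {teq:.2f} pg TEQ/g, U(k=2) = {2 * u_teq:.2f} pg TEQ/g")
```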

  19. Social, institutional, and psychological factors affecting wildfire incident decision making

    Treesearch

    Matthew P. Thompson

    2014-01-01

    Managing wildland fire incidents can be fraught with complexity and uncertainty. Myriad human factors can exert significant influence on incident decision making, and can contribute additional uncertainty regarding programmatic evaluations of wildfire management and attainment of policy goals. This article develops a framework within which human sources of uncertainty...

  20. TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.
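
    Coverage-probability evaluation is essentially a Monte Carlo exercise: simulate many treatment courses with random geometric errors and count the fraction in which the target dose criterion is still met. The sketch below does this for a 1-D flat dose profile with a CTV-to-PTV margin; the geometry, error magnitudes, and the 95%-of-prescription criterion are illustrative assumptions, not a clinical recipe.

```python
# Sketch of Monte Carlo coverage-probability (CP) evaluation in 1-D: a static
# dose profile with a margin is "delivered" under random per-fraction setup
# shifts; CP is the fraction of courses where the CTV keeps >= 95% of dose.
import numpy as np

rng = np.random.default_rng(7)
x = np.linspace(-60.0, 60.0, 1201)                        # position, mm
ctv = np.abs(x) <= 25.0                                   # 50 mm CTV
margin = 5.0                                              # CTV-to-PTV margin, mm
dose_1fx = (np.abs(x) <= 25.0 + margin).astype(float)     # flat PTV dose

def course_ok(n_fx=30, sigma=3.0):
    total = np.zeros_like(x)
    for _ in range(n_fx):                                 # setup error per fraction
        shift = rng.normal(0.0, sigma)
        total += np.interp(x, x + shift, dose_1fx)        # shifted dose
    return (total[ctv] / n_fx).min() >= 0.95              # min CTV dose criterion

cp = np.mean([course_ok() for _ in range(500)])
print(f"estimated coverage probability = {cp:.3f}")
```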

  1. An ensemble constrained variational analysis of atmospheric forcing data and its application to evaluate clouds in CAM5: Ensemble 3DCVA and Its Application

    DOE PAGES

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2016-01-05

    Large-scale atmospheric forcing data can greatly impact the simulations of atmospheric process models including Large Eddy Simulations (LES), Cloud Resolving Models (CRMs) and Single-Column Models (SCMs), and impact the development of physical parameterizations in global climate models. This study describes the development of an ensemble variationally constrained objective analysis of atmospheric large-scale forcing data and its application to evaluate the cloud biases in the Community Atmospheric Model (CAM5). Sensitivities of the variational objective analysis to background data, error covariance matrix and constraint variables are described and used to quantify the uncertainties in the large-scale forcing data. Application of the ensemble forcing in the CAM5 SCM during March 2000 intensive operational period (IOP) at the Southern Great Plains (SGP) of the Atmospheric Radiation Measurement (ARM) program shows systematic biases in the model simulations that cannot be explained by the uncertainty of large-scale forcing data, which points to the deficiencies of physical parameterizations. The SCM is shown to overestimate high clouds and underestimate low clouds. These biases are found to also exist in the global simulation of CAM5 when it is compared with satellite data.

  3. The evaluation of uncertainty in low-level LSC measurements of water samples.

    PubMed

    Rusconi, R; Forte, M; Caresana, M; Bellinzona, S; Cazzaniga, M T; Sgorbati, G

    2006-01-01

    The uncertainty in measurements of gross alpha and beta activities in water samples by liquid scintillation counting with alpha/beta discrimination has been evaluated considering the problems typical of low-level measurements of environmental samples. The use of a pulse shape analysis device to discriminate alpha and beta events introduces a correlation between some of the input quantities, and it has to be considered. Main contributors to total uncertainty have been assessed by specifically designed experimental tests. Results have been fully examined and discussed.
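
    When a pulse-shape discriminator couples the alpha and beta channels, the GUM combination must carry the covariance term: u_y² = gᵀVg, with g the sensitivity coefficients and V the input covariance matrix. A minimal numerical sketch; the sensitivities, uncertainties, and correlation value below are invented.

```python
# Sketch of GUM-style propagation with correlated inputs. A correlation
# between the alpha and beta count channels (as pulse-shape misclassification
# induces) enters through the off-diagonal covariance term.
import numpy as np

g = np.array([0.8, 0.3])          # sensitivities dy/dx_i (illustrative)
u = np.array([1.5, 2.0])          # standard uncertainties of the inputs
r = -0.4                          # alpha/beta correlation coefficient
V = np.array([[u[0]**2,         r * u[0] * u[1]],
              [r * u[0] * u[1], u[1]**2        ]])

u_y = float(np.sqrt(g @ V @ g))                     # with correlation
u_y_uncorr = float(np.sqrt(np.sum((g * u)**2)))     # ignoring correlation
print(f"u(y) with correlation: {u_y:.3f}; without: {u_y_uncorr:.3f}")
```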

  4. Probabilistic structural analysis to quantify uncertainties associated with turbopump blades

    NASA Technical Reports Server (NTRS)

    Nagpal, Vinod K.; Rubinstein, Robert; Chamis, Christos C.

    1987-01-01

    A probabilistic study of turbopump blades has been in progress at NASA Lewis Research Center over the last two years. The objectives of this study are to evaluate the effects of uncertainties in geometry and material properties on the structural response of the turbopump blades and to evaluate the tolerance limits on the design. A methodology based on a probabilistic approach has been developed to quantify the effects of the random uncertainties. The results of this study indicate that only the variations in geometry have significant effects.

  5. Evaluating Precipitation from Orbital Data Products of TRMM and GPM over the Indian Subcontinent

    NASA Astrophysics Data System (ADS)

    Jayaluxmi, I.; Kumar, D. N.

    2015-12-01

    The rapidly growing records of microwave-based precipitation data made available from various earth observation satellites have created a pressing need to evaluate the associated uncertainty, which arises from different sources such as retrieval error, spatial/temporal sampling error, and sensor-dependent error. Pertaining to microwave remote sensing, most studies in the literature focus on gridded data products; fewer studies exist on evaluating the uncertainty inherent in orbital data products. Evaluation of the latter is essential as they potentially cause large uncertainties during real-time flood forecasting studies, especially at the watershed scale. The present study evaluates the uncertainty of precipitation data derived from the orbital data products of the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the 2A12, 2A25, and 2B31 products. Case study results over the flood-prone basin of the Mahanadi, India, are analyzed for precipitation uncertainty through four facets, viz., a) uncertainty quantification using the volumetric metrics from the contingency table [AghaKouchak and Mehran 2014], b) error characterization using additive and multiplicative error models, c) error decomposition to identify systematic and random errors, and d) comparative assessment with the orbital data from the GPM mission. The homoscedastic random errors from the multiplicative error models justify a better representation of precipitation estimates by the 2A12 algorithm. It can be concluded that although the radiometer-derived 2A12 precipitation data are known to suffer from many sources of uncertainty, spatial analysis over the case study region of India testifies that they are in excellent agreement with the reference estimates for the data period considered [Indu and Kumar 2015]. References: A. AghaKouchak and A. Mehran (2014), Extended contingency table: Performance metrics for satellite observations and climate model simulations, Water Resources Research, vol. 49, 7144-7149; J. Indu and D. Nagesh Kumar (2015), Evaluation of Precipitation Retrievals from Orbital Data Products of TRMM over a Subtropical basin in India, IEEE Transactions on Geoscience and Remote Sensing, in press, doi: 10.1109/TGRS.2015.2440338.
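
    The additive/multiplicative error characterization and the systematic/random split can be sketched with synthetic retrievals: fit log(sat) = a + b·log(ref) + ε for the multiplicative model, and take the mean error as the systematic component of the additive view. All data below are simulated stand-ins, not TRMM retrievals.

```python
# Sketch of additive vs multiplicative error views plus a simple
# systematic/random decomposition, on synthetic "satellite" rain rates.
import numpy as np

rng = np.random.default_rng(3)
ref = rng.gamma(shape=2.0, scale=3.0, size=400)          # reference, mm/h
sat = 0.9 * ref**1.05 * rng.lognormal(0.0, 0.25, 400)    # multiplicative error

# Multiplicative model: log(sat) = a + b*log(ref) + eps
b, a = np.polyfit(np.log(ref), np.log(sat), 1)           # [slope, intercept]
resid = np.log(sat) - (a + b * np.log(ref))

err = sat - ref                                          # additive view
systematic = err.mean()                                  # bias component
random_sd = np.std(err - systematic, ddof=1)             # random component
print(f"mult. model: a={a:.2f}, b={b:.2f}, sd(eps)={resid.std(ddof=1):.2f}")
print(f"additive view: bias={systematic:.2f} mm/h, random sd={random_sd:.2f}")
```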

  6. Multi-criteria evaluation of wastewater treatment plant control strategies under uncertainty.

    PubMed

    Flores-Alsina, Xavier; Rodríguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2008-11-01

    The evaluation of activated sludge control strategies in wastewater treatment plants (WWTP) via mathematical modelling is a complex activity because several objectives (e.g. economic, environmental, technical, and legal) must be taken into account at the same time, i.e. the evaluation of the alternatives is a multi-criteria problem. Activated sludge models are not well characterized and some of the parameters can present uncertainty, e.g. the influent fractions arriving at the facility and the effect of either temperature or toxic compounds on the kinetic parameters, having a strong influence on the model predictions used during the evaluation of the alternatives and affecting the resulting rank of preferences. Using a simplified version of the IWA Benchmark Simulation Model No. 2 as a case study, this article shows the variations in the decision making when the uncertainty in activated sludge model (ASM) parameters is either included or not during the evaluation of WWTP control strategies. This paper comprises two main sections. In the first, six WWTP control strategies are evaluated using multi-criteria decision analysis with the ASM parameters set at their default values. In the following section, the uncertainty is introduced, i.e. input uncertainty, which is characterized by probability distribution functions based on the available process knowledge. Next, Monte Carlo simulations are run to propagate input uncertainty through the model into the different outcomes. Thus (i) the variation in the overall degree of satisfaction of the control objectives for the generated WWTP control strategies is quantified, (ii) the contributions of environmental, legal, technical, and economic objectives to the existing variance are identified, and finally (iii) the influence of the relative importance of the control objectives during the selection of alternatives is analyzed. The results show that the control strategies with an external carbon source reduce the output uncertainty in the criteria used to quantify the degree of satisfaction of environmental, technical, and legal objectives, but increase the economic costs and their variability as a trade-off. It is also shown how a preliminarily selected alternative with a cascade ammonium controller becomes less desirable when input uncertainty is included, leaving simpler alternatives with a greater chance of success.
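
    The propagation step is plain Monte Carlo: draw each uncertain parameter from its assumed distribution, run the plant model, and attribute output variance to the inputs (here via squared Spearman rank correlations). The model below is a toy linear stand-in, not Benchmark Simulation Model No. 2; all distributions and coefficients are invented.

```python
# Sketch of Monte Carlo input-uncertainty propagation with a crude
# variance-contribution ranking via squared rank correlations.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(5)
n = 2000
inputs = {
    "influent COD fraction": rng.uniform(0.20, 0.35, n),
    "max growth rate":       rng.normal(4.0, 0.5, n),
    "temperature factor":    rng.triangular(0.8, 1.0, 1.1, n),
}
X = np.column_stack(list(inputs.values()))

def plant_effluent_n(x):                     # toy response, mg N/L
    return 10.0 + 20.0 * x[:, 0] - 1.5 * x[:, 1] + 4.0 * x[:, 2]

y = plant_effluent_n(X) + rng.normal(0, 0.3, n)   # model + noise
for name, col in zip(inputs, X.T):
    rho, _ = spearmanr(col, y)
    print(f"{name:24s} rho^2 = {rho**2:.2f}")
```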

  7. An optimization based sampling approach for multiple metrics uncertainty analysis using generalized likelihood uncertainty estimation

    NASA Astrophysics Data System (ADS)

    Zhou, Rurui; Li, Yu; Lu, Di; Liu, Haixing; Zhou, Huicheng

    2016-09-01

    This paper investigates the use of an epsilon-dominance non-dominated sorted genetic algorithm II (ɛ-NSGAII) as a sampling approach with the aim of improving sampling efficiency for multiple metrics uncertainty analysis using Generalized Likelihood Uncertainty Estimation (GLUE). The effectiveness of ɛ-NSGAII based sampling is demonstrated in comparison with Latin hypercube sampling (LHS) through analyzing sampling efficiency, multiple metrics performance, parameter uncertainty, and flood forecasting uncertainty, with a case study of flood forecasting uncertainty evaluation based on the Xinanjiang model (XAJ) for Qing River reservoir, China. The results demonstrate the following advantages of the ɛ-NSGAII based sampling approach in comparison to LHS: (1) The former performs more effectively and efficiently than LHS; for example, the simulation time required to generate 1000 behavioral parameter sets is about nine times shorter. (2) The Pareto tradeoffs between metrics are demonstrated clearly by the solutions from ɛ-NSGAII based sampling, and their Pareto optimal values are better than those of LHS, which means better forecasting accuracy of the ɛ-NSGAII parameter sets. (3) The parameter posterior distributions from ɛ-NSGAII based sampling are concentrated in the appropriate ranges rather than uniform, which accords with their physical significance, and parameter uncertainties are reduced significantly. (4) The forecasted floods are close to the observations as evaluated by three measures: the normalized total flow outside the uncertainty intervals (FOUI), average relative band-width (RB), and average deviation amplitude (D). The flood forecasting uncertainty is also reduced considerably with ɛ-NSGAII based sampling. This study provides a new sampling approach to improve multiple metrics uncertainty analysis under the framework of GLUE, and could be used to reveal the underlying mechanisms of parameter sets under multiple conflicting metrics in the uncertainty analysis process.
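
    Whatever the sampler, the GLUE machinery downstream is the same: score each parameter set with a likelihood measure, keep the sets above a behavioral threshold, and form likelihood-weighted prediction bounds. A minimal sketch with a toy two-parameter model standing in for XAJ; the Nash-Sutcliffe likelihood and the 0.7 threshold are common choices, not necessarily the paper's exact settings.

```python
# Sketch of the GLUE scoring/weighting step shared by both samplers; the
# two-parameter "model" is a toy stand-in and all numbers are invented.
import numpy as np

rng = np.random.default_rng(11)
obs = np.array([5.0, 9.0, 14.0, 11.0, 7.0])          # observed flows
rain = np.array([2.0, 5.0, 8.0, 4.0, 1.0])           # forcing

def model(k):                                        # toy rainfall-runoff model
    return k[0] * rain + k[1]

samples = rng.uniform([0.5, 0.0], [2.5, 6.0], size=(5000, 2))
sims = np.array([model(k) for k in samples])
nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()

keep = nse > 0.7                                     # behavioral threshold

def wquantile(v, w, q):                              # likelihood-weighted quantile
    i = np.argsort(v)
    return np.interp(q, np.cumsum(w[i]) / w.sum(), v[i])

lo = wquantile(sims[keep, 2], nse[keep], 0.05)       # 90% band at t = 2
hi = wquantile(sims[keep, 2], nse[keep], 0.95)
print(f"{keep.sum()} behavioral sets; band [{lo:.1f}, {hi:.1f}] vs obs {obs[2]}")
```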

  8. Decision Support Model for Municipal Solid Waste Management at Department of Defense Installations.

    DTIC Science & Technology

    1995-12-01

    Huang uses "Grey Dynamic Programming for Waste Management Planning Under Uncertainty." Fuzzy Dynamic Programming (FDP) is usually designed to...and Composting Programs. Washington: Island Press, 1991. Junio, D.F. Development of an Analytical Hierarchy Process (AHP) Model for Siting of

  9. A cost-effectiveness analysis evaluating endoscopic surveillance for gastric cancer for populations with low to intermediate risk.

    PubMed

    Zhou, Hui Jun; Dan, Yock Young; Naidoo, Nasheen; Li, Shu Chuen; Yeoh, Khay Guan

    2013-01-01

    Gastric cancer (GC) surveillance based on oesophagogastroduodenoscopy (OGD) appears to be a promising strategy for GC prevention. By evaluating the cost-effectiveness of endoscopic surveillance in Singaporean Chinese, this study aimed to inform the implementation of such a program in a population with a low to intermediate GC risk. Using a reference strategy of no OGD intervention, we evaluated four strategies: 2-yearly OGD surveillance, annual OGD surveillance, 2-yearly OGD screening, and 2-yearly screening plus annual surveillance in Singaporean Chinese aged 50-69 years. From the perspective of the healthcare system, Markov models were built to simulate the life experience of the target population. The models projected discounted lifetime costs ($), quality-adjusted life years (QALYs), and incremental cost-effectiveness ratios (ICERs) indicating the cost-effectiveness of each strategy against a Singapore willingness-to-pay of $46,200/QALY. Deterministic and probabilistic sensitivity analyses were used to identify the influential variables and their associated thresholds, and to quantify the influence of parameter uncertainties, respectively. With an ICER of $44,098/QALY, the annual OGD surveillance was the optimal strategy, while the 2-yearly surveillance was the most cost-effective strategy (ICER = $25,949/QALY). The screening-based strategies were either extendedly dominated or cost-ineffective. Cost-effectiveness heterogeneity of the four strategies was observed across age-gender subgroups. Eight influential parameters were identified, each with specific thresholds defining the choice of optimal strategy. Accounting for the model uncertainties, the probability that the annual surveillance is the optimal strategy in Singapore was 44.5%. Endoscopic surveillance is potentially cost-effective in the prevention of GC for populations at low to intermediate risk. Regarding program implementation, a detailed analysis of influential factors and their associated thresholds is necessary. Multiple strategies should be considered in order to recommend the right strategy for the right population.
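
    The decision rule behind such rankings is the incremental cost-effectiveness ratio: order strategies by cost, discard dominated options, and accept the last strategy whose ICER against the previous non-dominated option stays below willingness-to-pay. A minimal sketch with invented costs and QALYs; extended dominance is not handled here.

```python
# Sketch of ICER-based ranking against a willingness-to-pay threshold.
strategies = [           # (name, discounted lifetime cost $, QALYs) - invented
    ("no intervention",        4000, 14.20),
    ("2-yearly surveillance",  9000, 14.40),
    ("annual surveillance",   15000, 14.53),
]
wtp = 46200  # $/QALY, Singapore threshold cited in the abstract

strategies.sort(key=lambda s: s[1])            # order by cost
frontier = [strategies[0]]
for name, cost, qaly in strategies[1:]:
    if qaly <= frontier[-1][2]:
        continue                               # strictly dominated: skip
    icer = (cost - frontier[-1][1]) / (qaly - frontier[-1][2])
    verdict = "cost-effective" if icer <= wtp else "not cost-effective"
    print(f"{name}: ICER = {icer:,.0f} $/QALY -> {verdict}")
    frontier.append((name, cost, qaly))
```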

  10. Needs of Accurate Prompt and Delayed γ-spectrum and Multiplicity for Nuclear Reactor Designs

    NASA Astrophysics Data System (ADS)

    Rimpault, G.; Bernard, D.; Blanchet, D.; Vaglio-Gaudard, C.; Ravaux, S.; Santamarina, A.

    The local photon energy deposit must be accounted for accurately in Gen-IV fast reactors, advanced light-water nuclear reactors (Gen-III+), and the new experimental Jules Horowitz Reactor (JHR). The γ energy accounts for about 10% of the total energy released in the core of a thermal or fast reactor. The γ-energy release is much greater in the core of the reactor than in its structural sub-assemblies (such as the reflector, control rod followers, and dummy sub-assemblies). However, because of the propagation of γ rays from the core regions to the neighboring fuel-free assemblies, the contribution of γ energy to the total heating there can be dominant. For reasons related to their performance, power reactors require a 7.5% (1σ) uncertainty for the energy deposition in non-fuelled zones. For the JHR material-testing reactor, a 5% (1σ) uncertainty is required in experimental positions. In order to verify the adequacy of the calculation of γ-heating, TLDs and γ-fission chambers were used to derive the experimental heating values. Experimental programs were and are still conducted in different Cadarache facilities such as MASURCA (for SFR) and MINERVE and EOLE (for JHR and Gen-III+ reactors). The comparison of calculated and measured γ-heating values shows an underestimation in all experimental programs, indicating that the γ-production data for 239Pu in current nuclear-data libraries are highly suspicious. The first evaluation priority is the prompt γ-multiplicity for U and Pu fission, but similar values for other actinides are also required. The nuclear data library JEFF3.1.1 contains most of the photon production data. However, there are some nuclei for which data are missing or erroneous and need to be completed or modified. A review of the available data shows a lack of measurements for conducting serious evaluation efforts. New measurements are needed to guide new evaluation efforts, which benefit from consolidated modeling techniques.

  11. Evaluating observations in the context of predictions for the death valley regional groundwater system

    USGS Publications Warehouse

    Ely, D.M.; Hill, M.C.; Tiedeman, C.R.; O'Brien, G. M.

    2004-01-01

    When a model is calibrated by nonlinear regression, calculated diagnostic and inferential statistics provide a wealth of information about many aspects of the system. This work uses linear inferential statistics that are measures of prediction uncertainty to investigate the likely importance of continued monitoring of hydraulic head to the accuracy of model predictions. The measurements evaluated are hydraulic heads; the predictions of interest are subsurface transport from 15 locations. The advective component of transport is considered because it is the component most affected by the system dynamics represented by the regional-scale model being used. The problem is addressed using the capabilities of the U.S. Geological Survey computer program MODFLOW-2000, with its Advective Travel Observation (ADV) Package. Copyright ASCE 2004.

  12. Sensitivity analysis of limit state functions for probability-based plastic design

    NASA Technical Reports Server (NTRS)

    Frangopol, D. M.

    1984-01-01

    The evaluation of the total probability of plastic collapse failure P_f for a highly redundant structure of random interdependent plastic moments acted on by random interdependent loads is a difficult and computationally very costly process. The evaluation of reasonable bounds to this probability requires the use of second moment algebra, which involves many statistical parameters. A computer program which selects the best strategy for minimizing the interval between upper and lower bounds of P_f is now in its final stage of development. The sensitivity of the resulting bounds of P_f to the various uncertainties involved in the computational process is analyzed. Response sensitivities for both mode and system reliability of an ideal plastic portal frame are shown.

  13. [Safety of food additives from a German and European point of view].

    PubMed

    Gürtler, R

    2010-06-01

    There are about 300 food additives permitted in the EU, for which a re-evaluation program was recently initiated. Occasionally, it is speculated that the use of single food additives might be of safety concern. First results of the re-evaluation could give an impression of how such concerns are taken into account by the responsible authorities, such as the European Food Safety Authority (EFSA). For some of the food additives, the lowest dose resulting in adverse effects was lower in recent studies than in previous studies. Thus, the acceptable daily intake (ADI) derived by applying the common uncertainty factor was lower than the ADI derived using data from previous studies. Therefore, it has to be considered whether the conditions of use need to be modified for these food additives.

  14. Not Normal: the uncertainties of scientific measurements

    NASA Astrophysics Data System (ADS)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.

  16. Uncertainty Analysis of Thermal Comfort Parameters

    NASA Astrophysics Data System (ADS)

    Ribeiro, A. Silva; Alves e Sousa, J.; Cox, Maurice G.; Forbes, Alistair B.; Matias, L. Cordeiro; Martins, L. Lages

    2015-08-01

    International Standard ISO 7730:2005 defines thermal comfort as that condition of mind that expresses the degree of satisfaction with the thermal environment. Although this definition is inevitably subjective, the Standard gives formulae for two thermal comfort indices, predicted mean vote (PMV) and predicted percentage dissatisfied (PPD). The PMV formula is based on principles of heat balance and experimental data collected in a controlled climate chamber under steady-state conditions. The PPD formula depends only on PMV. Although these formulae are widely recognized and adopted, little has been done to establish the measurement uncertainties associated with their use, bearing in mind that the formulae depend on measured values and on tabulated values given to limited numerical accuracy. Knowledge of these uncertainties is invaluable when values provided by the formulae are used in making decisions in various health and civil engineering situations. This paper examines these formulae, giving a general mechanism for evaluating the uncertainties associated with values of the quantities on which the formulae depend. Further, consideration is given to the propagation of these uncertainties through the formulae to provide uncertainties associated with the values obtained for the indices. Current international guidance on uncertainty evaluation is utilized.
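
    Since PPD depends only on PMV through the ISO 7730 relation PPD = 100 − 95·exp(−0.03353·PMV⁴ − 0.2179·PMV²), an uncertainty in PMV can be pushed through by Monte Carlo. A minimal sketch; the PMV value and its standard uncertainty are illustrative, and the full GUM treatment of the PMV inputs is not reproduced.

```python
# Sketch of propagating u(PMV) into PPD via Monte Carlo, using the ISO 7730
# relation PPD = 100 - 95*exp(-0.03353*PMV^4 - 0.2179*PMV^2).
import numpy as np

def ppd(pmv):
    return 100.0 - 95.0 * np.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

rng = np.random.default_rng(0)
pmv_samples = rng.normal(0.5, 0.15, 100_000)   # PMV = 0.5 +/- 0.15 (1 sigma)
ppd_samples = ppd(pmv_samples)
print(f"PPD = {ppd_samples.mean():.1f} % with u = {ppd_samples.std(ddof=1):.1f} %"
      f" (nominal {ppd(0.5):.1f} %)")
```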

  17. A methodology for calibration of hyperspectral and multispectral satellite data in coastal areas

    NASA Astrophysics Data System (ADS)

    Pennucci, Giuliana; Fargion, Giulietta; Alvarez, Alberto; Trees, Charles; Arnone, Robert

    2012-06-01

    The objective of this work is to determine the location(s) in any given oceanic area, during different temporal periods, where in situ sampling for Calibration/Validation (Cal/Val) provides the best capability to retrieve accurate radiometric and derived product data (lowest uncertainties). We present a method to merge satellite imagery with in situ measurements, to determine the in situ sampling strategy best suited for satellite Cal/Val, and to evaluate the present in situ locations through uncertainty indices. This analysis is required to determine whether the present in situ sites are adequate for assessing uncertainty and where additional sites and ship programs should be located to improve Cal/Val procedures. Our methodology uses satellite acquisitions to build a covariance matrix encoding the spatial-temporal variability of the area of interest. The covariance matrix is used in a Bayesian framework to merge satellite and in situ data, providing a product with lower uncertainty. The best in situ location for Cal/Val is then identified by using a design principle (A-optimum design) that seeks to minimize the estimated variance of the merged products. Satellite products investigated in this study include Ocean Color water-leaving radiance, chlorophyll, and inherent and apparent optical properties (retrieved from MODIS and VIIRS). In situ measurements are obtained from systems operated on fixed deployment platforms (e.g., sites of the Ocean Color component of the AErosol RObotic NETwork, AERONET-OC), moorings (e.g., the Marine Optical Buoy, MOBY), ships, or autonomous vehicles (such as Autonomous Underwater Vehicles and/or gliders).
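
    The A-optimum design principle can be sketched directly: with a prior covariance B estimated from imagery and a candidate observation operator H per site, the merged (posterior) covariance is P = B − BHᵀ(HBHᵀ + R)⁻¹HB, and the best site minimizes trace(P). The grid size, covariance, and noise variance below are synthetic assumptions.

```python
# Sketch of A-optimal in situ site selection: pick the cell whose
# assimilation most reduces the trace of the posterior covariance.
import numpy as np

rng = np.random.default_rng(2)
n = 25                                   # grid cells in the coastal box
A = rng.normal(size=(n, n))
B = A @ A.T / n + 0.5 * np.eye(n)        # synthetic spatial prior covariance
r = 0.1                                  # in situ observation error variance

def posterior_trace(site):
    H = np.zeros((1, n)); H[0, site] = 1.0          # observe one cell
    S = H @ B @ H.T + r                             # innovation covariance
    P = B - B @ H.T @ np.linalg.inv(S) @ H @ B      # Bayesian update
    return np.trace(P)

best = min(range(n), key=posterior_trace)
print(f"A-optimal in situ site: cell {best}, "
      f"posterior trace {posterior_trace(best):.2f} vs prior {np.trace(B):.2f}")
```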

  18. Integrated planning for regional development planning and water resources management under uncertainty: A case study of Xining, China

    NASA Astrophysics Data System (ADS)

    Fu, Z. H.; Zhao, H. J.; Wang, H.; Lu, W. T.; Wang, J.; Guo, H. C.

    2017-11-01

    Economic restructuring, water resources management, population planning, and environmental protection are subject to the inner uncertainties of a compound system with objectives which are competitive alternatives. Optimization models and water quality models are usually used to solve problems in a certain aspect. To overcome the uncertainty and coupling in regional planning management, an interval fuzzy program combined with a water quality model for regional planning and management has been developed in this study to obtain the "optimal" solution. The model is a hybrid methodology of interval parameter programming (IPP), fuzzy programming (FP), and a general one-dimensional water quality model. The method extends the traditional interval parameter fuzzy programming method by integrating a water quality model into the optimization framework. Meanwhile, water resources carrying capacity, an abstract concept, has been transformed into a specific and calculable index. In addition, unlike many past studies of water resource management, population has been considered as a significant factor. The results suggested that the methodology was applicable for reflecting the complexities of regional planning and management systems within the planning period. Government policy makers could establish effective industrial structures, water resources utilization patterns, and population plans, and better understand the tradeoffs among economic, water resources, population, and environmental objectives.

  19. Space Radiation Cancer Risks

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2007-01-01

    Space radiation presents major challenges to astronauts on the International Space Station and for future missions to the Earth's moon or Mars. Methods used to project risks on Earth need to be modified because of the large uncertainties in projecting cancer risks from space radiation, which impact safety factors. We describe NASA's unique approach to radiation safety that applies uncertainty-based criteria within the occupational health program for astronauts: the two terrestrial criteria of a point estimate of maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk projection uncertainties using the upper 95% confidence level (CL) in the radiation cancer projection model. NASA's acceptable level of risk for the ISS and its new lunar program has been set at the point estimate of a 3-percent risk of exposure-induced death (REID). Tissue-averaged organ dose-equivalents are combined with age-at-exposure and gender-dependent risk coefficients to project the cumulative occupational radiation risks incurred by astronauts. The 95% CL criterion is in practice a stronger criterion than ALARA, but not an absolute cut-off as is applied to a point projection of a 3% REID. We describe the most recent astronaut dose limits, present a historical review of astronaut organ dose estimates from the Mercury program through the current ISS program, and give future projections for lunar and Mars missions. NASA's 95% CL criterion is linked to a vibrant ground-based radiobiology program investigating the radiobiology of high-energy protons and heavy ions. The near-term goal of research is new knowledge leading to the reduction of uncertainties in projection models. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. The current model for projecting space radiation cancer risk relies on the three assumptions of linearity, additivity, and scaling, along with the use of population averages. We describe uncertainty estimates for this model, and new experimental data that shed light on the accuracy of the underlying assumptions. These methods make it possible to express risk management objectives in terms of quantitative metrics, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits. The resulting methodology is applied to several human space exploration mission scenarios including a lunar station, a deep space outpost, and a Mars mission. Factors that dominate risk projection uncertainties and application of this approach to assess candidate mitigation approaches are described.

  20. An integrated GIS-based interval-probabilistic programming model for land-use planning management under uncertainty--a case study at Suzhou, China.

    PubMed

    Lu, Shasha; Zhou, Min; Guan, Xingliang; Tao, Lizao

    2015-03-01

    A large number of mathematical models have been developed for supporting optimization of land-use allocation; however, few of them simultaneously consider land suitability (e.g., physical features and spatial information) and various uncertainties existing in many factors (e.g., land availabilities, land demands, land-use patterns, and ecological requirements). This paper incorporates geographic information system (GIS) technology into interval-probabilistic programming (IPP) for land-use planning management (IPP-LUPM). GIS is utilized to assemble data for the aggregated land-use alternatives, and IPP is developed for tackling uncertainties presented as discrete intervals and probability distribution. Based on GIS, the suitability maps of different land users are provided by the outcomes of land suitability assessment and spatial analysis. The maximum area of every type of land use obtained from the suitability maps, as well as various objectives/constraints (i.e., land supply, land demand of socioeconomic development, future development strategies, and environmental capacity), is used as input data for the optimization of land-use areas with IPP-LUPM model. The proposed model not only considers the outcomes of land suitability evaluation (i.e., topography, ground conditions, hydrology, and spatial location) but also involves economic factors, food security, and eco-environmental constraints, which can effectively reflect various interrelations among different aspects in a land-use planning management system. The case study results at Suzhou, China, demonstrate that the model can help to examine the reliability of satisfying (or risk of violating) system constraints under uncertainty. Moreover, it may identify the quantitative relationship between land suitability and system benefits. Willingness to arrange the land areas based on the condition of highly suitable land will not only reduce the potential conflicts on the environmental system but also lead to a lower economic benefit. However, a strong desire to develop lower suitable land areas will bring not only a higher economic benefit but also higher risks of violating environmental and ecological constraints. The land manager should make decisions through trade-offs between economic objectives and environmental/ecological objectives.

  1. Robust Decision Making to Support Water Quality Climate Adaptation: a Case Study in the Chesapeake Bay Watershed

    NASA Astrophysics Data System (ADS)

    Fischbach, J. R.; Lempert, R. J.; Molina-Perez, E.

    2017-12-01

    The U.S. Environmental Protection Agency (USEPA), together with state and local partners, develops watershed implementation plans designed to meet water quality standards. Climate uncertainty, along with uncertainty about future land use changes or the performance of water quality best management practices (BMPs), may make it difficult for these implementation plans to meet water quality goals. In this effort, we explored how decision making under deep uncertainty (DMDU) methods such as Robust Decision Making (RDM) could help USEPA and its partners develop implementation plans that are more robust to future uncertainty. The study focuses on one part of the Chesapeake Bay watershed, the Patuxent River, which is 2,479 sq km in area, highly urbanized, and has a rapidly growing population. We simulated the contribution of stormwater contaminants from the Patuxent to the overall Total Maximum Daily Load (TMDL) for the Chesapeake Bay under multiple scenarios reflecting climate and other uncertainties. Contaminants considered included nitrogen, phosphorus, and sediment loads. The assessment included a large set of scenario simulations using the USEPA Chesapeake Bay Program's Phase V watershed model. Uncertainties represented in the analysis included 18 downscaled climate projections (based on 6 general circulation models and 3 emissions pathways), 12 land use scenarios with different population projections and development patterns, and alternative assumptions about BMP performance standards and efficiencies associated with different suites of stormwater BMPs. Finally, we developed cost estimates for each of the performance standards and compared cost to TMDL performance as a key tradeoff for future water quality management decisions. In this talk, we describe how this research can help inform climate-related decision support at USEPA's Chesapeake Bay Program, and more generally how RDM and other DMDU methods can support improved water quality management under climate uncertainty.

  2. A duality theorem-based algorithm for inexact quadratic programming problems: Application to waste management under uncertainty

    NASA Astrophysics Data System (ADS)

    Kong, X. M.; Huang, G. H.; Fan, Y. R.; Li, Y. P.

    2016-04-01

    In this study, a duality theorem-based algorithm (DTA) for inexact quadratic programming (IQP) is developed for municipal solid waste (MSW) management under uncertainty. It improves upon the existing numerical solution method for IQP problems. The comparison between DTA and derivative algorithm (DAM) shows that the DTA method provides better solutions than DAM with lower computational complexity. It is not necessary to identify the uncertain relationship between the objective function and decision variables, which is required for the solution process of DAM. The developed method is applied to a case study of MSW management and planning. The results indicate that reasonable solutions have been generated for supporting long-term MSW management and planning. They could provide more information as well as enable managers to make better decisions to identify desired MSW management policies in association with minimized cost under uncertainty.

  3. Development of Risk Uncertainty Factors from Historical NASA Projects

    NASA Technical Reports Server (NTRS)

    Amer, Tahani R.

    2011-01-01

    NASA is a good investment of federal funds and strives to provide the best value to the nation. However, NASA has consistently budgeted to unrealistic cost estimates, as is evident in the cost growth of many of its programs. To date, NASA has been using available uncertainty factors from the Aerospace Corporation, the Air Force, and Booz Allen Hamilton to develop project risk postures. NASA has no insight into the development of these factors and, as demonstrated here, this can lead to unrealistic risk estimates for many NASA programs and projects (P/p). The primary contribution of this project is the development of NASA mission uncertainty factors, derived from actual historical NASA projects, to aid cost estimating as well as independent reviews, which provide NASA senior management with the information and analysis needed to make appropriate decisions regarding P/p. In general terms, this research project advances programmatic analysis for NASA projects.
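
    A hedged sketch of how such uncertainty factors might be derived from historical outcomes: percentiles of actual-to-estimated cost ratios. The ratios below are invented for illustration; the report's data and exact method are not given in the abstract.

```python
import numpy as np

# hypothetical actual/estimated cost ratios from completed projects
growth = np.array([1.05, 1.10, 1.22, 1.31, 1.45, 1.60, 1.85, 2.10])
for p in (50, 70, 90):
    print(f"{p}th-percentile uncertainty factor: {np.percentile(growth, p):.2f}")
```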

  4. Assessment of adaptation measures to high-mountain risks in Switzerland under climate uncertainties

    NASA Astrophysics Data System (ADS)

    Muccione, Veruska; Lontzek, Thomas; Huggel, Christian; Ott, Philipp; Salzmann, Nadine

    2015-04-01

    The economic evaluation of different adaptation options is important to support policy-makers who need to set priorities in the decision-making process. However, the decision-making process faces considerable uncertainties regarding current and projected climate impacts. First, physical climate and related impact systems are highly complex and not fully understood. Second, the further we look into the future, the more important the emission pathways become, with effects on the frequency and severity of climate impacts. Decisions on adaptation measures taken today and in the future must be able to adequately consider the uncertainties originating from these different sources. Decisions are not taken in a vacuum but always in the context of specific social, economic, institutional and political conditions. Decision-finding processes strongly depend on the socio-political system and usually have evolved over some time. Finding and taking decisions in the respective socio-political and economic context multiplies the uncertainty challenge. Our presumption is that a sound assessment of the different adaptation options in Switzerland under uncertainty necessitates formulating and solving a dynamic, stochastic optimization problem. Economic optimization models in the field of climate change are not new; typically, such models are applied in global-scale studies but rarely to local-scale problems. In this analysis, we considered the case of the Guttannen-Grimsel Valley, situated in the Swiss Bernese Alps. The alpine community has been affected by high-magnitude, high-frequency debris flows that started in 2009 and were historically unprecedented. They were related to thaw of permafrost in the rock slopes of Ritzlihorn; repeated rock fall events accumulated at the debris fan, formed a sediment source for debris flows, and were transported downvalley. An important transit road, a trans-European gas pipeline and settlements were severely affected and partly destroyed. Several adaptation measures were discussed by the responsible authorities, but decision making is particularly challenging under multiple uncertainties. For this area, we developed a stochastic optimization model for concrete, real-case adaptation options and measures, and we use dynamic programming to explore the optimal adaptation decisions in the face of uncertain climate change impacts from debris flows and flooding. Even though simplifications had to be made, the results were concrete and tangible, indicating that excavation is a preferable adaptation option, under our assumptions and modeling, in comparison to building a dam or relocation; this is not necessarily intuitive and adds an additional perspective to what has so far been sketched and evaluated by the cantonal and communal authorities for Guttannen. Moreover, building an alternative cantonal road appears to be more expensive than the costs incurred due to road closure.
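
    The sketch below is a toy finite-horizon stochastic dynamic program in the spirit described: each year a planner either does nothing, excavates sediment, or builds a dam (an absorbing state). All costs and probabilities are invented for illustration, not taken from the Guttannen case.

```python
T, beta, p = 20, 0.97, 0.3                           # horizon (yr), discount, event prob.
loss = {"none": 5.0, "excavate": 1.0, "dam": 0.5}    # expected loss given an event
upkeep = {"none": 0.0, "excavate": 0.4, "dam": 0.0}  # annual running cost
dam_capital = 10.0                                   # one-off construction cost

V_no, V_dam = 0.0, 0.0                               # values beyond the planning horizon
for _ in range(T):                                   # backward induction
    stay = upkeep["none"] + p * loss["none"] + beta * V_no
    exc = upkeep["excavate"] + p * loss["excavate"] + beta * V_no
    build = dam_capital + p * loss["dam"] + beta * V_dam  # uses next-period V_dam
    V_dam = upkeep["dam"] + p * loss["dam"] + beta * V_dam
    V_no = min(stay, exc, build)
choice = {stay: "do nothing", exc: "excavate annually", build: "build dam"}[V_no]
print(f"first-year choice: {choice} | expected discounted cost: {V_no:.2f}")
```

    With these invented numbers, annual excavation comes out cheapest, echoing the study's qualitative finding; changing the capital cost or event probability flips the ranking.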

  5. Equifinality and process-based modelling

    NASA Astrophysics Data System (ADS)

    Khatami, S.; Peel, M. C.; Peterson, T. J.; Western, A. W.

    2017-12-01

    Equifinality is understood as one of the fundamental difficulties in the study of open complex systems, including catchment hydrology. A review of the hydrologic literature reveals that the term equifinality has been widely used, but in many cases inconsistently and without coherent recognition of the various facets of equifinality, which can lead not only to ambiguity but also to methodological fallacies. Therefore, in this study we first characterise the term equifinality within the context of hydrological modelling by reviewing the genesis of the concept of equifinality and then presenting a theoretical framework. During the past decades, equifinality has mainly been studied as a subset of aleatory (arising due to randomness) uncertainty and for the assessment of model parameter uncertainty. Although the connection between parameter uncertainty and equifinality is undeniable, we argue there is more to equifinality than just aleatory parameter uncertainty. That is, the importance of equifinality and epistemic uncertainty (arising due to lack of knowledge) and their implications are overlooked in our current practice of model evaluation. Equifinality and epistemic uncertainty in studying, modelling, and evaluating hydrologic processes are treated as if they can be simply discussed in (or often reduced to) probabilistic terms (as for aleatory uncertainty). The deficiencies of this approach to conceptual rainfall-runoff modelling are demonstrated for selected Australian catchments by examination of parameter and internal flux distributions and interactions within SIMHYD. On this basis, we present a new approach that expands the equifinality concept beyond model parameters to inform epistemic uncertainty. The new approach potentially facilitates the identification and development of more physically plausible models and model evaluation schemes, particularly within the multiple working hypotheses framework, and is generalisable to other fields of environmental modelling as well.

  6. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    NASA Astrophysics Data System (ADS)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

    The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used to calibrate the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating on daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day a random sample of precipitation and temperature inputs was drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating curve model. A Bayesian procedure was applied to estimate the probability of rating curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; the rating curve errors therefore lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. The effects of having less information (e.g., missing one streamflow measurement for defining the rating curve, or missing one precipitation station) were also investigated.
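
    A minimal sketch of the rating-curve treatment described: one curve is drawn per realisation, so the error acts as a systematic bias over the whole streamflow series. A simple power-law curve with hypothetical parameter uncertainties stands in for the Bayesian multi-segment model.

```python
import numpy as np

rng = np.random.default_rng(1)
stage = np.array([0.8, 1.1, 1.5, 2.0, 2.6])        # observed water levels (m)

def draw_curve():
    # one rating curve per realisation => systematic error over the series
    a = rng.normal(12.0, 1.0)                      # scale
    b = rng.normal(1.8, 0.1)                       # exponent
    h0 = rng.normal(0.2, 0.05)                     # cease-to-flow level
    return lambda h: a * np.clip(h - h0, 0.0, None) ** b

flows = np.array([draw_curve()(stage) for _ in range(1000)])
print("5th/95th percentile flows:\n", np.percentile(flows, [5, 95], axis=0).round(1))
```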

  7. Defining and Measuring Diagnostic Uncertainty in Medicine: A Systematic Review.

    PubMed

    Bhise, Viraj; Rajan, Suja S; Sittig, Dean F; Morgan, Robert O; Chaudhary, Pooja; Singh, Hardeep

    2018-01-01

    Physicians routinely encounter diagnostic uncertainty in practice. Despite its impact on health care utilization, costs and error, measurement of diagnostic uncertainty is poorly understood. We conducted a systematic review to describe how diagnostic uncertainty is defined and measured in medical practice. We searched OVID Medline and PsycINFO databases from inception until May 2017 using a combination of keywords and Medical Subject Headings (MeSH). Additional search strategies included manual review of references identified in the primary search, use of a topic-specific database (AHRQ-PSNet) and expert input. We specifically focused on articles that (1) defined diagnostic uncertainty; (2) conceptualized diagnostic uncertainty in terms of its sources, complexity of its attributes or strategies for managing it; or (3) attempted to measure diagnostic uncertainty. We identified 123 articles for full review, none of which defined diagnostic uncertainty. Three attributes of diagnostic uncertainty were relevant for measurement: (1) it is a subjective perception experienced by the clinician; (2) it has the potential to impact diagnostic evaluation; for example, when inappropriately managed, it can lead to diagnostic delays; and (3) it is dynamic in nature, changing with time. Current methods for measuring diagnostic uncertainty in medical practice include: (1) asking clinicians about their perception of uncertainty (surveys and qualitative interviews), (2) evaluating the patient-clinician encounter (such as by reviews of medical records, transcripts of patient-clinician communication and observation), and (3) experimental techniques (patient vignette studies). The term "diagnostic uncertainty" lacks a clear definition, and there is no comprehensive framework for its measurement in medical practice. Based on review findings, we propose that diagnostic uncertainty be defined as a "subjective perception of an inability to provide an accurate explanation of the patient's health problem." Methodological advancements in measuring diagnostic uncertainty can improve our understanding of diagnostic decision-making and inform interventions to reduce diagnostic errors and overuse of health care resources.

  8. Fear of self-annihilation and existential uncertainty as predictors of worldview defense: Comparing terror management and uncertainty theories.

    PubMed

    Rubin, Mark

    2018-01-01

    Terror management theory (TMT) proposes that thoughts of death trigger a concern about self-annihilation that motivates the defense of cultural worldviews. In contrast, uncertainty theorists propose that thoughts of death trigger feelings of uncertainty that motivate worldview defense. University students (N = 414) completed measures of the chronic fear of self-annihilation and existential uncertainty as well as the need for closure. They then evaluated either a meaning threat stimulus or a control stimulus. Consistent with TMT, participants with a high fear of self-annihilation and a high need for closure showed the greatest dislike of the meaning threat stimulus, even after controlling for their existential uncertainty. Contrary to the uncertainty perspective, fear of existential uncertainty showed no significant effects.

  9. The Need for Precise and Well-documented Experimental Data on Prompt Fission Neutron Spectra from Neutron-induced Fission of {sup 239}Pu

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neudecker, D., E-mail: dneudecker@lanl.gov; Taddeucci, T.N.; Haight, R.C.

    2016-01-15

    The spectrum of neutrons emitted promptly after {sup 239}Pu(n,f)—a so-called prompt fission neutron spectrum (PFNS)—is a quantity of high interest, for instance, for reactor physics and global security. However, there are only a few experimental data sets available that are suitable for evaluations. In addition, some of those data sets differ by more than their 1-σ uncertainty boundaries. We present the results of MCNP studies indicating that these differences are partly caused by underestimated multiple scattering contributions, over-corrected background, and inconsistent deconvolution methods. A detailed uncertainty quantification for suitable experimental data was undertaken including these effects, and test-evaluations were performed with the improved uncertainty information. The test-evaluations illustrate that the inadequately estimated effects and detailed uncertainty quantification have an impact on the evaluated PFNS and associated uncertainties as well as the neutron multiplicity of selected critical assemblies. A summary of data and documentation needs to improve the quality of the experimental database is provided based on the results of simulations and test-evaluations. Given the possibly substantial distortion of the PFNS by multiple scattering and background effects, special care should be taken to reduce these effects in future measurements, e.g., by measuring the {sup 239}Pu PFNS as a ratio to either the {sup 235}U or {sup 252}Cf PFNS.

  10. A safety rule approach to surveillance and eradication of biological invasions

    Treesearch

    Denys Yemshanov; Robert G. Haight; Frank H. Koch; Robert Venette; Kala Studens; Ronald E. Fournier; Tom Swystun; Jean J. Turgeon; Yulin Gao

    2017-01-01

    Uncertainty about future spread of invasive organisms hinders planning of effective response measures. We present a two-stage scenario optimization model that accounts for uncertainty about the spread of an invader, and determines survey and eradication strategies that minimize the expected program cost subject to a safety rule for eradication success. The safety rule...

  11. Robust surveillance and control of invasive species using a scenario optimization approach

    Treesearch

    Denys Yemshanov; Robert G. Haight; Frank H. Koch; Bo Lu; Robert C. Venette; Ronald E. Fournier; Jean J. Turgeon

    2017-01-01

    Uncertainty about future outcomes of invasions is a major hurdle in the planning of invasive species management programs. We present a scenario optimization model that incorporates uncertainty about the spread of an invasive species and allocates survey and eradication measures to minimize the number of infested or potentially infested host plants on the landscape. We...

  12. Uncertainties in the Item Parameter Estimates and Robust Automated Test Assembly

    ERIC Educational Resources Information Center

    Veldkamp, Bernard P.; Matteucci, Mariagiulia; de Jong, Martijn G.

    2013-01-01

    Item response theory parameters have to be estimated, and because of the estimation process, they do have uncertainty in them. In most large-scale testing programs, the parameters are stored in item banks, and automated test assembly algorithms are applied to assemble operational test forms. These algorithms treat item parameters as fixed values,…

  13. “What an eye-opener” – a qualitative study of vulnerable citizens participating in a municipality-based intervention

    PubMed Central

    Ilvig, Pia Maria; Kjær, Michaela; Jones, Dorrie; Christensen, Jeanette Reffstrup; Andersen, Lotte Nygaard

    2018-01-01

    Purpose: To explore how psychologically vulnerable citizens experienced performing their everyday-life activities, identify activities experienced as particularly challenging, and evaluate the significance the Acceptance and Commitment Therapy (ACT)-based program, Well-being in Daily Life, had on the participants' everyday-life activities. Methods: Semi-structured interviews were conducted with eight participants from the Well-being in Daily Life program. Data were analysed using Systematic Text Condensation. Results and Conclusion: The participants experienced anxiety, fatigue, lack of structure, and chaos when performing their everyday-life activities, in addition to being uncertain about the limitations of their own resources. Furthermore, balancing between demands and resources was challenging, also leading to uncertainty and identity conflicts that contributed to the participants' concerns about re-entering the workforce. The program enabled the participants to develop social skills and trust, which provided them with confidence, individually tailored possibilities for developing new competencies, and courage, thus facilitating their recovery process. PMID:29488443

  14. Data-driven Modelling for decision making under uncertainty

    NASA Astrophysics Data System (ADS)

    Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus

    2018-01-01

    Decision making under uncertainty has become a much-discussed issue in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data, and to find the best model for decision-making problems under uncertainty, using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Candidate models are tested against an error criterion, and the model with the smallest error is selected as the best model to use.

  15. Robust Satisficing Decision Making for Unmanned Aerial Vehicle Complex Missions under Severe Uncertainty

    PubMed Central

    Ji, Xiaoting; Niu, Yifeng; Shen, Lincheng

    2016-01-01

    This paper presents a robust satisficing decision-making method for Unmanned Aerial Vehicles (UAVs) executing complex missions in an uncertain environment. Motivated by the info-gap decision theory, we formulate this problem as a novel robust satisficing optimization problem, of which the objective is to maximize the robustness while satisfying some desired mission requirements. Specifically, a new info-gap based Markov Decision Process (IMDP) is constructed to abstract the uncertain UAV system and specify the complex mission requirements with Linear Temporal Logic (LTL). A robust satisficing policy is obtained to maximize the robustness to the uncertain IMDP while ensuring a desired probability of satisfying the LTL specifications. To this end, we propose a two-stage robust satisficing solution strategy which consists of the construction of a product IMDP and the generation of a robust satisficing policy. In the first stage, a product IMDP is constructed by combining the IMDP with an automaton representing the LTL specifications. In the second stage, an algorithm based on robust dynamic programming is proposed to generate a robust satisficing policy, while an associated robustness evaluation algorithm is presented to evaluate the robustness. Finally, through Monte Carlo simulation, the effectiveness of our algorithms is demonstrated on a UAV search mission under severe uncertainty so that the resulting policy can maximize the robustness while reaching the desired performance level. Furthermore, by comparing the proposed method with other robust decision-making methods, it can be concluded that our policy can tolerate higher uncertainty so that the desired performance level can be guaranteed, which indicates that the proposed method is much more effective in real applications. PMID:27835670
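
    For illustration, the sketch below runs pessimistic (robust) value iteration on a two-state MDP with interval transition probabilities; it conveys the worst-case Bellman backup at the core of such methods but is not the paper's product-IMDP algorithm, and all numbers are invented.

```python
import numpy as np

gamma = 0.9
R = np.array([[0.0, 1.0],        # R[s, a]: reward of action a in state s
              [0.5, 0.0]])
p_lo = np.array([[0.6, 0.2],     # interval bounds on P(next state = 1 | s, a)
                 [0.7, 0.1]])
p_hi = np.array([[0.9, 0.5],
                 [0.9, 0.3]])

V = np.zeros(2)
for _ in range(200):
    # adversary puts the least allowed mass on the better successor state
    p1 = p_lo if V[1] >= V[0] else p_hi          # worst-case P(next = 1)
    Q = R + gamma * (p1 * V[1] + (1.0 - p1) * V[0])
    V = Q.max(axis=1)                            # agent maximizes pessimistic value
print("robust values:", V.round(3), "| robust policy:", Q.argmax(axis=1))
```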

  16. Evaluation of linear regression techniques for atmospheric applications: the importance of appropriate weighting

    NASA Astrophysics Data System (ADS)

    Wu, Cheng; Zhen Yu, Jian

    2018-03-01

    Linear regression techniques are widely used in atmospheric science, but they are often improperly applied due to lack of consideration or inappropriate handling of measurement uncertainty. In this work, numerical experiments are performed to evaluate the performance of five linear regression techniques, significantly extending previous works by Chu and Saylor. The five techniques are ordinary least squares (OLS), Deming regression (DR), orthogonal distance regression (ODR), weighted ODR (WODR), and York regression (YR). We first introduce a new data generation scheme that employs the Mersenne twister (MT) pseudorandom number generator. The numerical simulations are also improved by (a) refining the parameterization of nonlinear measurement uncertainties, (b) inclusion of a linear measurement uncertainty, and (c) inclusion of WODR for comparison. Results show that DR, WODR and YR produce an accurate slope, but the intercept by WODR and YR is overestimated, and the degree of bias is more pronounced for datasets with low R² between X and Y. The importance of a proper weighting parameter λ in DR is investigated by sensitivity tests, and it is found that an improper λ in DR can lead to a bias in both the slope and intercept estimation. Because the λ calculation depends on the actual form of the measurement error, it is essential to determine the exact form of measurement error in the XY data during the measurement stage. If the a priori error in one of the variables is unknown, or the assumed form of the measurement error cannot be trusted, DR, WODR and YR can provide the least biases in slope and intercept among all tested regression techniques. For these reasons, DR, WODR and YR are recommended for atmospheric studies when both X and Y data have measurement errors. An Igor Pro-based program (Scatter Plot) was developed to facilitate the implementation of error-in-variables regressions.
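
    Of the five techniques, Deming regression is compact enough to sketch from its standard textbook formula, with error-variance ratio delta = var(err_y)/var(err_x); the data below are synthetic, and the paper's Igor Pro implementation is not reproduced here.

```python
import numpy as np

def deming(x, y, delta=1.0):
    # standard Deming slope/intercept for errors in both X and Y
    xm, ym = x.mean(), y.mean()
    sxx = np.sum((x - xm) ** 2)
    syy = np.sum((y - ym) ** 2)
    sxy = np.sum((x - xm) * (y - ym))
    slope = (syy - delta * sxx + np.sqrt((syy - delta * sxx) ** 2
             + 4 * delta * sxy ** 2)) / (2 * sxy)
    return slope, ym - slope * xm

rng = np.random.default_rng(0)
x_true = np.linspace(0, 10, 50)
x = x_true + rng.normal(0, 0.5, 50)               # measurement error in X
y = 2.0 * x_true + 1.0 + rng.normal(0, 0.5, 50)   # measurement error in Y
print("naive OLS slope:", np.polyfit(x, y, 1)[0].round(3))
print("Deming slope, intercept:", [round(v, 3) for v in deming(x, y)])
```

    With delta = 1 this reduces to orthogonal regression; an improper delta biases both slope and intercept, which is exactly the sensitivity the paper probes.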

  17. Measures of GCM Performance as Functions of Model Parameters Affecting Clouds and Radiation

    NASA Astrophysics Data System (ADS)

    Jackson, C.; Mu, Q.; Sen, M.; Stoffa, P.

    2002-05-01

    This abstract is one of three related presentations at this meeting dealing with several issues surrounding optimal parameter and uncertainty estimation for model predictions of climate. Uncertainty in model predictions of climate depends in part on the uncertainty produced by model approximations or parameterizations of unresolved physics. Evaluating these uncertainties is computationally expensive because one needs to evaluate how arbitrary choices for any given combination of model parameters affect model performance. Because the computational effort grows exponentially with the number of parameters being investigated, it is important to choose parameters carefully. Evaluating whether a parameter is worth investigating depends on two considerations: 1) do reasonable choices of parameter values produce a large range in model response relative to observational uncertainty? and 2) does the model response depend non-linearly on various combinations of model parameters? We have decided to narrow our attention to parameters that affect clouds and radiation, as it is likely that these parameters will dominate uncertainties in model predictions of future climate. We present preliminary results of ~20 to 30 AMIP II-style climate model integrations using NCAR's CCM3.10 that show model performance as a function of individual parameters controlling 1) the critical relative humidity for cloud formation (RHMIN) and 2) the boundary layer critical Richardson number (RICR). We also explore various definitions of model performance that include some or all observational data sources (surface air temperature and pressure, meridional and zonal winds, clouds, long- and short-wave cloud forcings, etc.) and evaluate in a few select cases whether the model's response depends non-linearly on the parameter values we have selected.

  18. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    PubMed

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts for pesticide risks to surface water organisms validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); its nature (whether the uncertainty is epistemic or aleatory); and its location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Measuring the Gas Constant "R": Propagation of Uncertainty and Statistics

    ERIC Educational Resources Information Center

    Olsen, Robert J.; Sattar, Simeen

    2013-01-01

    Determining the gas constant "R" by measuring the properties of hydrogen gas collected in a gas buret is well suited for comparing two approaches to uncertainty analysis using a single data set. The brevity of the experiment permits multiple determinations, allowing for statistical evaluation of the standard uncertainty u[subscript…

  1. Design and Evaluation of a Dynamic Programming Flight Routing Algorithm Using the Convective Weather Avoidance Model

    NASA Technical Reports Server (NTRS)

    Ng, Hok K.; Grabbe, Shon; Mukherjee, Avijit

    2010-01-01

    The optimization of traffic flows in congested airspace with varying convective weather is a challenging problem. One approach is to generate shortest routes between origins and destinations while meeting airspace capacity constraints in the presence of uncertainties such as weather and airspace demand. This study focuses on the development of an optimal flight path search algorithm that optimizes national airspace system throughput and efficiency in the presence of uncertainties. The algorithm is based on dynamic programming and utilizes the predicted probability that an aircraft will deviate around convective weather. It is shown that the running time of the algorithm increases linearly with the total number of links between all stages. The optimal routes minimize a combination of fuel cost and the expected cost of route deviation due to convective weather. They are considered as alternatives to the set of coded departure routes, which are predefined by the FAA to reroute pre-departure flights around weather or air traffic constraints. A formula that calculates the predicted probability of deviation from a given flight path is also derived. The predicted probability of deviation is calculated for all path candidates, and the routes with the best probability are selected as optimal. The predicted probability of deviation serves as a computable measure of reliability in pre-departure rerouting. The algorithm can also be extended to automatically adjust its design parameters to satisfy a desired level of reliability.
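
    A minimal sketch of the stage-wise recursion described: each node keeps the cheapest cost-to-reach, where a link cost is fuel plus deviation probability times a penalty, so the running time is linear in the number of links. The small trellis and all numbers are invented, not the study's route network.

```python
# stages[k][j] = incoming links of node j at stage k+1, as (prev_node, fuel, p_dev)
penalty = 50.0
stages = [
    [[(0, 10.0, 0.1)], [(0, 12.0, 0.0)]],                # stage 1 nodes
    [[(0, 9.0, 0.3), (1, 7.0, 0.2)],                     # stage 2 nodes
     [(0, 8.0, 0.0), (1, 11.0, 0.1)]],
    [[(0, 6.0, 0.1), (1, 5.0, 0.4)]],                    # destination
]

cost = [0.0]                       # cost-to-reach of the single origin node
for nodes in stages:
    cost = [min(cost[i] + fuel + p * penalty for i, fuel, p in links)
            for links in nodes]
print("optimal expected route cost:", cost[0])
```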

  2. Investigating the Gap Between Estimated and Actual Energy Efficiency and Conservation Savings for Public Buildings Projects & Programs in United States

    NASA Astrophysics Data System (ADS)

    Qaddus, Muhammad Kamil

    The gap between estimated and actual savings in energy efficiency and conservation (EE&C) projects and programs for public and government buildings forms the problem statement of this work. This gap is analyzed first at the impact level and then at the process level. At the impact level, the methodology categorizes the gap as a 'realization gap' and views that categorization within the context of past and current narratives linked to it. At the process level, the methodology analyzes the realization gap on a process-evaluation basis. The resulting process-evaluation criterion is then applied to two different programs (DESEU and NYC ACE) linked to the scope of this thesis. Drawing on the synergies of the impact- and process-level analyses, the thesis offers proposals on program development and structure using the process-evaluation criterion. An innovative financing and benefits-distribution structure is developed as part of the proposal: restricted stakeholder crowd financing and a risk-free incentivized return are the products of the proposed financing and benefit-distribution structures, respectively. These products are then complemented by an alternative approach to estimating EE&C savings, which advocates estimation based on range allocation rather than the currently used single-point estimated-savings approach. The Way Ahead section explores the synergy between financial and engineering ranges of energy savings as a multi-disciplinary approach for future research, and provides the proposed program structure with risk aversion and incentive allocation when dealing with uncertainty. This set of new approaches is believed to better close the realization gap between estimated and actual energy efficiency savings.

  3. Adaptive management of large aquatic ecosystem recovery programs in the United States.

    PubMed

    Thom, Ronald; St Clair, Tom; Burns, Rebecca; Anderson, Michael

    2016-12-01

    Adaptive management (AM) is being employed in a number of programs in the United States to guide actions to restore aquatic ecosystems, because these programs are both expensive and faced with significant uncertainties. Many of these uncertainties are associated with prioritizing when, where, and what kind of actions are needed to meet the objectives of enhancing ecosystem services and recovering threatened and endangered species. We interviewed nine large-scale aquatic ecosystem restoration programs across the United States to document the lessons learned from implementing AM. In addition, we recorded information on ecological drivers (e.g., endangered fish species) for each program, and inferred how these drivers reflected more generic ecosystem services. Ecosystem services (e.g., genetic diversity, cultural heritage), albeit not explicit drivers, were either important to the recovery or enhancement of the drivers, or were additional benefits associated with actions to recover or enhance the program drivers. Implementing programs using AM lessons learned has apparently helped achieve better results regarding enhancing ecosystem services and restoring target species populations. The interviews yielded several recommendations. The science and AM program must be integrated into how the overall restoration program operates in order to gain understanding and support, and to effectively inform management decision-making. Governance and decision-making varied based on each program's particular circumstances; open communication within and among agency and stakeholder groups and extensive vetting led up to decisions. It was important to have an internal agency staff member to implement the AM plan, a clear designation of roles and responsibilities, and a long-term commitment from the other involved parties. The most important management questions and information needs must be identified up front. It was imperative to clearly identify, link and continually reinforce the essential components of an AM plan, including objectives, constraints, uncertainties, hypotheses, management actions, decision criteria and triggers, monitoring, and research. Some programs employed predictive models and the results of research on uncertainties to vet options for actions; many relied on best available science and professional judgment to decide if adjustments to actions were needed. All programs emphasized the need to be nimble enough to respond to new information and make necessary adjustments to management action implementation. We recommend that ecosystem services be explicit drivers of restoration programs to facilitate needed funding and to communicate with the general public and with global efforts on restoring and conserving ecosystems. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. DIETARY EXPOSURE MEASUREMENTS

    EPA Science Inventory

    This research constitutes the MCEARD base dietary exposure research program and is conducted to complement the NERL total human exposure program. The research builds on previous work to reduce the level of uncertainty in exposure assessment by improving NERL's ability to evaluat...

  5. ANALYTICAL METHODS DEVELOPMENT FOR DIETARY SAMPLES

    EPA Science Inventory

    The Microbiological and Chemical Exposure Assessment Research Division's (MCEARD) dietary exposure research program is conducted to complement the NERL aggregate and cumulative exposure program. Its purpose is to reduce the level of uncertainty in exposure assessment by improvin...

  6. NEW PROGRAMMING ENVIRONMENTS FOR UNCERTAINTY ANALYSIS

    EPA Science Inventory

    We live in a world of faster computers, better GUI's and visualization technology, increasing international cooperation made possible by new digital infrastructure, new agreements between US federal agencies (such as ISCMEM), new European Union programs (such as Harmoniqua), and ...

  7. Large-scale linear programs in planning and prediction.

    DOT National Transportation Integrated Search

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...

  8. SLFP: a stochastic linear fractional programming approach for sustainable waste management.

    PubMed

    Zhu, H; Huang, G H

    2011-12-01

    A stochastic linear fractional programming (SLFP) approach is developed for supporting sustainable municipal solid waste management under uncertainty. The SLFP method can solve ratio optimization problems associated with random information, where chance-constrained programming is integrated into a linear fractional programming framework. It has advantages in: (1) comparing objectives of two aspects, (2) reflecting system efficiency, (3) dealing with uncertainty expressed as probability distributions, and (4) providing optimal-ratio solutions under different system-reliability conditions. The method is applied to a case study of waste flow allocation within a municipal solid waste (MSW) management system. The obtained solutions are useful for identifying sustainable MSW management schemes with maximized system efficiency under various constraint-violation risks. The results indicate that SLFP can support in-depth analysis of the interrelationships among system efficiency, system cost and system-failure risk. Copyright © 2011 Elsevier Ltd. All rights reserved.
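
    The deterministic core of linear fractional programming can be sketched with the classical Charnes-Cooper transformation, which turns the ratio objective into a linear program; SLFP's chance-constraint conversion step is omitted here, and all data below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

c, alpha = np.array([3.0, 1.0]), 0.0     # numerator: system benefit terms
d, beta = np.array([1.0, 2.0]), 10.0     # denominator: system cost terms
A, b = np.array([[1.0, 1.0]]), np.array([40.0])   # waste-flow capacity

# Charnes-Cooper: y = t*x with t*(d.x + beta) = 1; variables z = [y1, y2, t]
obj = -np.concatenate([c, [alpha]])               # maximize c.y + alpha*t
A_ub = np.hstack([A, -b[:, None]])                # A y - b t <= 0
A_eq = np.array([np.concatenate([d, [beta]])])    # d.y + beta t = 1
res = linprog(obj, A_ub=A_ub, b_ub=[0.0], A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * 3)
y, t = res.x[:2], res.x[2]
print("optimal benefit/cost ratio:", -res.fun, "| x =", (y / t).round(3))
```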

  9. Uncertainty quantification in volumetric Particle Image Velocimetry

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos

    2016-11-01

    Particle Image Velocimetry (PIV) uncertainty quantification is challenging due to coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV, but no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of the reconstructed three-dimensional particle location, which in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position, which is a function of particle detection and mapping function uncertainty. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty, which is estimated as an extension of the 2D PIV uncertainty framework. Finally, the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases, the variation of estimated uncertainty with the elemental volumetric PIV error sources is also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
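
    A generic first-order propagation sketch of the kind the abstract invokes: position and correlation uncertainties combine in root-sum-square through the sensitivity of velocity to displacement. The uncertainty magnitudes are illustrative, not the framework's actual estimators.

```python
import numpy as np

dt = 1e-3                    # laser pulse separation (s)
sigma_x1 = 0.10e-3           # uncertainty of first particle position (m)
sigma_x2 = 0.12e-3           # uncertainty of displaced particle position (m)
sigma_corr = 0.05e-3         # cross-correlation displacement uncertainty (m)

# u = (x2 - x1)/dt, so |du/dx1| = |du/dx2| = 1/dt; combine in quadrature
sigma_u = np.sqrt(sigma_x1**2 + sigma_x2**2 + sigma_corr**2) / dt
print(f"velocity standard uncertainty: {sigma_u:.3f} m/s")
```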

  10. Uncertainty in Climate Change Research: An Integrated Approach

    NASA Astrophysics Data System (ADS)

    Mearns, L.

    2017-12-01

    Uncertainty has been a major theme in research regarding climate change from virtually the very beginning, and appropriately characterizing and quantifying uncertainty has been an important aspect of this work. Initially, uncertainties were explored regarding the climate system and how it would react to future forcing. A concomitant area of concern was the future emissions and concentrations of important forcing agents such as greenhouse gases and aerosols. But of course there are important uncertainties in all aspects of climate change research, not just the climate system and emissions. And as climate change research has become more important and of pragmatic concern as possible solutions to the climate change problem are addressed, exploring all the relevant uncertainties has become more relevant and urgent. More recently, over the past five years or so, uncertainties in impacts models, such as agricultural and hydrological models, have received much more attention, through programs such as AgMIP, and some research in this arena has indicated that the uncertainty in the impacts models can be as great as or greater than that in the climate system. Still, other areas of uncertainty remain underexplored and/or undervalued, including uncertainty in vulnerability and governance. Without more thoroughly exploring these last uncertainties, we likely will underestimate important uncertainties, particularly regarding how different systems can successfully adapt to climate change. In this talk I will discuss these different uncertainties and how to combine them to give a complete picture of the total uncertainty individual systems are facing. As part of this, I will discuss how the uncertainty can be successfully managed even if it is fairly large and deep. Part of my argument will be that large uncertainty is not the enemy; rather, false certainty is the true danger.

  11. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    USGS Publications Warehouse

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk, since future metal concentrations are presented as a limited range of possibilities based on a scientific evaluation of uncertainty.
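
    A hedged sketch of the forward Monte Carlo option mentioned: sample uncertain parameters, push them through the predictive model, and report percentile bounds. A one-line exponential attenuation stands in for the reactive transport model, and the priors are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000
c0 = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)   # initial U (mg/L)
k = rng.uniform(0.05, 0.25, size=n)                       # attenuation rate (1/yr)
t = 30.0                                                  # years after restoration

c_future = c0 * np.exp(-k * t)     # stand-in for the reactive transport model
lo, med, hi = np.percentile(c_future, [5, 50, 95])
print(f"U at {t:.0f} yr: median {med:.3f} mg/L, 90% interval [{lo:.3f}, {hi:.3f}]")
```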

  12. Using Marine and Freshwater Fish Environmental Intelligence Networks Under Different Climate Change Scenarios to Evaluate the Effectiveness of the Minamata Convention on Mercury

    NASA Astrophysics Data System (ADS)

    Bank, M. S.

    2017-12-01

    The Minamata Convention on Mercury was recently ratified and will go into effect on August 16, 2017. As noted in the convention text, fish are an important source of nutrition for consumers worldwide, and several marine and freshwater species represent important links in the global source-receptor dynamics of methylmercury. However, despite its importance, a coordinated global program for marine and freshwater fish species using accredited laboratories, reproducible data and reliable models is still lacking. In recent years, fish mercury science has evolved significantly in its use of advanced technologies and computational models to address this complex and ubiquitous environmental and public health issue. These advances in the field have made it essential to enhance transparency so that fish mercury studies used in support of the convention are truly reproducible and scientifically sound. One primary goal of this presentation is to evaluate fish bioinformatics and methods, results and inferential reproducibility as they relate to aggregated uncertainty in fish mercury research models, science, and biomonitoring. I use models, environmental intelligence networks and simulations of the effects of a changing climate on methylmercury in marine and freshwater fish to examine how climate change and the convention itself may create further uncertainties for policymakers to consider. Lastly, I will also present an environmental intelligence framework for fish mercury bioaccumulation models and biomonitoring in support of the evaluation of the effectiveness of the Minamata Convention on Mercury.

  13. Effectiveness of a GUM-Compliant Course for Teaching Measurement in the Introductory Physics Laboratory

    ERIC Educational Resources Information Center

    Pillay, Seshini; Buffler, Andy; Lubben, Fred; Allie, Saalih

    2008-01-01

    An evaluation of a course aimed at developing university students' understanding of the nature of scientific measurement and uncertainty is described. The course materials follow the framework for metrology as recommended in the "Guide to the Expression of Uncertainty in Measurement" (GUM). The evaluation of the course is based on…

  14. Modifying climate change habitat models using tree species-specific assessments of model uncertainty and life history-factors

    Treesearch

    Stephen N. Matthews; Louis R. Iverson; Anantha M. Prasad; Matthew P. Peters; Paul G. Rodewald

    2011-01-01

    Species distribution models (SDMs) to evaluate trees' potential responses to climate change are essential for developing appropriate forest management strategies. However, there is a great need to better understand these models' limitations and evaluate their uncertainties. We have previously developed statistical models of suitable habitat, based on both...

  15. Accuracy of neutron self-activation method with iodine-containing scintillators for quantifying 128I generation using decay-fitting technique

    NASA Astrophysics Data System (ADS)

    Nohtomi, Akihiro; Wakabayashi, Genichiro

    2015-11-01

    We evaluated the accuracy of a self-activation method with iodine-containing scintillators in quantifying 128I generation in an activation detector; the self-activation method was recently proposed for photo-neutron on-line measurements around X-ray radiotherapy machines. Here, we consider the accuracy of determining the initial count rate R0, observed just after termination of neutron irradiation of the activation detector. The value R0 is directly related to the amount of activity generated by incident neutrons; the detection efficiency of radiation emitted from the activity should be taken into account for such an evaluation. Decay curves of 128I activity were numerically simulated by a computer program for various conditions including different initial count rates (R0) and background rates (RB), as well as counting statistical fluctuations. The data points sampled at minute intervals and integrated over the same period were fit by a non-linear least-squares fitting routine to obtain the value R0 as a fitting parameter with an associated uncertainty. The corresponding background rate RB was simultaneously calculated in the same fitting routine. Identical data sets were also evaluated by a well-known integration algorithm used for conventional activation methods and the results were compared with those of the proposed fitting method. When we fixed RB = 500 cpm, the relative uncertainty σR0/R0 ≤ 0.02 was achieved for R0/RB ≥ 20 with 20 data points from 1 min to 20 min following the termination of neutron irradiation used in the fitting; σR0/R0 ≤ 0.01 was achieved for R0/RB ≥ 50 with the same data points. Reasonable relative uncertainties to evaluate initial count rates were reached by the decay-fitting method using practically realistic sampling numbers. These results clarified the theoretical limits of the fitting method. The integration method was found to be potentially vulnerable to short-term variations in background levels, especially instantaneous contaminations by spike-like noise. The fitting method easily detects and removes such spike-like noise.
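
    A minimal sketch of the decay-fitting technique evaluated, assuming synthetic per-minute count rates: fit R(t) = R0 exp(-lambda*t) + RB and take the R0 uncertainty from the fit covariance. Only the 128I half-life (about 25 min) is a physical constant; all counts are simulated.

```python
import numpy as np
from scipy.optimize import curve_fit

lam = np.log(2) / 24.99                        # 128I decay constant (1/min)
model = lambda t, R0, RB: R0 * np.exp(-lam * t) + RB

rng = np.random.default_rng(7)
t = np.arange(1, 21)                           # 20 one-minute samples after irradiation
true = model(t, 10000.0, 500.0)                # R0/RB = 20, as in the paper's example
counts = rng.poisson(true)                     # counting-statistics fluctuations

popt, pcov = curve_fit(model, t, counts, p0=[5000.0, 100.0])
R0, sR0 = popt[0], np.sqrt(pcov[0, 0])
print(f"R0 = {R0:.0f} +/- {sR0:.0f} cpm (relative: {sR0 / R0:.3f})")
```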

  16. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel

    2017-01-01

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. The sensitivity of the single-/multi-Doppler measurement uncertainties to the averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  17. Determination of the Distribution and Inventory of Radionuclides within a Savannah River Site Waterway - 13202

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hiergesell, R.A.; Phifer, M.A.

    2013-07-01

    An investigation was conducted to evaluate the radionuclide inventory within the Lower Three Runs (LTR) Integrator Operable Unit (IOU) at the U.S. Department of Energy's (DOE's) Savannah River Site (SRS). The scope of this effort included the analysis of previously existing sampling and analysis data as well as additional stream bed and flood plain sampling and analysis data acquired to delineate horizontal and vertical distributions of the radionuclides as part of the ongoing SRS environmental restoration program, and specifically for the LTR IOU program. While cesium-137 (Cs-137) is the most significant and abundant radionuclide associated with the LTR IOU, it is not the only radionuclide; hence the scope included evaluating all radionuclides present, along with an evaluation of inventory uncertainty for use in sensitivity and uncertainty analyses. The scope involved evaluation of the radionuclide inventory in the P-Reactor and R-Reactor cooling water effluent canal systems, PAR Pond (including Pond C) and the flood plain and stream sediment sections of LTR between the PAR Pond Dam and the Savannah River. The approach taken was to examine all of the available sediment and sediment/soil analysis data along the P- and R-Reactor cooling water re-circulation canal system, the ponds situated along those canal reaches, and along the length of LTR below the Par Pond dam. By breaking the IOU into a series of sub-components and sub-sections, the mass of contaminated material was estimated and a representative central concentration of each radionuclide was computed for each compartment. The radionuclide inventory associated with each sub-compartment was then aggregated to determine the total radionuclide inventory for the full LTR IOU. Of special interest was the inventory of Cs-137 due to its role in contributing to the potential dose to an offsite member of the public. The overall LTR IOU inventory of Cs-137 was determined to be 2.87E+02 GBq, which is similar to two earlier estimates. This investigation provides an independent, ground-up estimate of the Cs-137 inventory in the LTR IOU utilizing the most recent field data. (authors)

  18. A high-precision velocity measuring system design for projectiles based on S-shaped laser screen

    NASA Astrophysics Data System (ADS)

    Liu, Huayi; Qian, Zheng; Yu, Hao; Li, Yutao

    2018-03-01

    The high-precision measurement of the velocity of a high-speed flying projectile is of great significance for the evaluation and development of modern weapons. The velocity of a high-speed flying projectile is usually measured by a laser screen velocity measuring system, but this method cannot provide repeated measurements, so an in-depth evaluation of the measuring system's uncertainty is not possible. This paper presents a design based on an S-shaped laser screen velocity measuring system, which can achieve repeated measurements and can therefore effectively reduce the uncertainty of the velocity measuring system. In addition, we made a detailed analysis of the uncertainty of the measuring system. The measurement uncertainty is 0.2% when the velocity of the projectile is about 200 m/s.
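
    Repeated measurements enable a type-A (statistical) uncertainty evaluation, which the single-screen method cannot provide; a minimal sketch with invented repeated velocity readings:

```python
import numpy as np

v = np.array([199.6, 200.3, 199.9, 200.1, 200.4, 199.8])  # repeated readings (m/s)
mean = v.mean()
u_a = v.std(ddof=1) / np.sqrt(v.size)   # type-A standard uncertainty of the mean
print(f"v = {mean:.2f} m/s, u = {u_a:.2f} m/s ({100 * u_a / mean:.2f} %)")
```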

  19. Planning for robust reserve networks using uncertainty analysis

    USGS Publications Warehouse

    Moilanen, A.; Runge, M.C.; Elith, Jane; Tyre, A.; Carmel, Y.; Fegraus, E.; Wintle, B.A.; Burgman, M.; Ben-Haim, Y.

    2006-01-01

    Planning land-use for biodiversity conservation frequently involves computer-assisted reserve selection algorithms. Typically such algorithms operate on matrices of species presence-absence in sites, or on species-specific distributions of model-predicted probabilities of occurrence in grid cells. There are practically always errors in input data: erroneous species presence-absence data, structural and parametric uncertainty in predictive habitat models, and lack of correspondence between temporal presence and long-run persistence. Despite these uncertainties, typical reserve selection methods proceed as if there were no uncertainty in the data or models. Having two conservation options of apparently equal biological value, one would prefer the option whose value is relatively insensitive to errors in planning inputs. In this work we show how uncertainty analysis for reserve planning can be implemented within a framework of information-gap decision theory, generating reserve designs that are robust to uncertainty. Consideration of uncertainty involves modifications to the typical objective functions used in reserve selection. The search for robust-optimal reserve structures can still be implemented via typical reserve selection optimization techniques, including stepwise heuristics, integer programming and stochastic global search.

  20. The atmospheric effects of stratospheric aircraft

    NASA Technical Reports Server (NTRS)

    Stolarski, Richard S. (Editor); Wesoky, Howard L. (Editor)

    1993-01-01

    This document presents a second report from the Atmospheric Effects of Stratospheric Aircraft (AESA) component of NASA's High-Speed Research Program (HSRP). Market and technology considerations continue to provide an impetus for high-speed civil transport research. A recent United Nations Environment Program scientific assessment has shown that considerable uncertainty still exists about the possible impact of aircraft on the atmosphere. The AESA was designed to develop the body of scientific knowledge necessary for the evaluation of the impact of stratospheric aircraft on the atmosphere. The first Program report presented the basic objectives and plans for AESA. This second report presents the status of the ongoing research as reported by the principal investigators at the second annual AESA Program meeting in May 1992: Laboratory studies are probing the mechanism responsible for many of the heterogeneous reactions that occur on stratospheric particles. Understanding how the atmosphere redistributes aircraft exhaust is critical to our knowing where the perturbed air will go and for how long it will remain in the stratosphere. The assessment of fleet effects is dependent on the ability to develop scenarios which correctly simulate fleet operations.

  1. Development of the X-33 Aerodynamic Uncertainty Model

    NASA Technical Reports Server (NTRS)

    Cobleigh, Brent R.

    1998-01-01

    An aerodynamic uncertainty model for the X-33 single-stage-to-orbit demonstrator aircraft has been developed at NASA Dryden Flight Research Center. The model is based on comparisons of historical flight test estimates to preflight wind-tunnel and analysis code predictions of vehicle aerodynamics documented during six lifting-body aircraft and the Space Shuttle Orbiter flight programs. The lifting-body and Orbiter data were used to define an appropriate uncertainty magnitude in the subsonic and supersonic flight regions, and the Orbiter data were used to extend the database to hypersonic Mach numbers. The uncertainty data consist of increments or percentage variations in the important aerodynamic coefficients and derivatives as a function of Mach number along a nominal trajectory. The uncertainty models will be used to perform linear analysis of the X-33 flight control system and Monte Carlo mission simulation studies. Because the X-33 aerodynamic uncertainty model was developed exclusively using historical data rather than X-33 specific characteristics, the model may be useful for other lifting-body studies.
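
    As a sketch of how such an uncertainty model feeds Monte Carlo mission studies, the Python fragment below disperses a nominal coefficient by a Mach-dependent percentage band. The coefficient values, band widths, and sampling scheme are invented for illustration; they are not the X-33 model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Nominal pitching-moment derivative along a hypothetical trajectory,
        # with an assumed +/- uncertainty band (fraction of nominal) by Mach.
        mach = np.array([0.5, 0.9, 1.5, 3.0, 6.0])
        cm_a = np.array([-0.012, -0.015, -0.010, -0.006, -0.004])
        band = np.array([0.10, 0.20, 0.25, 0.30, 0.40])

        n_runs = 1000
        # One uniform multiplier per run, scaled by the Mach-dependent band,
        # mimicking a dispersed coefficient in a Monte Carlo simulation.
        mult = rng.uniform(-1.0, 1.0, (n_runs, 1))
        dispersed = cm_a * (1.0 + mult * band)
        print(f"5th/95th percentile of Cm_alpha at Mach {mach[3]}:",
              np.percentile(dispersed[:, 3], [5, 95]))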

  2. Southern Regional Center for Lightweight Innovative Design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horstemeyer, Mark F.; Wang, Paul

    The three major objectives of this Phase III project are: to develop experimentally validated cradle-to-grave modeling and simulation tools that optimize automotive and truck components built from lightweighting materials (aluminum, steel, and Mg alloys, and polymer-based composites), with consideration of uncertainty, to decrease weight and cost while increasing performance and safety in impact scenarios; to develop multiscale computational models that quantify microstructure-property relations by evaluating various length scales, from the atomic through component levels, for each step of the manufacturing process for vehicles; and to develop an integrated K-12 educational program to educate students on lightweighting designs and impact scenarios.

  3. Recent studies of float and stall curves in controlled-clearance deadweight testers with a simple piston.

    PubMed

    Newhall, D H; Ogawa, I; Zilberstein, V

    1979-08-01

    The effect of piston rotation speed and fluid viscosity on the performance of free-piston gauges with a controlled clearance was studied as part of an experimental program aimed at better evaluation of pressure by these primary pressure standards. The calculated effective area is shown to be greatly influenced by both the speed of rotation and the choice of fluid. An optimum rpm, resulting in the smallest possible uncertainty in effective area, should be determined experimentally for each fluid and pressure range involved. When all the pertinent parameters are properly selected, an appreciable improvement in accuracy can be achieved.

  4. DIETARY EXPOSURE METHODS AND MODELS

    EPA Science Inventory

    The research reported in this task description constitutes the MCEARD base dietary exposure research program and is conducted to complement the NERL aggregate and cumulative exposure program. Its purpose is to reduce the level of uncertainty in exposure assessment by improving N...

  5. Uncertainty As a Trigger for a Paradigm Change in Science Communication

    NASA Astrophysics Data System (ADS)

    Schneider, S.

    2014-12-01

    Over the last decade, the need to communicate uncertainty has increased. The climate and environmental sciences have faced massive propaganda campaigns by global industry and astroturf organizations. These organizations exploit the deep societal mistrust of uncertainty to allege unethical and intentional deception of decision makers and the public by scientists in their advisory role. Scientists who openly communicate the uncertainty of climate model calculations, earthquake occurrence frequencies, or possible side effects of genetically manipulated seeds have to face massive campaigns against their research, and sometimes against their persons and lives as well. Hence, new strategies to communicate uncertainty have to address the societal roots of the misunderstanding of the concept of uncertainty itself. Evolutionary biology has shown that the human mind, through its sensory structures, is well suited for practical decision making. Therefore, many of the irrational conceptions about uncertainty are mitigated if data are presented in formats the brain is adapted to understand. In the end, the impact of uncertainty on the decision-making process is dominated by preconceptions about terms such as uncertainty, vagueness, or probability. In parallel with the increasing role of scientific uncertainty in strategic communication, science communicators, for example at the Research and Development Program GEOTECHNOLOGIEN, have developed a number of techniques to master the challenge of putting uncertainty into focus. By raising awareness of scientific uncertainty as a driving force for scientific development and evolution, the public perspective on uncertainty is changing. While the first steps to implement this process are under way, the value of uncertainty is still underestimated by the public and in politics. Therefore, science communicators need new and innovative ways to talk about scientific uncertainty.

  6. Active adaptive management for reintroduction of an animal population

    USGS Publications Warehouse

    Runge, Michael C.

    2013-01-01

    Captive animals are frequently reintroduced to the wild in the face of uncertainty, but that uncertainty can often be reduced over the course of the reintroduction effort, providing the opportunity for adaptive management. One common uncertainty in reintroductions is the short-term survival rate of released adults (a release cost), an important factor because it can affect whether releasing adults or juveniles is better. Information about this rate can improve the success of the reintroduction program, but does the expected gain offset the costs of obtaining the information? I explored this question for reintroduction of the griffon vulture (Gyps fulvus) by framing the management question as a belief Markov decision process, characterizing uncertainty about release cost with 2 information state variables, and finding the solution using stochastic dynamic programming. For a reintroduction program of fixed length (e.g., 5 years of releases), the optimal policy in the final release year resembles the deterministic solution: release either all adults or all juveniles depending on whether the point estimate for the survival rate in question is above or below a specific threshold. But the optimal policy in the earlier release years 1) includes release of a mixture of juveniles and adults under some circumstances, and 2) recommends release of adults even when the point estimate of survival is much less than the deterministic threshold. These results show that in an iterated decision setting, the optimal decision in early years can be quite different from that in later years because of the value of learning. 
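
    The structure of such a solution can be illustrated with a stripped-down belief MDP: a single binary hypothesis about adult post-release survival (rather than the paper's two information state variables), value iteration over a discretized belief, and equal per-survivor rewards. All rates and the horizon below are invented; this is a sketch of the technique, not the paper's model.

        import numpy as np

        s_adult = np.array([0.35, 0.75])   # adult survival under low/high model
        s_juv = 0.55                       # assumed-known juvenile survival
        T = 5                              # release years
        grid = np.linspace(0.0, 1.0, 101)  # belief q = P(high-survival model)

        V = np.zeros(len(grid))            # terminal value
        policy = np.zeros((T, len(grid)), dtype=int)  # 1 = release an adult

        for t in reversed(range(T)):
            V_new = np.empty_like(V)
            for i, q in enumerate(grid):
                s_bar = q * s_adult[1] + (1 - q) * s_adult[0]
                # Juvenile: known payoff, belief unchanged (no learning).
                val_juv = s_juv + np.interp(q, grid, V)
                # Adult: payoff uncertain, but the observed outcome updates
                # the belief by Bayes' rule.
                q_live = q * s_adult[1] / s_bar
                q_die = q * (1 - s_adult[1]) / (1 - s_bar)
                val_adult = (s_bar * (1.0 + np.interp(q_live, grid, V))
                             + (1 - s_bar) * np.interp(q_die, grid, V))
                policy[t, i] = int(val_adult >= val_juv)
                V_new[i] = max(val_adult, val_juv)
            V = V_new

        # Adults are released at lower beliefs in early years, reflecting
        # the value of learning, as the abstract describes.
        print("belief threshold, first year:", grid[policy[0].argmax()])
        print("belief threshold, final year:", grid[policy[-1].argmax()])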

  7. Uncertainty in eddy covariance measurements and its application to physiological models

    Treesearch

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...

  8. Optimizing Irrigation Water Allocation under Multiple Sources of Uncertainty in an Arid River Basin

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Tang, D.; Gao, H.; Ding, Y.

    2015-12-01

    Population growth and climate change add pressures that affect water resources management strategies for meeting demands from different economic sectors. This is especially challenging in arid regions where fresh water is limited. For instance, in the Tailanhe River Basin (Xinjiang, China), a compromise must be made between water suppliers and users during drought years. This study presents a multi-objective irrigation water allocation model to cope with water scarcity in arid river basins. To deal with the uncertainties from multiple sources in the water allocation system (e.g., variations in available water amount, crop yield, crop prices, and water price), the model employs an interval linear programming approach. The multi-objective optimization model developed in this study is characterized by integrating ecosystem service theory into water-saving measures. For evaluation purposes, the model is used to construct an optimal allocation system for irrigation areas fed by the Tailan River (Xinjiang Province, China). The objective functions to be optimized are formulated based on these irrigation areas' economic, social, and ecological benefits. The optimal irrigation water allocation plans are made under different hydroclimatic conditions (wet year, normal year, and dry year), with multiple sources of uncertainty represented. The modeling tool and results are valuable for advising decision making by the local water authority and the agricultural community, especially on measures for coping with water scarcity by incorporating the uncertain factors associated with crop production planning.
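
    A minimal sketch of the interval linear programming idea is to solve the allocation at both ends of an interval-valued water supply, bracketing the best- and worst-case plans. The two crops, all coefficients, and the supply interval below are hypothetical, and scipy's linprog stands in for the study's full multi-objective formulation.

        import numpy as np
        from scipy.optimize import linprog

        benefit = np.array([300.0, 420.0])    # $ per ha, two hypothetical crops
        water_use = np.array([4.0, 7.0])      # 10^3 m^3 per ha
        land_total = 100.0                    # ha
        water_lo, water_hi = 350.0, 550.0     # interval water supply, 10^3 m^3

        def solve(water):
            res = linprog(
                c=-benefit,                   # linprog minimizes, so negate
                A_ub=[water_use, [1.0, 1.0]],
                b_ub=[water, land_total],
                bounds=[(0, None), (0, None)],
            )
            return res.x, -res.fun

        for label, w in [("worst case (dry)", water_lo),
                         ("best case (wet)", water_hi)]:
            x, z = solve(w)
            print(f"{label}: allocation = {np.round(x, 1)} ha, "
                  f"benefit = {z:.0f}")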

  9. Impact of Pilot Delay and Non-Responsiveness on the Safety Performance of Airborne Separation

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria; Hoadley, Sherwood; Wing, David; Baxley, Brian; Allen, Bonnie Danette

    2008-01-01

    Assessing the safety effects of prediction errors and uncertainty on automation-supported functions in the Next Generation Air Transportation System concept of operations is of foremost importance, particularly for safety-critical functions such as separation that involve human decision-making. Separation automation, whether ground-based or airborne, must be designed to account for, and mitigate the impact of, information uncertainty and varying human response. This paper describes an experiment that addresses the potential impact of operator delay when interacting with separation support systems. In this study, we evaluated an airborne separation capability operated by a simulated pilot. The experimental runs are part of the Safety Performance of Airborne Separation (SPAS) experiment suite, which examines the safety implications of prediction errors and system uncertainties on airborne separation assistance systems. Pilot actions required by the airborne separation automation to resolve traffic conflicts were delayed over a wide range, from five to 240 seconds, while a percentage of randomly selected pilots were programmed to completely miss the conflict alerts and therefore take no action. Results indicate that the strategic Airborne Separation Assistance System (ASAS) functions exercised in the experiment can sustain pilot response delays of up to 90 seconds and more, depending on the traffic density. However, when pilots or operators fail to respond to conflict alerts, the safety effects are substantial, particularly at higher traffic densities.

  10. TU-AB-BRB-03: Coverage-Based Treatment Planning to Accommodate Organ Deformable Motions and Contouring Uncertainties for Prostate Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H.

    The accepted clinical method to accommodate the targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g., >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation in the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead, robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques, including the use of probability-weighted dose distributions, probability-weighted objective functions, and coverage-optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined, as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting the advantages of robust plans in terms of target and normal tissue coverage. The limitations of robust planning as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust planning as a clinical alternative to margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.
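
    A toy Monte Carlo evaluation of coverage probability in one dimension gives the flavor of CP-based planning: simulate systematic and random setup errors, and score a patient as covered when the shifted target stays within the margin in at least 95% of fractions. All magnitudes below (error standard deviations, margins, the 95% criterion) are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        sigma_sys, sigma_rand = 0.3, 0.3      # cm, hypothetical setup errors
        n_patients, n_fractions = 10000, 30

        def coverage_probability(margin, frac_ok=0.95):
            sys = rng.normal(0.0, sigma_sys, n_patients)          # per patient
            rand = rng.normal(0.0, sigma_rand, (n_patients, n_fractions))
            shift = np.abs(sys[:, None] + rand)                   # per fraction
            ok = (shift <= margin).mean(axis=1)   # covered fraction of fractions
            return (ok >= frac_ok).mean()         # CP across the population

        for m in (0.4, 0.6, 0.8, 1.0):
            print(f"margin {m:.1f} cm -> CP = {coverage_probability(m):.3f}")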

  11. Economic and environmental costs of regulatory uncertainty for coal-fired power plants.

    PubMed

    Patiño-Echeverri, Dalia; Fischbeck, Paul; Kriegler, Elmar

    2009-02-01

    Uncertainty about the extent and timing of CO2 emissions regulations for the electricity-generating sector exacerbates the difficulty of selecting investment strategies for retrofitting or replacing existing coal-fired power plants. This may result in inefficient investments that impose economic and environmental costs on society. In this paper, we construct a multiperiod decision model with an embedded multistage stochastic dynamic program minimizing the expected total costs of plant operation, installations, and pollution allowances. We use the model to forecast optimal sequential investment decisions of a power plant operator with and without uncertainty about future CO2 allowance prices. The comparison of the two cases demonstrates that uncertainty about future CO2 emissions regulations might cause significant economic costs and higher air emissions.
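
    The core trade-off can be shown with a two-branch expected-cost comparison, far simpler than the paper's multistage stochastic dynamic program. Every number below (retrofit cost, emissions, probability, allowance price) is hypothetical.

        # Retrofit a plant with CO2 capture now, or wait one period for the
        # regulation to resolve? All figures are invented for illustration.
        retrofit_cost = 400.0   # M$
        emissions = 8.0         # Mt CO2 per period if not retrofitted
        p_reg = 0.5             # probability a CO2 price is enacted
        co2_price = 40.0        # $/t in the regulated branch

        cost_now = retrofit_cost   # no allowance purchases in either branch

        # Wait: buy allowances in the regulated branch, then retrofit there;
        # pay nothing in the unregulated branch.
        cost_wait = p_reg * (emissions * co2_price + retrofit_cost)
        print(f"retrofit now: {cost_now:.0f} M$  |  wait: {cost_wait:.0f} M$")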

  12. Uncertainty and control in the context of a category-five tornado.

    PubMed

    Afifi, Walid A; Afifi, Tamara D; Merrill, Annie

    2014-10-01

    The purpose of this qualitative descriptive study was to illuminate the experience and management of uncertainty during a natural disaster. Interviews were conducted with 26 survivors of a category-five tornado that entirely demolished the small, rural town of Greensburg, Kansas. Three primary themes were found in the survivors' accounts. First, the survivors experienced rapidly shifting levels and kinds of uncertainty as they proceeded through the stages of the disaster. Second, the fluidity of much-needed information added to uncertainty. Third, the feeling of lack of control over outcomes of the disaster and its aftermath was pervasive and was often managed through reliance on communal coping. Recommendations for disaster-related intervention programs are suggested. © 2014 Wiley Periodicals, Inc.

  13. A generalized Kruskal-Wallis test incorporating group uncertainty with application to genetic association studies.

    PubMed

    Acar, Elif F; Sun, Lei

    2013-06-01

    Motivated by genetic association studies of SNPs with genotype uncertainty, we propose a generalization of the Kruskal-Wallis test that incorporates group uncertainty when comparing k samples. The extended test statistic is based on probability-weighted rank sums and follows an asymptotic chi-square distribution with k - 1 degrees of freedom under the null hypothesis. Simulation studies confirm the validity and robustness of the proposed test in finite samples. Application to a genome-wide association study of type 1 diabetic complications further demonstrates the utility of this generalized Kruskal-Wallis test for studies with group uncertainty. The method has been implemented as an open-source R program, GKW. © 2013, The International Biometric Society.
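
    The probability-weighted rank-sum idea can be sketched in a few lines. The Python function below is a simplified plug-in version: it weights each observation's rank by its group-membership probabilities and plugs the weighted sums into the classical Kruskal-Wallis form. It is not the exact variance-corrected statistic of the paper or its GKW program, and the soft genotype calls are synthetic.

        import numpy as np
        from scipy.stats import rankdata, chi2

        def weighted_kruskal_wallis(y, probs):
            """probs[i, g] = membership probability of observation i in group g.
            With 0/1 probabilities this reduces to the classical KW statistic
            (ignoring the tie correction)."""
            n = len(y)
            ranks = rankdata(y)
            w = probs.T @ ranks            # probability-weighted rank sum per group
            m = probs.sum(axis=0)          # expected group sizes
            h = 12.0 / (n * (n + 1)) * np.sum((w - m * (n + 1) / 2.0) ** 2 / m)
            return h, chi2.sf(h, df=probs.shape[1] - 1)

        rng = np.random.default_rng(3)
        y = rng.normal(size=90)
        g = rng.integers(0, 3, size=90)
        probs = np.full((90, 3), 0.05)
        probs[np.arange(90), g] = 0.90     # soft (uncertain) genotype calls
        print(weighted_kruskal_wallis(y, probs))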

  14. Using Creativity and Collaboration to Develop Innovative Programs That Embrace Diversity in Higher Education

    ERIC Educational Resources Information Center

    Robinson, A. Helene

    2012-01-01

    This paper provides an example of an innovative solution to program development that addresses the diverse needs of teacher educators throughout various geographical locations in Florida, through a collaborative multi-university, multi-agency teacher training program funded by one collaborative grant. In this time of economic uncertainties,…

  15. Communicating diagnostic uncertainty in surgical pathology reports: disparities between sender and receiver.

    PubMed

    Lindley, Sarah W; Gillies, Elizabeth M; Hassell, Lewis A

    2014-10-01

    Surgical pathologists use a variety of phrases to communicate varying degrees of diagnostic certainty, and these have the potential to be interpreted differently than intended. This study sought to: (1) assess the setting, varieties, and frequency of use of phrases of diagnostic uncertainty in the diagnostic line of surgical pathology reports; (2) evaluate the use of uncertainty expressions by experience and gender; (3) determine how these phrases are interpreted by clinicians and pathologists; and (4) assess solutions to this communication problem. We evaluated 1500 surgical pathology reports to determine the frequency of use of uncertainty terms, identified those most commonly used, and looked for variations in usage rates on the basis of case type, experience, and gender. We surveyed 76 physicians at tumor boards, who were asked to assign a percentage of certainty to diagnoses containing expressions of uncertainty. We found expressions of uncertainty in 35% of diagnostic reports, with no statistically significant difference in usage based on age or gender. We found wide variation in the percentage of certainty clinicians assigned to the phrases studied. We conclude that the non-standardized language used in the communication of diagnostic uncertainty is a significant source of miscommunication, both amongst pathologists and between pathologists and clinicians. Copyright © 2014 The Authors. Published by Elsevier GmbH. All rights reserved.

  16. [Evaluation of uncertainty for determination of tin and its compounds in air of workplace by flame atomic absorption spectrometry].

    PubMed

    Wei, Qiuning; Wei, Yuan; Liu, Fangfang; Ding, Yalei

    2015-10-01

    To investigate a method for evaluating the uncertainty of the determination of tin and its compounds in the air of the workplace by flame atomic absorption spectrometry. The national occupational health standards GBZ/T160.28-2004 and JJF1059-1999 were used to build a mathematical model of the determination of tin and its compounds in workplace air and to calculate the components of uncertainty. In the determination of tin and its compounds in workplace air using flame atomic absorption spectrometry, the uncertainty for the concentration of the standard solution, the atomic absorption spectrophotometer, sample digestion, parallel determination, least-squares fitting of the calibration curve, and sample collection was 0.436%, 0.13%, 1.07%, 1.65%, 3.05%, and 2.89%, respectively. The combined uncertainty was 9.3%. The concentration of tin in the test sample was 0.132 mg/m³, and the expanded uncertainty for the measurement was 0.012 mg/m³ (K=2). The dominant uncertainty in the determination of tin and its compounds in workplace air comes from least-squares fitting of the calibration curve and from sample collection. Quality control should be improved in the process of calibration-curve fitting and sample collection.
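
    The quoted components combine in the usual GUM root-sum-of-squares fashion. The short sketch below reproduces the arithmetic and suggests that the 9.3% figure is the K=2 expanded relative uncertainty, since the quadrature sum alone is about 4.7%; this reading also matches 0.132 mg/m³ × 9.3% ≈ 0.012 mg/m³.

        import numpy as np

        # Relative standard uncertainty components quoted in the abstract (%).
        components = {
            "standard solution": 0.436,
            "spectrophotometer": 0.13,
            "sample digestion": 1.07,
            "parallel determination": 1.65,
            "calibration-curve fit": 3.05,
            "sample collection": 2.89,
        }

        u_c = np.sqrt(sum(u ** 2 for u in components.values()))  # combined, %
        U = 2 * u_c                                              # expanded, k = 2
        print(f"combined u_c = {u_c:.2f} %, expanded U (k=2) = {U:.1f} %")
        # -> u_c = 4.66 %, U = 9.3 %, consistent with the figures quoted above.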

  17. Multivariate Copula Analysis Toolbox (MvCAT): Describing dependence and underlying uncertainty using a Bayesian framework

    NASA Astrophysics Data System (ADS)

    Sadegh, Mojtaba; Ragno, Elisa; AghaKouchak, Amir

    2017-06-01

    We present a newly developed Multivariate Copula Analysis Toolbox (MvCAT) which includes a wide range of copula families with different levels of complexity. MvCAT employs a Bayesian framework with a residual-based Gaussian likelihood function for inferring copula parameters and estimating the underlying uncertainties. The contribution of this paper is threefold: (a) providing a Bayesian framework to approximate the predictive uncertainties of fitted copulas, (b) introducing a hybrid-evolution Markov Chain Monte Carlo (MCMC) approach designed for numerical estimation of the posterior distribution of copula parameters, and (c) enabling the community to explore a wide range of copulas and evaluate them relative to the fitting uncertainties. We show that the commonly used local optimization methods for copula parameter estimation often get trapped in local minima. The proposed method, however, addresses this limitation and improves describing the dependence structure. MvCAT also enables evaluation of uncertainties relative to the length of record, which is fundamental to a wide range of applications such as multivariate frequency analysis.
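
    MvCAT itself is a MATLAB toolbox; the Python sketch below only mirrors the idea of its residual-based Gaussian likelihood, fitting a single Clayton copula parameter with a plain random-walk Metropolis sampler rather than the toolbox's hybrid-evolution MCMC. The data are synthetic and the proposal scale, sample sizes, and prior (flat on theta > 0) are assumptions.

        import numpy as np

        rng = np.random.default_rng(4)

        def clayton_cdf(u, v, theta):
            return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

        def empirical_copula(u, v):
            return np.array([np.mean((u <= ui) & (v <= vi))
                             for ui, vi in zip(u, v)])

        # Synthetic pseudo-observations with positive dependence.
        z = rng.normal(size=500)
        x = z + 0.5 * rng.normal(size=500)
        y = z + 0.5 * rng.normal(size=500)
        u = (np.argsort(np.argsort(x)) + 1) / (len(x) + 1)   # ranks -> (0,1)
        v = (np.argsort(np.argsort(y)) + 1) / (len(y) + 1)
        c_emp = empirical_copula(u, v)

        def log_like(theta):
            # Gaussian likelihood on empirical-vs-parametric copula residuals,
            # with the residual variance profiled out.
            if theta <= 0:
                return -np.inf
            resid = c_emp - clayton_cdf(u, v, theta)
            return -0.5 * len(resid) * np.log(np.mean(resid ** 2))

        # Random-walk Metropolis over the Clayton parameter theta.
        theta, ll = 1.0, log_like(1.0)
        samples = []
        for _ in range(5000):
            prop = theta + 0.2 * rng.normal()
            ll_prop = log_like(prop)
            if np.log(rng.uniform()) < ll_prop - ll:
                theta, ll = prop, ll_prop
            samples.append(theta)
        post = np.array(samples[1000:])
        print(f"posterior mean theta = {post.mean():.2f} ± {post.std():.2f}")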

  18. Study of synthesis techniques for insensitive aircraft control systems

    NASA Technical Reports Server (NTRS)

    Harvey, C. A.; Pope, R. E.

    1977-01-01

    Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as the design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) plus the new mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that minimax and uncertainty weighting were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.

  19. SU-F-BRA-11: An Experimental Commissioning Test of Brachytherapy MBDCA Dosimetry, Based On a Commercial Radiochromic Gel/optical CT System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pappas, E; Karaiskos, P; Zourari, K

    2015-06-15

    Purpose: To implement a 3D dose verification procedure for Model-Based Dose Calculation Algorithms (MBDCAs) for 192Ir HDR brachytherapy, based on a novel Ferrous Xylenol-orange gel (FXG) and optical CT read-out. Methods: The TruView gel was employed for absolute dosimetry in conjunction with cone-beam optical CT read-out with the VISTA scanner (both from Modus Medical Inc, London, ON, Canada). A multi-catheter skin flap was attached to a cylindrical PETE jar (d=9.6cm, h=16cm) filled with FXG, which served as both the dosimeter and a water-equivalent phantom of bounded dimensions. An X-ray CT image series of the jar with the flap attached was imported to Oncentra Brachy v.4.5. A treatment plan consisting of 8 catheters and 56 dwell positions was generated, and Oncentra-ACE MBDCA as well as TG43 dose results were exported for further evaluation. The irradiation was carried out with a microSelectron v2 source. The FXG dose response, measured via an electron irradiation of a second dosimeter from the same batch, was linear (R2>0.999) at least up to 12Gy. A MCNP6 input file was prepared from the DICOM-RT plan data using BrachyGuide to facilitate Monte Carlo (MC) simulation dosimetry in the actual experimental geometry. Agreement between experimental (reference) and calculated dose distributions was evaluated using the 3D gamma index (GI) method with criteria (5%-2mm, applied locally) determined from uncertainty analysis. Results: The TG-43 GI failed, as expected, in the majority of voxels away from the flap (pass rate 59% for D>0.8Gy, corresponding to 10% of the prescribed dose). ACE performed significantly better (corresponding pass rate 92%). The GI evaluation for the MC data (corresponding pass rate 97%) failed mainly at low-dose points of increased uncertainty. Conclusion: FXG gel/optical CT is an efficient method for level-2 commissioning of brachytherapy MBDCAs. Target dosimetry is not affected by the uncertainty introduced by TG43 assumptions in 192Ir skin brachytherapy. Research co-financed by the ESF and Greek funds through the Operational Program Education and Lifelong Learning Investing in Knowledge Society of the NSRF. Research Funding Program: Aristeia. Modus Medical Devices Inc. provided a TruView dosimeter batch, and Nucletron, an Elekta company, provided access to Oncentra Brachy v4.5 for research purposes.

  20. How much is new information worth? Evaluating the financial benefit of resolving management uncertainty

    USGS Publications Warehouse

    Maxwell, Sean L.; Rhodes, Jonathan R.; Runge, Michael C.; Possingham, Hugh P.; Ng, Chooi Fei; McDonald Madden, Eve

    2015-01-01

    Conservation decision-makers face a trade-off between spending limited funds on direct management action or gaining new information in an attempt to improve management performance in the future. Value-of-information analysis can help to resolve this trade-off by evaluating how much management performance could improve if new information were gained. Value-of-information analysis has been used extensively in other disciplines, but there are only a few examples where it has informed conservation planning, none of which have used it to evaluate the financial value of gaining new information. We address this gap by applying value-of-information analysis to the management of a declining koala Phascolarctos cinereus population. Decision-makers responsible for managing this population face uncertainty about survival and fecundity rates, and about how habitat cover affects mortality threats. The value of gaining new information about these uncertainties was calculated using a deterministic matrix model of the koala population to find the expected population growth rate if koala mortality threats were optimally managed under alternative model hypotheses, which represented the uncertainties faced by koala managers. Gaining new information about survival and fecundity rates and the effect of habitat cover on mortality threats will do little to improve koala management. Across a range of management budgets, no more than 1.7% of the budget should be spent on resolving these uncertainties. The value of information was low because optimal management decisions were not sensitive to the uncertainties we considered. Decisions were instead driven by a substantial difference in the cost efficiency of management actions. The value of information was up to forty times higher when the cost efficiencies of different koala management actions were similar. Synthesis and applications. This study evaluates the ecological and financial benefits of gaining new information to inform a conservation problem. We also theoretically demonstrate that the value of reducing uncertainty is highest when it is not clear which management action is the most cost efficient. This study will help expand the use of value-of-information analyses in conservation by providing a cost-efficiency metric by which to evaluate research or monitoring.
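
    The calculation at the heart of such an analysis is short: the expected value of perfect information (EVPI) is the gap between choosing the best action per hypothesis and committing to the single best action under current beliefs. The Python sketch below uses an invented outcome table and belief weights; the study above goes further by expressing this gap financially.

        import numpy as np

        # Hypothetical expected population growth rate under 3 model
        # hypotheses (rows) for 3 candidate management actions (columns).
        outcomes = np.array([
            [0.98, 1.02, 1.00],
            [0.97, 1.01, 1.04],
            [0.99, 1.00, 1.03],
        ])
        weights = np.array([0.4, 0.35, 0.25])  # current belief in each hypothesis

        # Under uncertainty: commit to the one action with best expected value.
        ev_uncertainty = (weights @ outcomes).max()
        # With perfect information: pick the best action for each hypothesis.
        ev_perfect = weights @ outcomes.max(axis=1)

        evpi = ev_perfect - ev_uncertainty
        print(f"EV|uncertainty = {ev_uncertainty:.4f}, "
              f"EV|PI = {ev_perfect:.4f}, EVPI = {evpi:.4f}")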
